How can digital systems explain their behavior both precisely and comprehensibly? This question lies at the heart of CAUSE – Concepts and Algorithms for, and Usage of Self-Explaining Digitally Controlled Systems – a graduate school funded by the German Research Foundation (DFG) since November 2024 and involving three institutions: Hamburg University of Technology (TUHH), the University of Bremen, and the University of Oldenburg.
Virtually every technical system today is digitally controlled, making it increasingly intelligent. Yet how these systems function often remains opaque to developers, users, and cooperating systems, which can lead to misunderstandings, inefficiencies, or even malfunctions. CAUSE addresses this by working to make digitally controlled systems self-explaining to all three audiences.
Within the graduate school, researchers from the three institutions jointly explore self-explanation, using energy networks and wind farms as exemplary applications.
CAUSE was publicly launched with a symposium at Hamburg University of Technology. In his welcome speech, TUHH President Andreas Timm-Giel emphasized the strategic importance of CAUSE for strengthening regional collaboration among partner institutions and for advancing technically grounded approaches to system transparency. Prof. Görschwin Fey, CAUSE’s spokesperson, and Prof. Martin Fränzle, co-spokesperson from Oldenburg, outlined the graduate school’s vision: developing algorithmic principles, formal methods, and engineering techniques for systems capable of providing meaningful, context-aware explanations of their own behavior.
The morning keynotes set the tone for the day’s technical depth. Prof. Erika Ábrahám (RWTH Aachen) opened with a talk on formal verification methods as a foundation for self-explaining control systems, highlighting the role of logical structures for traceability and trust. She was followed by Andreas Sommer (WEINMANN Emergency Medical Technology), who presented an industry perspective on how explainability in system design can support the development of medical devices.
Subsequent presentations showcased current research directions within CAUSE, from foundational challenges in cyber-physical systems (Prof. Heiko Falk) and receiver-specific explanation models (Prof. Verena Klös) to algorithmic decision procedures (Moritz Buhr) and executable explanations (Ulrike Engeln). Contributions by Ivo David Oliveira and Caroline Dominik illustrated how context-sensitive self-explanation can be integrated into adaptive software and virtual prototyping.
In the concluding poster session, all CAUSE doctoral researchers presented their projects, highlighting the school’s thematic breadth and its ongoing interdisciplinary collaborations. The session provided space for in-depth technical discussion and sparked many conversations across the three sites that will shape the program’s future development.