Self-Explaining Adaptive Software

Lead-Supervisor:

How can we explain context-dependent adaptation decisions? How can we generate explanations that genuinely help operators in their current situation?

Adaptive software adjusts its behaviour to dynamic changes in the execution context in order to satisfy its (possibly also context-dependent) control objectives as well as possible. In such a complex setting, the resulting software behaviour and the effects of individual design decisions are hard to predict. Furthermore, if the behaviour of the software changes at run-time, operators need to understand the reasons for and effects of the change in order to predict future performance and to intervene where necessary. In this project, we aim to support designers of adaptive systems by explaining the effects of design decisions in an interactive what-if analysis, and to support operators by highlighting the reasons for and expected consequences of adaptation decisions. To reduce the mental load of operators, we want to tailor explanations to the characteristics of the situation and the knowledge of the addressees, so that explanations focus on the information that is actually missing.
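As a minimal illustrative sketch (all names, the adaptation rule, and the load threshold are hypothetical, not part of the project), an adaptation decision can carry its own explanation, which is then filtered against what the operator already knows so that only the missing information is presented:

```python
from dataclasses import dataclass


@dataclass
class Explanation:
    """An adaptation decision together with its reasons and expected effects."""
    decision: str
    reasons: list
    expected_effects: list

    def tailored(self, known_facts):
        """Drop reasons the operator already knows, keeping the focus
        on the missing information."""
        return Explanation(
            self.decision,
            [r for r in self.reasons if r not in known_facts],
            self.expected_effects,
        )


def adapt(context):
    """Choose a behaviour from the current context and record why
    (hypothetical rule: shed fidelity under high load)."""
    if context["load"] > 0.8:
        return Explanation(
            decision="switch to low-fidelity mode",
            reasons=["load exceeds 0.8"],
            expected_effects=["response time stays within bounds",
                              "reduced output quality"],
        )
    return Explanation(
        decision="keep high-fidelity mode",
        reasons=["load at or below 0.8"],
        expected_effects=["full output quality"],
    )


# An operator monitoring the load already knows why the switch happened,
# so the tailored explanation only surfaces the expected consequences.
exp = adapt({"load": 0.9})
print(exp.decision)
print(exp.tailored({"load exceeds 0.8"}).reasons)
```

The same structure would let a what-if analysis replay `adapt` under hypothetical contexts and compare the resulting explanations.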

Desirable background and expertise: knowledge in one or more of the following areas: probabilistic modelling, modelling with transition systems (e.g. finite automata, timed automata, statecharts, ...), BDI agents