This year’s Explanation-aware Computing Workshop ExaCt 2010 takes place in Lisbon, Portugal, as part of the 19th European Conference on Artificial Intelligence ECAI 2010. The event is once again organised by me together with my colleagues and friends David B. Leake, Nava Tintarev, and Daniel Bahls.
Here is the main part of the call for papers:
Both within AI systems and in interactive systems, the ability to explain reasoning processes and results can substantially affect system usability. For example, in recommender systems good explanations may help to inspire user trust and loyalty, increase satisfaction, make it quicker and easier for users to find what they want, and persuade them to try or buy a recommended item.
Current interest in mixed-initiative systems provides a new context in which explanation issues may play a crucial role. When knowledge-based systems are partners in an interactive socio-technical process, with incomplete and changing problem descriptions, communication between humans and software systems becomes central. Explanations exchanged between human agents and software agents may play an important role in mixed-initiative problem solving.
Other disciplines, such as cognitive science, linguistics, philosophy of science, psychology, and education, have investigated explanation as well. Each considers different aspects, making it clear that there are many views of the nature of explanation and many facets of explanation still to explore. Within the field of knowledge-based systems, explanations have long been considered an important link between humans and machines: their main purpose has been to increase users’ confidence in a system’s results by providing evidence of how those results were derived. Additional AI research has focused on how computer systems can themselves use explanations, for example to guide learning.
This workshop series aims to draw on these multiple perspectives on explanation, examining how explanation can further the development of robust and dependable systems and illuminate system processes, thereby increasing user acceptance and the feeling of control.
Suggested topics for contributions (not restricted to IT views):
- Models for explanations / explanation representation languages
- Integrating application and explanation knowledge
- Explanation-awareness in applications and software design
- Methodologies for developing explanation-aware systems
- Learning to explain
- Context-aware explanation vs. explanation-aware context
- Confidence and explanations
- Security, trust, and explanation
- Requirements and needs for explanations to support human understanding
- Explanation of complex, autonomous systems
- Co-operative explanation
- Explanation exchange among intelligent agents
- Explanation visualisation
Important dates:
- Workshop paper submission deadline: May 7, 2010
- Notification of workshop paper acceptance: June 7, 2010
- Camera-ready copy submission: June 20, 2010
- Workshop (two days): August 16-17, 2010
Read the complete call for papers on the workshop website.