A Systematic Method to Understand Requirements for Explainable AI (XAI) Systems

Abstract This paper presents a systematic five-step method for use in the development of an explainable AI (XAI) system, to (i) understand specific explanation requirements, (ii) assess existing explanation capabilities, and (iii) steer future research and development in this area. A case study is discussed in which the method was developed and applied within an industrial context. This paper is a summary of research originally published at the XAI workshop at IJCAI 2019.
Authors
  • Mark Hall (Airbus)
  • Dan Harborne (Cardiff)
  • Richard Tomsett (IBM UK)
  • Vedran Galetic (Airbus)
  • Santiago Quintana (Airbus)
  • Alistair Nottle (Airbus)
Date Sep-2019
Venue Annual Fall Meeting of the DAIS ITA, 2019