A Systematic Method to Understand Requirements for Explainable AI (XAI) Systems

Abstract This paper presents a five-step systematic method, applied during the development of an explainable AI (XAI) system, to (i) understand specific explanation requirements, (ii) assess existing explanation capabilities, and (iii) steer future research and development in this area. A case study is discussed in which the method was developed and applied in an industrial context.
Authors
  • Mark Hall (Airbus)
  • Dan Harborne (Cardiff)
  • Richard Tomsett (IBM UK)
  • Vedran Galetic (Airbus)
  • Santiago Quintana (Airbus)
  • Alistair Nottle (Airbus)
  • Alun Preece (Cardiff)
Date Aug-2019
Venue IJCAI 2019 Workshop on Explainable Artificial Intelligence (XAI) [link]