TY - JOUR
T1 - Identifying the "right" level of explanation in a given situation
AU - Beaudouin, Valérie
AU - Bloch, Isabelle
AU - Bounie, David
AU - Clémençon, Stéphan
AU - d'Alché-Buc, Florence
AU - Eagan, James
AU - Maxwell, Winston
AU - Mozharovskyi, Pavlo
AU - Parekh, Jayneel
N1 - Publisher Copyright:
© 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
PY - 2020/1/1
Y1 - 2020/1/1
N2 - We present a framework for defining the “right” level of explainability based on technical, legal and economic considerations. Our approach involves three logical steps: First, define the main contextual factors, such as who is the audience of the explanation, the operational context, the level of harm that the system could cause, and the legal/regulatory framework. This step will help characterize the operational and legal needs for explanation, and the corresponding social benefits. Second, examine the technical tools available, including post-hoc approaches (input perturbation, saliency maps...) and hybrid AI approaches. Third, as a function of the first two steps, choose the right levels of global and local explanation outputs, taking into account the costs involved. We identify seven kinds of costs and emphasize that explanations are socially useful only when total social benefits exceed costs.
AB - We present a framework for defining the “right” level of explainability based on technical, legal and economic considerations. Our approach involves three logical steps: First, define the main contextual factors, such as who is the audience of the explanation, the operational context, the level of harm that the system could cause, and the legal/regulatory framework. This step will help characterize the operational and legal needs for explanation, and the corresponding social benefits. Second, examine the technical tools available, including post-hoc approaches (input perturbation, saliency maps...) and hybrid AI approaches. Third, as a function of the first two steps, choose the right levels of global and local explanation outputs, taking into account the costs involved. We identify seven kinds of costs and emphasize that explanations are socially useful only when total social benefits exceed costs.
M3 - Conference article
AN - SCOPUS:85090895331
SN - 1613-0073
VL - 2659
SP - 63
EP - 66
JO - CEUR Workshop Proceedings
JF - CEUR Workshop Proceedings
T2 - 1st International Workshop on New Foundations for Human-Centered AI, NeHuAI 2020
Y2 - 4 September 2020
ER -