Abstract
Robo-advisors are democratizing access to life insurance by enabling fully online underwriting. In Europe, financial legislation requires that the reasons for recommending a life insurance plan be explained according to the characteristics of the client, in order to empower the client to make a "fully informed decision". In this study conducted in France, we seek to understand whether legal requirements for feature-based explanations actually help users in their decision-making. We conduct a qualitative study to characterize the explainability needs formulated by non-expert users and by regulators with expertise in customer protection. We then run a large-scale quantitative study using Robex, a simplified robo-advisor built using ecological interface design that delivers recommendations with explanations in different hybrid textual and visual formats: either "dialogic" (more textual) or "graphical" (more visual). We find that providing feature-based explanations does not improve appropriate reliance or understanding compared to not providing any explanation. In addition, dialogic explanations increase users' trust in the recommendations of the robo-advisor, sometimes to the users' detriment. This real-world scenario illustrates how XAI can address information asymmetry in complex areas such as finance. This work has implications for other critical, AI-based recommender systems, where the General Data Protection Regulation (GDPR) may require similar provisions for feature-based explanations.
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 6th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2023 |
| Publisher | Association for Computing Machinery |
| Pages | 943-958 |
| Number of pages | 16 |
| ISBN (Electronic) | 9781450372527 |
| Publication status | Published - 12 Jun 2023 |
| Event | 6th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2023 - Chicago, United States, 12 Jun 2023 → 15 Jun 2023 |
Publication series
| Name | ACM International Conference Proceeding Series |
|---|---|
Conference
| Conference | 6th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2023 |
|---|---|
| Country/Territory | United States |
| City | Chicago |
| Period | 12/06/23 → 15/06/23 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 8: Decent Work and Economic Growth
Keywords
- AI regulation
- explainability
- financial inclusion
- intelligibility