University of Twente Student Theses

Measuring the effect of non-expert language on explanation satisfaction and user trust in Conversational XAI systems

Overeem, J. (2024) Measuring the effect of non-expert language on explanation satisfaction and user trust in Conversational XAI systems.

PDF (4MB)
Abstract: The rise of complex machine learning algorithms increases the need for Explainable AI (XAI) systems that improve the interpretability of their behavior and decision-making, with the aim of improving users' trust in and reliance on the system. Traditionally, XAI systems are made by experts for experts, even though the people who would benefit most from them (non-experts) are often not proficient in data science and therefore have difficulty understanding existing XAI explanations. Combining the interpretability of the natural language used by conversational agents with the insights of XAI systems leads to conversational XAI systems whose explanations are more accessible to non-expert users. This study aimed to make conversational XAI systems more accessible to non-expert users, measured through explanation satisfaction and user trust. The results show no significant difference between the two versions on these metrics. Further research should include more usability studies with diverse datasets and scenarios to improve generalizability, and should explore different theoretically grounded methods of adjusting language and explanation complexity for more definitive results.
Item Type: Essay (Master)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 54 computer science
Programme: Interaction Technology MSc (60030)
Link to this item: https://purl.utwente.nl/essays/104427
