Do We Need Explainable AI in Companies? Investigation of Challenges, Expectations, and Chances from Employees' Perspective

arXiv (2022)

Abstract
By adopting AI, companies aim to improve their business success and innovation prospects. In doing so, however, companies and their employees face new requirements: legal regulations in particular call for transparency and comprehensibility of AI systems. The field of explainable AI (XAI) addresses these issues, but its results are mostly obtained in lab studies, and the transfer to real-world applications is still lacking. This transfer includes considering employees' needs and attributes, which may differ from those of end-users in the lab. This project report therefore provides initial insights into employees' specific needs and attitudes towards (X)AI. To this end, we report the results of a project's online survey that investigates two employee perspectives (the company level and the employee level) on (X)AI to create a holistic view of employees' challenges, risks, and needs. Our findings suggest that AI and XAI are well-known terms that employees perceive as important. This is a first step for XAI to become a driver of successful AI usage by providing transparent and comprehensible insights into AI technologies. To benefit from (X)AI technologies, supportive employees at the management level are valuable catalysts. This work contributes to the ongoing demand for XAI research that develops human-centered and domain-specific XAI designs.
Keywords
explainable, employees, companies, expectations