Article

Transparency and trust in artificial intelligence systems

Pages 260-278
Received 09 Oct 2019
Accepted 08 Jul 2020
Published online: 10 Sep 2020
 

ABSTRACT

Assistive technology featuring artificial intelligence (AI) to support human decision-making has become ubiquitous. Assistive AI achieves accuracy comparable to, or even surpassing, that of human experts. However, the adoption of assistive AI systems is often limited by a lack of human trust in the AI's predictions. The AI research community has therefore focused on rendering AI decisions more transparent by providing explanations of them. To what extent such explanations actually help to foster trust in an AI system remains an open question. In this paper, we report the results of a behavioural experiment in which subjects could draw on the support of an ML-based decision support tool for text classification. We experimentally varied the information subjects received and show that transparency can in fact have a negative impact on trust. We discuss implications for decision makers employing assistive AI technology.
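The paper does not specify how the decision support tool was implemented. As a purely illustrative sketch, the Python snippet below shows one common way such a tool could be built: a text classifier that returns a suggested label together with a confidence score, i.e., the kind of prediction information whose presentation might be varied across experimental conditions. The training examples, labels, and the assist helper are hypothetical and not taken from the paper.

# Illustrative sketch only, not the authors' system: an assistive text
# classifier that returns a suggested label plus a confidence score, the
# kind of output a decision support tool might show to a human subject.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training corpus (hypothetical); a real study would use far more data.
texts = [
    "refund my order immediately",
    "the package arrived damaged",
    "great service, very satisfied",
    "thank you for the quick delivery",
]
labels = ["complaint", "complaint", "praise", "praise"]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def assist(text):
    """Return the model's suggested label and its confidence in it."""
    proba = model.predict_proba([text])[0]
    best = proba.argmax()
    return model.classes_[best], float(proba[best])

label, confidence = assist("my order was broken on arrival")
print("suggested label: %s (confidence: %.2f)" % (label, confidence))

In the experiment itself, what varied was the information subjects received alongside such predictions; the sketch merely makes the prediction-and-confidence pair concrete.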

Disclosure statement

No potential conflict of interest was reported by the authors.
