Trust, Trustworthiness and the Moral Dimension in human-AI Interactions
DOI: https://doi.org/10.21814/eps.7.2.6181
Keywords: Trust, Trustworthy AI, Artificial Agents, Reliance, Moral agency
Abstract
The growing use of Autonomous Agents (AAs) in both the private and public sectors raises crucial questions about trust. As AI systems take on increasingly complex tasks and decisions, their interactions with human agents (HAs) call into question the relevance and applicability of traditional philosophical concepts of trust and trustworthiness (sections 1 and 2). In this paper, I explore the nuances of trust in AAs, arguing against both the complete dismissal of trust as misplaced (section 4) and the application of “genuine” trust frameworks (section 5). My aim is to lay the groundwork for the understanding that the moral complexity of interactions with AAs goes beyond the mere reliance we place on inanimate objects (section 6).
References
Artificial Intelligence Act, European Parliament legislative resolution of 13 March 2024
Awad, E., Dsouza, S., Kim, R., Schulz, J., Henrich, J., Shariff, A., Bonnefon, J.-F., Rahwan, I. (2018). “The Moral Machine experiment.”, Nature, 563(7729), 59–64.
Bryson, J.J., Kime, P.P. (2011), “Just an Artifact: Why Machines are Perceived as Moral Agents”, https://www.cs.bath.ac.uk/~jjb/ftp/BrysonKime-IJCAI11.pdf
Bryson, J. (2018) “AI & Global Governance: No one should trust AI.”, United Nations.
Faulkner, P. (2015) “The attitude of trust is basic.”, Analysis, 75: 424–429.
Formosa, P. (2021) “Robot Autonomy vs. Human Autonomy: Social Robots, Artificial Intelligence (AI), and the Nature of Autonomy.”, Minds & Machines, 31, 595–616. https://doi.org/10.1007/s11023-021-09579-2
Fossa, F. (2019) “«I don’t trust you, you faker!» On trust, reliance, and artificial agency.”, Teoria, 39(1): 63–80. https://doi.org/10.4454/teoria.v39i1.57
Freiman, O. (2023). “Making sense of the conceptual nonsense ‘trustworthy AI’.”, AI and Ethics, 3, 1351–1360.
Google Cloud’s Approach to Trust in Artificial Intelligence (2023) (Kaganovich, M., Kanungo, R., Hanssen, H.). https://services.google.com/fh/files/misc/ociso_securing_ai_governance.pdf
Hawley, K. (2014a) “Trust, Distrust and Commitment”, Noûs, 48, 1-20.
Hawley, K. (2014b) “Partiality and Prejudice in Trusting”, Synthese, 191, pp. 2029-2045.
Jones, K. (2012) “Trustworthiness.”, Ethics, 123(1): 61–85.
Jones, K. (1996) “Trust as an affective attitude.”, Ethics, 107(1), 4–25.
Metzinger, T. (2019) “EU guidelines: Ethics washing made in Europe.”, Der Tagesspiegel Online. https://www.tagesspiegel.de/politik/ethics-washing-made-in-europe-5937028.html
Ryan, M. (2020) “In AI we trust: ethics, artificial intelligence, and reliability.”, Sci Eng Ethics, 26: 2749–2767. https://doi.org/10.1007/s11948-020-00228-y
Sharkey, A. (2019) “Autonomous weapons systems, killer robots and human dignity.”, Ethics Inf Technol, 21, 75–87. https://doi.org/10.1007/s10676-018-9494-0
Simion, M., Kelp, C. (2023) “Trustworthy artificial intelligence.”, Asian Journal of Philosophy, 2, 8. https://doi.org/10.1007/s44204-023-00063-5
Sutrop, M. (2019) “Should we trust artificial intelligence?”, Trames: A Journal of the Humanities and Social Sciences, 23(4): 499–522.
Taddeo, M. (2009) “Defining trust and E-trust: from old theories to new problems.”, Int J Technol Human Interact, 5(2): 23–35. https://doi.org/10.4018/jthi.2009040102
Tallant, J. (2019) “You Can Trust the Ladder, But You Shouldn't”, Theoria, 85, 102-118.
Tallant, J. (2022) “Trusting What Ought to Happen”, Erkenntnis. https://doi.org/10.1007/s10670-022-00608-9
Tallant, J., Donati, D. (2020) “Trust: from the Philosophical to the Commercial.”, Philosophy of Management, 19, 3–19. https://doi.org/10.1007/s40926-019-00107-y
Wheeler, M. (2019) “Autonomy”, in Dubber, M., Pasquale, F. and Das, S. (eds.), Oxford Handbook of Ethics of AI, Oxford University Press, New York.
Zanotti, G., Petrolo, M., Chiffi, D. et al. (2023). “Keep trusting! A plea for the notion of Trustworthy AI.”, AI & Society. https://doi.org/10.1007/s00146-023-01789-9
Copyright (c) 2025 Donatella Donati

This work is licensed under a Creative Commons Attribution 4.0 International License.