WITHDRAWN January 4, 2023

The artificial intelligence (AI) revolution is upon us, promising advances such as driverless cars, smart buildings, automated health diagnostics, and improved security monitoring. Many current efforts aim to measure system trustworthiness through measurements of accuracy, reliability, and explainability, among other system characteristics. While these characteristics are necessary, determining that an AI system is trustworthy because it meets its system requirements will not ensure widespread adoption of AI. It is the user, the human affected by the AI, who ultimately places their trust in the system. Trust in automated systems has long been a topic of psychological study. However, AI systems pose unique challenges for user trust. AI systems operate using patterns found in massive amounts of data. No longer are we asking automation to do human tasks; we are asking it to do tasks that we can't. Moreover, AI has the ability to learn and alter its own programming in ways we don't easily understand. The AI user has to trust the AI because of its complexity and unpredictability, changing the dynamic between user and system into a relationship. Alongside research toward building trustworthy systems, understanding user trust in AI will be necessary to achieve the benefits and minimize the risks of this new technology.
Citation
Stanton, B. and Jensen, T. (2021), Trust and Artificial Intelligence (Draft), NIST Interagency/Internal Report (NISTIR) 8332-draft, National Institute of Standards and Technology, Gaithersburg, MD, [online], https://doi.org/10.6028/NIST.IR.8332-draft, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=931087 (Accessed November 20, 2024)