‘If you agree with me, do I trust you?’: An examination of human-agent trust from a psychological perspective

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Applications of automated agent systems in daily life have changed the role of human operators from controller to teammate. However, this ‘teammate’ relationship between humans and agents raises an important but challenging question: how do humans develop trust when interacting with automated agents that are human-like? In this study, a two-phase online experiment was conducted to examine the effect of attitudinal congruence and individual personality on users’ trust toward an anthropomorphic agent. Our results suggest that the degree of an agent’s response congruence had no significant impact on users’ trust toward the agent. In terms of individual personality, we found one personality trait that has a significant impact on users’ formation of human-agent trust. Although our data do not support an effect of attitudinal congruence on human-agent trust formation, this study provides essential empirical evidence to benefit future research in this field. More importantly, in this paper we address the unusual challenges in our experimental design and what our null results imply about the formation of human-agent trust. This study not only sheds light on trust formation in human-agent collaboration but also provides insight for the future design of automated agent systems.

Original language: English (US)
Title of host publication: Intelligent Systems and Applications - Proceedings of the 2019 Intelligent Systems Conference IntelliSys Volume 2
Editors: Yaxin Bi, Rahul Bhatia, Supriya Kapoor
Publisher: Springer-Verlag
Pages: 994-1013
Number of pages: 20
ISBN (Print): 9783030295127
DOI: 10.1007/978-3-030-29513-4_73
State: Published - Jan 1 2020
Event: Intelligent Systems Conference, IntelliSys 2019 - London, United Kingdom
Duration: Sep 5 2019 – Sep 6 2019

Publication series

Name: Advances in Intelligent Systems and Computing
Volume: 1038
ISSN (Print): 2194-5357
ISSN (Electronic): 2194-5365

Conference

Conference: Intelligent Systems Conference, IntelliSys 2019
Country: United Kingdom
City: London
Period: 9/5/19 – 9/6/19


Keywords

  • Decision-aids automation
  • Human-agent interaction
  • Personality traits
  • Social psychology
  • Trust formation

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science(all)

Cite this

Huang, H. Y., Twidale, M. B., & Bashir, M. N. (2020). ‘If you agree with me, do I trust you?’: An examination of human-agent trust from a psychological perspective. In Y. Bi, R. Bhatia, & S. Kapoor (Eds.), Intelligent Systems and Applications - Proceedings of the 2019 Intelligent Systems Conference IntelliSys Volume 2 (pp. 994-1013). (Advances in Intelligent Systems and Computing; Vol. 1038). Springer-Verlag. https://doi.org/10.1007/978-3-030-29513-4_73
