Abstract
Chatbots are regarded as a promising technology for delivering guidance. Prior studies show that chatbots have the potential to coach users in learning different skills; however, several limitations of chatbot-based approaches remain. People may become disengaged from chatbot-guided systems and fail to follow the guidance for complex tasks. In this paper, we design chatbots with (HC) and without (OC) human support to deliver guidance for people practicing journaling skills. We conducted a mixed-method study with 35 participants to investigate their actual interaction, perceived interaction, and the effects of interacting with the two chatbots. The participants were randomly assigned to use one of the chatbots for four weeks. Our results show that the HC participants followed the guidance more faithfully during journaling practice and perceived a significantly higher level of engagement with and trust in the chatbot system than the OC participants. However, after finishing the journaling-skill training session, the OC participants were more willing to keep using the learned skills than the HC participants. Our work provides new insights into the design of integrating human support into chatbot-based interventions for delivering guidance.
| Original language | English (US) |
|---|---|
| Article number | 3449196 |
| Journal | Proceedings of the ACM on Human-Computer Interaction |
| Volume | 5 |
| Issue number | CSCW1 |
| DOIs | |
| State | Published - Apr 22 2021 |
Keywords
- Chatbot
- Conversational agent
- Human support
- Self-disclosure
- Training
ASJC Scopus subject areas
- Social Sciences (miscellaneous)
- Human-Computer Interaction
- Computer Networks and Communications