Gesture-mediated collaboration with augmented reality headsets in a problem-based astronomy task

James Planey, Robin Jephthah Rajarathinam, Emma Mercier, Robb Lindgren

Research output: Contribution to journal › Article › peer-review

Abstract

Extended reality technologies such as headset-based augmented reality (AR) unlock unique opportunities to integrate gestures into the collaborative problem-solving process. This qualitative study documents the collection and analysis of group interaction data from an astronomy sky simulation used across AR and tablet technologies in a classroom setting. A total of 15 groups were coded for episodes of on-task problem solving, conceptual engagement, and use of gesture. Analysis of the coded interactions helped identify vignettes in which gesture facilitated exploration, orientation, perspective sharing, and communication of mental models. In addition, the use of gesture by some groups enabled the creation of shared situated conceptual spaces, bridging the AR and tablet experiences and facilitating collaborative exchange of spatial information. The patterns of gesture and collaborative knowledge interactions documented here have implications for the design of future collaborative learning environments that leverage extended reality technologies.

Original language: English (US)
Pages (from-to): 259-289
Number of pages: 31
Journal: International Journal of Computer-Supported Collaborative Learning
Volume: 18
Issue number: 2
DOIs
State: Published - Jun 2023

Keywords

  • Astronomy
  • Augmented reality
  • Collaborative learning
  • Embodied learning
  • Gesture

ASJC Scopus subject areas

  • Education
  • Human-Computer Interaction
