Gesture Patterns and Learning in an Embodied XR Science Simulation

Jina Kang, Morgan Diederich, Robb Lindgren, Michael Junokas

Research output: Contribution to journal › Article › peer-review


Recent research has emphasized the importance of leveraging embodied interactions for learning critical STEM concepts. Elastic3s, an embodied environment for learning about cross-cutting concepts (e.g., non-linear growth), allows learners to interact with different science simulations through whole-body gestures. Technological advances in gesture recognition make it possible to track and respond to students’ gestures; however, there has been little investigation into how the gestures performed in these environments relate to subsequent learning. Sequential pattern recognition methods are critical in embodied learning if we are to understand how gestural interaction with a simulation facilitates learning. Using data collected via Microsoft Kinect V2 from twelve college students, we applied multivariate Dynamic Time Warping (DTW) clustering to identify gestural patterns in Elastic3s as evidence of embodied learning processes. Our findings showed that the identified trends of simulation use were indicative of students’ struggles to understand the underlying ideas or the use of the system and were associated with learning performance. These indicators can potentially be used to deliver real-time, in-simulation assistance and promote a more adaptive learning experience via embodied simulations.
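The core idea, comparing whole-body gesture trajectories that unfold at different speeds and then grouping similar ones, can be illustrated with multivariate DTW plus a simple clustering pass. The sketch below is not the authors' implementation: the function names, the greedy single-link clustering, and the toy 2-D trajectories are all invented for illustration (real Kinect V2 data would be sequences of 25 joint positions per frame).

```python
from math import dist, inf

def dtw_distance(a, b):
    """Multivariate DTW cost between two sequences of frames.

    Each frame is a feature vector (here a 2-D point; for Kinect data it
    would be a flattened vector of joint coordinates). Frames are compared
    with Euclidean distance, and the classic O(n*m) dynamic program finds
    the cheapest monotone alignment between the two sequences.
    """
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])  # frame-to-frame distance
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def cluster(seqs, threshold):
    """Greedy single-link clustering on pairwise DTW distances.

    Illustrative only: merges any two sequences whose DTW cost falls
    below `threshold` into the same cluster label.
    """
    labels = list(range(len(seqs)))
    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            if dtw_distance(seqs[i], seqs[j]) <= threshold:
                old, new = labels[j], labels[i]
                labels = [new if lab == old else lab for lab in labels]
    return labels

# Toy "gestures": two similar ramps performed at different speeds,
# and one oscillating trajectory with a very different shape.
g1 = [(0.10 * t, 0.1 * t) for t in range(20)]
g2 = [(0.12 * t, 0.1 * t) for t in range(18)]
g3 = [((-1) ** t * 1.0, 0.0) for t in range(20)]

labels = cluster([g1, g2, g3], threshold=5.0)  # g1 and g2 group together
```

Because DTW warps the time axis, the two ramps align cheaply despite their different durations, while the oscillating trajectory stays far from both, which is the property that makes DTW suitable for comparing gestures performed at learner-specific speeds.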

Original language: English (US)
Pages (from-to): 77-92
Number of pages: 16
Journal: Educational Technology and Society
Issue number: 2
State: Published - 2021


Keywords

  • DTW clustering
  • Embodied learning
  • Gesture recognition
  • Time series analysis
  • XR Science education simulations

ASJC Scopus subject areas

  • Education
  • Sociology and Political Science
  • General Engineering
