Abstract
Theories of embodied cognition argue that human thinking and reasoning are deeply connected with the actions and perceptions of the body. Recent research suggests that these theories can be successfully applied to the design of learning environments, and new technologies enable multimodal platforms that respond to students’ natural physical activity, such as their gestures. This study examines how students engaged with an embodied mixed-reality science learning simulation that uses advanced gesture recognition techniques to support full-body interaction. The simulation environment acts as a communication platform through which students articulate their understanding of non-linear growth in different science contexts. In particular, this study investigates the multimodal interaction metrics generated as students attempted to make sense of cross-cutting science concepts using a personalized gesture scheme. Starting with video recordings of students’ full-body gestures, we examined the relationship between these embodied expressions and students’ subsequent success in reasoning about non-linear growth. We report the patterns we identified and explicate our findings by detailing several insightful cases of student interaction. We conclude by discussing implications for the design of multimodal interaction technologies and for the metrics used to investigate different types of student interactions during learning.
| Original language | English (US) |
| --- | --- |
| Article number | 39 |
| Number of pages | 20 |
| Journal | Multimodal Technologies and Interaction |
| Volume | 2 |
| Issue number | 3 |
| DOIs | |
| State | Published - Sep 2018 |
Keywords
- Embodied learning
- Multimodal interaction metrics
- Multimodal learning environments
- Science education simulations
ASJC Scopus subject areas
- Computer Networks and Communications
- Computer Science Applications
- Human-Computer Interaction
- Neuroscience (miscellaneous)