Using one-shot machine learning to implement real-time multimodal learning analytics

Michael J. Junokas, Greg Kohlburn, Sahil Kumar, Benjamin Lane, Wai Tat Fu, Robb Lindgren

Research output: Contribution to journal › Conference article › peer-review


Educational research has demonstrated the importance of embodiment in the design of student learning environments, connecting bodily actions to critical concepts. Gestural recognition algorithms have become important tools in leveraging this connection but are limited in their development, focusing primarily on traditional machine-learning paradigms. We describe our approach to real-time learning analytics, using a gesture-recognition system to interpret movement in an educational context. We train a hybrid parametric, hierarchical hidden-Markov model using a one-shot construct, learning from singular, user-defined gestures. This model gives us access to three different data streams: skeleton positions, kinematic features, and internal model parameters. Such a structure presents many challenges, including anticipating the optimal feature sets to analyze and creating effective mapping schemas. Despite these challenges, our method allows users to engage in productive simulation interactions, fusing these streams into embodied semiotic structures defined by the individual. This work has important implications for the future of multimodal learning analytics and educational technology.
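To illustrate the one-shot idea described above, the following is a minimal, hypothetical sketch (not the authors' implementation): a left-to-right Gaussian hidden Markov model is initialized from a single demonstration gesture, then used to score new movement sequences with the forward algorithm. The function names, the segment-based initialization, and the isotropic-Gaussian emissions are all illustrative assumptions.

```python
import numpy as np

def one_shot_hmm(demo, n_states=4, var=1.0):
    """Build a left-to-right HMM from one demonstration (illustrative).

    demo: (T, D) array of feature vectors (e.g. skeleton joint positions).
    Returns (means, trans, var): state emission means, transition matrix,
    and a shared isotropic variance.
    """
    # Split the single demonstration into equal temporal segments;
    # each segment's mean becomes one state's emission mean.
    segments = np.array_split(demo, n_states)
    means = np.stack([s.mean(axis=0) for s in segments])
    # Left-to-right transitions: stay in a state or advance to the next.
    trans = np.zeros((n_states, n_states))
    for i in range(n_states):
        if i + 1 < n_states:
            trans[i, i] = 0.5
            trans[i, i + 1] = 0.5
        else:
            trans[i, i] = 1.0  # final state absorbs
    return means, trans, var

def log_likelihood(seq, means, trans, var):
    """Forward algorithm in log space: how well does seq match the gesture?"""
    n_states, D = means.shape

    def emit_logp(x):
        # Isotropic Gaussian log-density of observation x under each state.
        diff = means - x
        return (-0.5 * (diff ** 2).sum(axis=1) / var
                - 0.5 * D * np.log(2 * np.pi * var))

    log_alpha = np.full(n_states, -np.inf)
    log_alpha[0] = emit_logp(seq[0])[0]  # gesture must start in state 0
    with np.errstate(divide="ignore"):
        log_trans = np.log(trans)
    for x in seq[1:]:
        log_alpha = emit_logp(x) + np.array([
            np.logaddexp.reduce(log_alpha + log_trans[:, j])
            for j in range(n_states)
        ])
    return np.logaddexp.reduce(log_alpha)
```

A model trained this way from one user-defined gesture can rank candidate movements in real time: a sequence resembling the demonstration scores a higher log-likelihood than one that traverses the trajectory in the wrong order.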

Original language: English (US)
Pages (from-to): 89-93
Number of pages: 5
Journal: CEUR Workshop Proceedings
State: Published - 2017
Event: Joint 6th Multimodal Learning Analytics Workshop and the Second Cross-LAK Workshop, MMLA-CrossLAK 2017 - Vancouver, Canada
Duration: Mar 14 2017 → …

Keywords

  • Cognitive embodiment
  • Educational technology
  • Gesture recognition
  • One-shot machine learning

ASJC Scopus subject areas

  • General Computer Science


