Exploring the Complementary Features of Audio and Text Notes for Video-based Learning in Mobile Settings

Si Chen, Dennis Wang, Yun Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we compared audio and text note-taking behaviors for video-based learning in various mobile settings. We designed and implemented a note-taking tool and conducted a task-based study to examine how users took audio and text notes differently. Our results show that participants' audio notes were significantly longer than their text notes; longer audio notes were taken to capture unfamiliar video content and participants' emotions. However, audio notes also raised several privacy concerns. Text notes allowed participants to revise for better accuracy and deeper reflection. Our findings on the complementary features of audio and text notes for video-based learning shed light on the design of future note-taking tools that can facilitate learning in varied mobile settings.

Original language: English (US)
Title of host publication: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, CHI EA 2021
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450380959
DOIs
State: Published - May 8 2021
Event: 2021 CHI Conference on Human Factors in Computing Systems: Making Waves, Combining Strengths, CHI EA 2021 - Virtual, Online, Japan
Duration: May 8 2021 - May 13 2021

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings

Conference

Conference: 2021 CHI Conference on Human Factors in Computing Systems: Making Waves, Combining Strengths, CHI EA 2021
Country/Territory: Japan
City: Virtual, Online
Period: 5/8/21 - 5/13/21

Keywords

  • Audio interaction
  • Mixed-method study
  • Note-taking

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
  • Software
