TY - GEN
T1 - Classification of Infant Sleep/Wake States: Cross-Attention among Large Scale Pretrained Transformer Networks using Audio, ECG, and IMU Data
T2 - 2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2023
AU - Chang, Kai Chieh
AU - Hasegawa-Johnson, Mark
AU - McElwain, Nancy L.
AU - Islam, Bashima
N1 - We would like to thank the families who participated in this research, as well as Jordan Bodway and Jenny Baldwin for their assistance with data collection and processing. This work was supported by funding from the National Institute of Mental Health (R21MH112578), the National Institute on Drug Abuse (R34DA050256), the National Institute of Food and Agriculture (ILLU-793-368), and the Personalized Nutrition Initiative and the Center for Social and Behavioral Science at the University of Illinois at Urbana-Champaign through the Seed Grant program.
PY - 2023
Y1 - 2023
AB - Infant sleep is critical to brain and behavioral development. Prior studies of infant sleep/wake classification have largely relied on expensive and burdensome laboratory polysomnography (PSG) tests or on wearable devices that collect single-modality data. To ease data collection and improve detection accuracy, we aimed to advance this field by using a multi-modal wearable device, LittleBeats (LB), to collect audio, electrocardiogram (ECG), and inertial measurement unit (IMU) data from a cohort of 28 infants. We employed a 3-branch (audio/ECG/IMU) large-scale transformer-based neural network (NN) to demonstrate the potential of such multi-modal data. We pretrained each branch independently on its respective modality, then fine-tuned the model by fusing the transformer layers with cross-attention. We show that multi-modal data significantly improves sleep/wake classification (accuracy = 0.880) compared with the use of a single modality (accuracy = 0.732). Our approach to multi-modal mid-level fusion may be adaptable to a diverse range of architectures and tasks, expanding future directions of infant behavioral research.
UR - http://www.scopus.com/inward/record.url?scp=85180013245&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85180013245&partnerID=8YFLogxK
U2 - 10.1109/APSIPAASC58517.2023.10317201
DO - 10.1109/APSIPAASC58517.2023.10317201
M3 - Conference contribution
AN - SCOPUS:85180013245
T3 - 2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2023
SP - 2370
EP - 2377
BT - 2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 31 October 2023 through 3 November 2023
ER -