TY - JOUR
T1 - What radio waves tell us about sleep!
AU - He, Hao
AU - Li, Chao
AU - Ganglberger, Wolfgang
AU - Gallagher, Kaileigh
AU - Hristov, Rumen
AU - Ouroutzoglou, Michail
AU - Sun, Haoqi
AU - Sun, Jimeng
AU - Westover, M. Brandon
AU - Katabi, Dina
N1 - Financial disclosure: D.K. receives research funding from the NIH, NSF, Sanofi, Takeda, IBM, the Gwangju Institute of Science and Technology, the Michael J. Fox Foundation, the Helmsley Charitable Trust, and the Rett Syndrome Research Trust. She is a co-founder of Emerald Innovations and has a personal equity interest in the company. She also serves on the board of directors of Cyclerion, the scientific advisory board of Janssen, and the data science advisory board of Amgen. M.B.W. receives funding from the NIH and NSF. He is a co-founder of, serves as a scientific advisor and consultant to, and has a personal equity interest in Beacon Biosignals. J.S. receives research funding from the NSF. R.H. is a co-founder of Emerald Innovations and has a personal equity interest in the company. These relationships played no role in the present study. The remaining authors have no financial disclosures. Nonfinancial disclosure: None.
This work is funded by the National Science Foundation Award No. IIS-2014391. We would like to express our gratitude to the members of Katabi's lab at MIT for their comments on our manuscript. We also thank Professor Wenlong Mou from the University of Toronto for his thorough review of the statistical methods in the paper and his insightful feedback. Additionally, we extend our thanks to all the research participants for their time and contributions. The content of this paper is solely the responsibility of the authors and does not necessarily represent the official views of NSF or other sponsors.
PY - 2025/1/1
Y1 - 2025/1/1
AB - The ability to assess sleep at home, capture sleep stages, and detect the occurrence of apnea (without on-body sensors) simply by analyzing the radio waves bouncing off people’s bodies while they sleep is quite powerful. Such a capability would allow for longitudinal data collection in patients’ homes, informing our understanding of sleep and its interaction with various diseases and their therapeutic responses, both in clinical trials and routine care. In this article, we develop an advanced machine-learning algorithm for passively monitoring sleep and nocturnal breathing from radio waves reflected off people while asleep. Validation results in comparison with the gold standard (i.e. polysomnography; n = 880) demonstrate that the model captures the sleep hypnogram (with an accuracy of 80.5% for 30-second epochs categorized into wake, light sleep, deep sleep, or REM), detects sleep apnea (AUROC = 0.89), and measures the patient’s Apnea–Hypopnea Index (ICC = 0.90; 95% CI = [0.88, 0.91]). Notably, the model exhibits equitable performance across race, sex, and age. Moreover, the model uncovers informative interactions between sleep stages and a range of diseases including neurological, psychiatric, cardiovascular, and immunological disorders. These findings not only hold promise for clinical practice and interventional trials but also underscore the significance of sleep as a fundamental component in understanding and managing various diseases.
KW - apnea
KW - artificial intelligence
KW - contactless at-home sleep monitoring
KW - machine learning
KW - polysomnography
KW - sleep hypnogram
UR - http://www.scopus.com/inward/record.url?scp=85215292655&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85215292655&partnerID=8YFLogxK
U2 - 10.1093/sleep/zsae187
DO - 10.1093/sleep/zsae187
M3 - Article
C2 - 39155830
AN - SCOPUS:85215292655
SN - 0161-8105
VL - 48
JO - Sleep
JF - Sleep
IS - 1
M1 - zsae187
ER -