TY - JOUR
T1 - A real-time eating detection system for capturing eating moments and triggering ecological momentary assessments to obtain further context
T2 - System development and validation study
AU - Morshed, Mehrab Bin
AU - Kulkarni, Samruddhi Shreeram
AU - Li, Richard
AU - Saha, Koustuv
AU - Roper, Leah Galante
AU - Nachman, Lama
AU - Lu, Hong
AU - Mirabella, Lucia
AU - Srivastava, Sanjeev
AU - De Choudhury, Munmun
AU - De Barbaro, Kaya P.
AU - Ploetz, Thomas
AU - Abowd, Gregory D.
N1 - Funding Information:
MBM was partly supported by a grant from Siemens FutureMaker Fellowship Task Order #7. MDC was partly supported by NIH grant #R01MH117172. The project was partly supported by a grant from Semiconductor Research Corporation in collaboration with Intel Labs.
Publisher Copyright:
© 2020 JMIR Publications. All rights reserved.
PY - 2020/12
Y1 - 2020/12
N2 - Background: Eating behavior has a high impact on the well-being of an individual. Such behavior involves not only when an individual is eating, but also various contextual factors such as with whom and where an individual is eating and what kind of food the individual is eating. Despite the relevance of such factors, most automated eating detection systems are not designed to capture contextual factors. Objective: The aims of this study were to (1) design and build a smartwatch-based eating detection system that can detect meal episodes based on dominant hand movements, (2) design ecological momentary assessment (EMA) questions to capture meal contexts upon detection of a meal by the eating detection system, and (3) validate the meal detection system that triggers EMA questions upon passive detection of meal episodes. Methods: The meal detection system was deployed among 28 college students at a US institution over a period of 3 weeks. The participants reported various contextual data through EMAs triggered when the eating detection system correctly detected a meal episode. The EMA questions were designed after conducting a survey study with 162 students from the same campus. Responses from EMAs were used to define exclusion criteria. Results: Among the total consumed meals, 89.8% (264/294) of breakfast, 99.0% (406/410) of lunch, and 98.0% (589/601) of dinner episodes were detected by our novel meal detection system. The eating detection system showed a high accuracy by capturing 96.48% (1259/1305) of the meals consumed by the participants. The meal detection classifier showed a precision of 80%, recall of 96%, and F1 of 87.3%. We found that over 99% (1248/1259) of the detected meals were consumed with distractions. Such eating behavior is considered "unhealthy" and can lead to overeating and uncontrolled weight gain. A high proportion of meals was consumed alone (680/1259, 54.01%). Our participants self-reported 62.98% (793/1259) of their meals as healthy. Together, these results have implications for designing technologies to encourage healthy eating behavior. Conclusions: The presented eating detection system is the first of its kind to leverage EMAs to capture the eating context, which has strong implications for well-being research. We reflected on the contextual data gathered by our system and discussed how these insights can be used to design individual-specific interventions.
KW - Eating behavior
KW - Eating context
KW - Eating detection
KW - Ecological momentary assessment
KW - Smartwatch
KW - Well-being
UR - http://www.scopus.com/inward/record.url?scp=85098068874&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85098068874&partnerID=8YFLogxK
U2 - 10.2196/20625
DO - 10.2196/20625
M3 - Article
C2 - 33337336
AN - SCOPUS:85098068874
SN - 2291-5222
VL - 8
JO - JMIR mHealth and uHealth
JF - JMIR mHealth and uHealth
IS - 12
M1 - e20625
ER -