TY - GEN
T1 - Semi-Supervised Contrastive Learning for Human Activity Recognition
AU - Liu, Dongxin
AU - Abdelzaher, Tarek
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Recent developments in deep learning have motivated the use of deep neural networks in mobile sensing applications. Human Activity Recognition (HAR), as one of the most important mobile sensing applications, has enjoyed great success due to the utilization of deep neural networks. Motivated by the success of self-supervised learning frameworks in computer vision and natural language processing, self-supervised models have been proposed to efficiently leverage massive unlabeled data and reduce the labeling burden of HAR applications. Current approaches use self-supervised pre-training (with unlabeled data) followed by downstream training (with labeled data). However, we claim that labeled data can still help in the pre-training process and propose SemiC-HAR, a Semi-supervised Contrastive learning framework for HAR. SemiC-HAR efficiently uses both labeled and unlabeled data during the pre-training process and combines the advantages of supervised and self-supervised contrastive learning frameworks. We evaluate SemiC-HAR on six HAR datasets with multiple sensing signals and show performance comparable to previous supervised and semi-supervised models while using much lower fractions of labeled data.
AB - Recent developments in deep learning have motivated the use of deep neural networks in mobile sensing applications. Human Activity Recognition (HAR), as one of the most important mobile sensing applications, has enjoyed great success due to the utilization of deep neural networks. Motivated by the success of self-supervised learning frameworks in computer vision and natural language processing, self-supervised models have been proposed to efficiently leverage massive unlabeled data and reduce the labeling burden of HAR applications. Current approaches use self-supervised pre-training (with unlabeled data) followed by downstream training (with labeled data). However, we claim that labeled data can still help in the pre-training process and propose SemiC-HAR, a Semi-supervised Contrastive learning framework for HAR. SemiC-HAR efficiently uses both labeled and unlabeled data during the pre-training process and combines the advantages of supervised and self-supervised contrastive learning frameworks. We evaluate SemiC-HAR on six HAR datasets with multiple sensing signals and show performance comparable to previous supervised and semi-supervised models while using much lower fractions of labeled data.
KW - Contrastive Learning
KW - Human Activity Recognition
KW - Representation Learning
KW - Self-supervision
UR - http://www.scopus.com/inward/record.url?scp=85123320231&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123320231&partnerID=8YFLogxK
U2 - 10.1109/DCOSS52077.2021.00019
DO - 10.1109/DCOSS52077.2021.00019
M3 - Conference contribution
AN - SCOPUS:85123320231
T3 - Proceedings - 17th Annual International Conference on Distributed Computing in Sensor Systems, DCOSS 2021
SP - 45
EP - 53
BT - Proceedings - 17th Annual International Conference on Distributed Computing in Sensor Systems, DCOSS 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 17th Annual International Conference on Distributed Computing in Sensor Systems, DCOSS 2021
Y2 - 14 July 2021 through 16 July 2021
ER -