TY - GEN
T1 - A Pre-trained Zero-shot Sequential Recommendation Framework via Popularity Dynamics
AU - Wang, Junting
AU - Rathi, Praneet
AU - Sundaram, Hari
N1 - This work was generously supported by the National Science Foundation (NSF) under grant number 2312561. We would also like to thank the anonymous reviewers for their valuable feedback.
PY - 2024/10/8
Y1 - 2024/10/8
N2 - This paper proposes a novel pre-trained framework for zero-shot cross-domain sequential recommendation without auxiliary information. While using auxiliary information (e.g., item descriptions) seems promising for cross-domain transfer, cross-domain adaptation of sequential recommenders can be challenging when the target domain differs from the source domain: item descriptions may be in different languages, and metadata modalities (e.g., audio, image, and text) may differ across the source and target domains. If we can learn universal item representations independent of the domain type (e.g., groceries, movies), we can achieve zero-shot cross-domain transfer without auxiliary information. Our key insight is that user interaction sequences reveal shifting user preferences through the popularity dynamics of the interacted items. We present PrepRec, a pre-trained sequential recommendation framework built on a novel popularity-dynamics-aware transformer architecture. Through extensive experiments on five real-world datasets, we show that PrepRec, without any auxiliary information, can zero-shot adapt to new application domains and achieve performance competitive with state-of-the-art sequential recommenders. In addition, we show that PrepRec complements existing sequential recommenders: with a simple post-hoc interpolation, it improves their performance on average by 11.8% in Recall@10 and 22% in NDCG@10. An anonymized implementation of PrepRec is available at https://github.com/CrowdDynamicsLab/preprec.
KW - Recommender System
KW - Zero-shot Sequential Recommendation
UR - http://www.scopus.com/inward/record.url?scp=85210523314&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85210523314&partnerID=8YFLogxK
U2 - 10.1145/3640457.3688145
DO - 10.1145/3640457.3688145
M3 - Conference contribution
AN - SCOPUS:85210523314
T3 - RecSys 2024 - Proceedings of the 18th ACM Conference on Recommender Systems
SP - 433
EP - 443
BT - RecSys 2024 - Proceedings of the 18th ACM Conference on Recommender Systems
PB - Association for Computing Machinery
T2 - 18th ACM Conference on Recommender Systems, RecSys 2024
Y2 - 14 October 2024 through 18 October 2024
ER -