MAE4Rec

Proceedings of the 31st ACM International Conference on Information & Knowledge Management (2022)

Abstract
Sequential recommender systems (SRS) aim to infer users' preferences from their interaction history and predict items that will be of interest to them. Most SRS models incorporate all historical interactions when recommending the next item. Despite their success, feeding all interactions into the model without filtering raises serious practical issues: (i) redundant interactions hinder the SRS model from capturing the users' intentions; (ii) the computational cost is high, as the computational complexity grows with the length of the interaction sequence; (iii) additional memory is required to store all interaction records of all users. To this end, we propose MAE4Rec, a novel storage-saving SRS framework based on a unidirectional self-attentive mechanism and a masked autoencoder. Specifically, to lower storage consumption, MAE4Rec first masks and discards a large percentage of historical interactions and then infers the next interacted item solely from the latent representations of the unmasked ones. Experiments on two real-world datasets demonstrate that the proposed model achieves competitive performance against state-of-the-art SRS models while compressing storage by more than 40%.
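To make the core idea concrete, below is a minimal PyTorch sketch of the masking-and-unidirectional-encoding scheme described in the abstract. This is not the authors' implementation: the class name MaskedSeqRecSketch, the 50% masking ratio, the layer sizes, and the omission of padding handling are all illustrative assumptions.

```python
# Illustrative sketch only: mask/discard part of the interaction sequence, encode
# the kept items with a causal (unidirectional) self-attention encoder, and score
# candidate next items. All names and hyperparameters are assumptions.
import torch
import torch.nn as nn


class MaskedSeqRecSketch(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64, mask_ratio: float = 0.5):
        super().__init__()
        # Item id 0 is reserved for padding (padding handling is omitted below).
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        self.mask_ratio = mask_ratio
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, seq_len) tensor of item ids.
        batch, seq_len = seq.shape
        # Randomly keep only a fraction of positions and discard the rest,
        # mirroring "mask and discard a large percentage of historical interactions".
        keep = max(1, int(seq_len * (1.0 - self.mask_ratio)))
        scores = torch.rand(batch, seq_len, device=seq.device)
        idx = scores.argsort(dim=1)[:, :keep].sort(dim=1).values   # keep original order
        kept_items = torch.gather(seq, 1, idx)                      # (batch, keep)
        x = self.item_emb(kept_items)                               # (batch, keep, d)
        # Unidirectional (causal) self-attention over the kept positions only.
        causal = torch.triu(
            torch.ones(keep, keep, dtype=torch.bool, device=seq.device), diagonal=1
        )
        h = self.encoder(x, mask=causal)                            # (batch, keep, d)
        # Next-item scores from the last hidden state against all item embeddings.
        return h[:, -1, :] @ self.item_emb.weight.T                 # (batch, num_items+1)


# Example usage (hypothetical sizes):
# model = MaskedSeqRecSketch(num_items=10000)
# logits = model(torch.randint(1, 10001, (8, 50)))   # (8, 10001)
```

In such a scheme, the storage and compute savings come from the fact that only the kept subset of interactions is embedded and attended over, so both memory and attention cost shrink roughly in proportion to the masking ratio.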