Attention Calibration for Transformer-based Sequential Recommendation

By Peilin Zhou et al.
Published on Oct. 21, 2023

Table of Contents

1. Introduction
2. Related Work
2.1 Sequential Recommendation
2.2 Debates on Attention Mechanism
3. Preliminary
3.1 Problem Setup
3.2 Transformer-based Recommenders
3.2.1 Embedding Layer
3.2.2 Transformer Block

Summary

Transformer-based sequential recommenders rely heavily on the self-attention mechanism, yet the attention weights they learn are not always reliable indicators of item importance. The paper introduces Attention Calibration for Transformer-based Sequential Recommendation (AC-TSR), a framework that directly calibrates these learned attention weights. It adds two modules: a Spatial Calibrator, which refines attention weights by explicitly capturing the spatial relationships (i.e., order and distance) between positions in the sequence, and an Adversarial Calibrator, which redistributes attention weights according to each position's actual contribution to the final prediction. Experimental results show that AC-TSR outperforms existing transformer-based sequential recommendation methods.
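
To make the core idea concrete, below is a minimal PyTorch sketch of attention calibration in general: raw attention logits are adjusted by a learned offset before normalization. The class `CalibratedSelfAttention` and its `calib_mlp` are hypothetical stand-ins for illustration only, not the paper's Spatial or Adversarial Calibrator designs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CalibratedSelfAttention(nn.Module):
    """Single-head self-attention whose raw attention logits are adjusted
    by an additive calibration term before softmax normalization.

    Illustrative sketch only: `calib_mlp` is a hypothetical stand-in for
    AC-TSR's calibrators, whose exact designs are given in the paper.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5
        # Hypothetical calibrator: maps each raw logit to a learned offset.
        self.calib_mlp = nn.Sequential(
            nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -- sequence of item embeddings
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        logits = torch.matmul(q, k.transpose(-2, -1)) * self.scale

        # Calibrate: add a learned offset to every attention logit,
        # then renormalize so the weights still sum to one per query.
        offset = self.calib_mlp(logits.unsqueeze(-1)).squeeze(-1)
        weights = F.softmax(logits + offset, dim=-1)
        return torch.matmul(weights, v)

# Usage: apply calibrated attention to a batch of embedded sequences.
attn = CalibratedSelfAttention(d_model=64)
seq = torch.randn(2, 10, 64)  # (batch=2, seq_len=10, d_model=64)
out = attn(seq)
print(out.shape)  # torch.Size([2, 10, 64])
```

In AC-TSR itself, the correction comes from the spatial and adversarial calibrators summarized above; this sketch only shows where such a correction plugs into standard self-attention.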