MemoNet: Memorizing All Cross Features’ Representations Efficiently via Multi-Hash Codebook Network for CTR Prediction

By Pengtao Zhang et al.
Published on Oct. 21, 2023

Table of Contents

1. INTRODUCTION
2. RELATED WORKS
3. PRELIMINARIES
4. METHODOLOGY
4.1 Multi-Hash Codebook
4.2 HCNet

Summary

The document argues that memorization, a key ingredient behind large language models in NLP, matters for CTR prediction as well, and introduces a memory mechanism that learns and memorizes representations of cross features efficiently. For this purpose it presents the multi-Hash Codebook NETwork (HCNet) and combines it with a DNN backbone to form a model called MemoNet. The proposed approach achieves superior performance on CTR tasks and exhibits scaling behavior similar to that of large language models in NLP. The document also covers related works, the preliminaries of DNN-based CTR models, and the methodology of HCNet, which proceeds in three phases: multi-hash addressing, memory restoring, and feature shrinking.
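
The sketch below illustrates the three HCNet phases named above with a minimal PyTorch module. It is not the paper's implementation: the class and parameter names (HCNetSketch, codebook_size, num_hashes, slot_dim) are assumptions, and multiplicative integer hashing stands in for the paper's hash functions.

```python
import torch
import torch.nn as nn


class HCNetSketch(nn.Module):
    """Illustrative sketch of HCNet's three phases: multi-hash addressing,
    memory restoring, and feature shrinking. Hyperparameter names and the
    hashing scheme are assumptions made for this example, not the paper's."""

    def __init__(self, codebook_size=100_000, num_hashes=2, slot_dim=16, out_dim=16):
        super().__init__()
        self.num_hashes = num_hashes
        self.codebook_size = codebook_size
        # Shared multi-hash codebook: a table of learnable memory slots.
        self.codebook = nn.Embedding(codebook_size, slot_dim)
        # Random odd-ish multipliers act as stand-in hash functions (assumption).
        self.register_buffer("hash_seeds", torch.randint(1, 2**31 - 1, (num_hashes,)))
        # Feature shrinking: compress the restored memories into a compact vector.
        self.shrink = nn.Linear(num_hashes * slot_dim, out_dim)

    def forward(self, cross_feature_ids: torch.LongTensor) -> torch.Tensor:
        # cross_feature_ids: (batch,) integer ids of cross features,
        # e.g. derived from the concatenated raw feature values.
        restored = []
        for k in range(self.num_hashes):
            # Phase 1 - multi-hash addressing: each hash function maps the
            # cross feature to a slot in the shared codebook.
            idx = (cross_feature_ids * int(self.hash_seeds[k])) % self.codebook_size
            # Phase 2 - memory restoring: read the addressed memory slots.
            restored.append(self.codebook(idx))
        # Phase 3 - feature shrinking: merge the per-hash memories into one
        # fixed-size representation consumed by the DNN backbone (MemoNet).
        return self.shrink(torch.cat(restored, dim=-1))


if __name__ == "__main__":
    model = HCNetSketch()
    ids = torch.randint(0, 10**9, (4,))  # four example cross-feature ids
    print(model(ids).shape)              # -> torch.Size([4, 16])
```

Because several hash functions address the same shared codebook, many cross features can be memorized with a fixed parameter budget, which is the efficiency argument the summary attributes to HCNet.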