DeText: A Deep Text Ranking Framework with BERT

By Weiwei Guo et al.

Table of Contents

1. INTRODUCTION
2. RELATED WORK
3. SEARCH SYSTEMS AT LINKEDIN
4. DETEXT FRAMEWORK FOR BERT-BASED RANKING MODEL
5. ONLINE DEPLOYMENT STRATEGY

Summary

Ranking is crucial in search systems that deal with natural language data, and deep learning models such as BERT have shown promise in improving ranking quality. The authors investigate how to build an efficient BERT-based ranking model, resulting in DeText, an open-sourced deep text ranking framework. DeText-BERT fine-tunes BERT for ranking and achieves significant improvements in real-world search systems, while the framework remains flexible and efficient by supporting a variety of neural network components. Two deployment strategies, document pre-computing and two-pass ranking, address serving-latency challenges in online deployment.
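To make the document pre-computing strategy concrete, the sketch below illustrates the general idea in Python: document embeddings are computed offline with a BERT encoder and stored, so that only the query needs to be encoded at request time before scoring with a cheap similarity. This is a minimal sketch, not the DeText implementation; the Hugging Face transformers library, the bert-base-uncased checkpoint, CLS pooling, cosine scoring, and the example documents are all assumptions made for illustration.

```python
# Minimal sketch of document pre-computing for BERT-based ranking.
# Assumptions (not from the paper): Hugging Face transformers,
# the bert-base-uncased checkpoint, CLS pooling, cosine similarity.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def encode(texts):
    """Encode a list of texts into L2-normalized CLS embeddings."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, seq_len, dim)
    cls = hidden[:, 0, :]                              # CLS vector per text
    return torch.nn.functional.normalize(cls, dim=-1)

# Offline: pre-compute and store document embeddings (hypothetical documents).
documents = [
    "software engineer experienced in deep learning",
    "sales manager for enterprise accounts",
    "research scientist working on natural language processing",
]
doc_embeddings = encode(documents)                     # stored in the index

# Online: only the incoming query is encoded, keeping serving latency low.
query_embedding = encode(["machine learning engineer"])
scores = query_embedding @ doc_embeddings.T            # cosine similarity
ranking = scores.squeeze(0).argsort(descending=True)
print([documents[i] for i in ranking])
```

In the two-pass alternative summarized above, a cheaper first-pass ranker would narrow the candidate set so that the BERT-based model only re-scores the top results at query time.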