I2D2: Inductive Knowledge Distillation with NeuroLogic and Self-Imitation

By Chandra Bhagavatula et al.
Published on May 26, 2023

Table of Contents

Abstract
1 Introduction
2 The I2D2 Framework
2.1 Prompt Construction
2.2 Constrained Generation using NeuroLogic Decoding
2.3 Supervised Critic
2.4 Self-Imitation Learning

Summary

I2D2 is a commonsense distillation framework that shows smaller language models can achieve competitive commonsense acquisition without relying on scale alone. It improves the generation quality of a weak language model through two techniques: NeuroLogic Decoding, which applies constraints at decoding time, and self-imitation learning, in which the model is iteratively fine-tuned on its own high-quality generations as judged by a supervised critic. The framework generates generic statements about everyday concepts and surpasses larger models such as GPT-3 in the precision of the knowledge it produces. I2D2 also yields Gen-A-tomic, a new corpus of generics that is larger and more accurate than existing resources such as GenericsKB.
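The pipeline described above (constrained generation, critic filtering, then fine-tuning on the surviving outputs) can be sketched as a toy loop. Everything here is illustrative: the function names, the scalar "model bias" standing in for fine-tuning, and the numeric thresholds are assumptions for the sketch, not the paper's actual API or hyperparameters.

```python
import random

def generate_candidates(model_bias, concept, n=20):
    """Mock constrained generation: emit candidate generic statements.

    Each candidate carries a latent quality score in [0, 1] that improves
    as the (toy) generator is fine-tuned; a fixed seed keeps runs deterministic.
    """
    rng = random.Random(0)
    return [(f"{concept} statement {i}", min(1.0, rng.random() + model_bias))
            for i in range(n)]

def critic(candidate, threshold=0.8):
    """Mock supervised critic: accept only candidates above a quality threshold."""
    _, quality = candidate
    return quality >= threshold

def self_imitation_round(model_bias, concepts):
    """One round: generate, filter with the critic, then 'fine-tune' the
    generator by nudging it toward its own accepted outputs."""
    accepted = []
    for concept in concepts:
        accepted += [c for c in generate_candidates(model_bias, concept)
                     if critic(c)]
    # Fine-tuning proxy: the generator's quality bias grows with the
    # fraction of its outputs that the critic accepted.
    new_bias = model_bias + 0.05 * (len(accepted) / (20 * len(concepts)))
    return new_bias, accepted

bias = 0.0
for round_idx in range(3):
    bias, kept = self_imitation_round(bias, ["dogs", "umbrellas"])
    print(f"round {round_idx}: kept {len(kept)} generics, bias={bias:.3f}")
```

Successive rounds keep at least as many generics as earlier ones, mirroring how self-imitation lets the model's later outputs pass the critic more often.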