Complexity-Based Prompting for Multi-Step Reasoning

By Y. Fu et al.
Published on Jan. 30, 2023

Table of Contents

1. Introduction
2. Related Work
3. Complexity-Based Prompting
4. Experiments

Summary

This paper studies how to prompt large-scale language models for multi-step reasoning. The authors propose complexity-based prompting, a simple and effective scheme for selecting few-shot examples: the complexity of an example is measured by the number of reasoning steps in its annotated chain of thought, and the most complex examples are chosen as prompts. They show that prompts with higher reasoning complexity achieve substantially better performance than prompts built from simpler examples, and that selecting complex examples yields new state-of-the-art results on several math benchmarks and other reasoning tasks. The experiments further show that complexity-based prompting consistently outperforms existing example selection schemes while requiring only a minimal annotation budget.
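As a concrete illustration of the selection scheme summarized above, the sketch below ranks an annotated example pool by the number of reasoning steps in each chain of thought and builds a few-shot prompt from the most complex examples. The step-counting heuristic (newline-separated steps), the example schema, and the prompt template are assumptions for illustration, not the authors' released code.

```python
# Minimal sketch of complexity-based example selection for few-shot
# chain-of-thought prompting. Names, the step-counting heuristic, and the
# prompt format are illustrative assumptions, not the paper's exact code.

def count_reasoning_steps(chain_of_thought: str) -> int:
    """Approximate complexity as the number of non-empty, newline-separated steps."""
    return sum(1 for line in chain_of_thought.strip().split("\n") if line.strip())

def build_complex_prompt(annotated_pool, question, num_shots=8):
    """Pick the most complex annotated examples and format a few-shot prompt.

    annotated_pool: list of dicts with 'question', 'chain', and 'answer' keys
    (assumed schema).
    """
    # Rank candidate exemplars by descending reasoning complexity.
    ranked = sorted(
        annotated_pool,
        key=lambda ex: count_reasoning_steps(ex["chain"]),
        reverse=True,
    )
    shots = ranked[:num_shots]

    # Assemble the prompt: the most complex exemplars first, then the test question.
    parts = [
        f"Question: {ex['question']}\n{ex['chain']}\nThe answer is {ex['answer']}."
        for ex in shots
    ]
    parts.append(f"Question: {question}\nLet's think step by step.")
    return "\n\n".join(parts)
```

Counting newline-separated steps is a reasonable proxy for complexity in datasets such as GSM8K, where annotated solutions place each reasoning step on its own line; other corpora may need a different splitting rule.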