Infinite Mixture Prototypes for Few-Shot Learning

By K. R. Allen et al.

Table of Contents

Abstract
1. Introduction
2. Background
3. Infinite Mixture Prototypes (IMP)
3.1. Adapting capacity by learning cluster variance
3.2. Adapting capacity by multi-modal clustering
4. Ablations and Alternatives

Summary

The document proposes infinite mixture prototypes (IMP), which adaptively represent both simple and complex data distributions for few-shot learning. By inferring the number of clusters per class, IMP interpolates between nearest-neighbor and prototypical representations, improving accuracy and robustness. The importance of adaptive capacity and multi-modal clustering is highlighted, with significant accuracy improvements reported over existing methods. The paper also extends the approach to semi-supervised and unsupervised clustering, and explains the relevant background concepts for few-shot classification in detail.
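The core idea of inferring the number of clusters can be illustrated with a DP-means-style hard clustering sketch, in which a new cluster is spawned whenever a point lies farther than a threshold from every existing mean, so the cluster count adapts to the data. This is a minimal illustration of the multi-modal clustering idea, not the paper's exact algorithm; the threshold `lam`, the iteration count, and the toy data are all illustrative assumptions.

```python
import numpy as np

def dp_means(points, lam, n_iters=10):
    """DP-means-style hard clustering (illustrative sketch).

    A new cluster is created whenever a point is farther than `lam`
    from every existing mean, so the number of clusters is inferred
    from the data rather than fixed in advance.
    """
    means = [points[0].copy()]
    for _ in range(n_iters):
        # Assignment step: each point joins its nearest mean,
        # or spawns a new cluster if all means are too far away.
        assignments = []
        for x in points:
            dists = [np.linalg.norm(x - m) for m in means]
            j = int(np.argmin(dists))
            if dists[j] > lam:
                means.append(x.copy())
                j = len(means) - 1
            assignments.append(j)
        # Update step: recompute each non-empty cluster's mean.
        means = [np.mean([x for x, a in zip(points, assignments) if a == k], axis=0)
                 for k in range(len(means)) if k in assignments]
    return np.array(means), assignments

# Toy 1-D example: one "class" with two well-separated modes.
pts = np.array([[0.0], [0.1], [-0.1], [5.0], [5.1], [4.9]])
means, assign = dp_means(pts, lam=1.0)
print(len(means))  # two clusters are inferred, one per mode
```

With a single prototype per class (as in prototypical networks), the two modes above would collapse into one unrepresentative mean near 2.5; inferring two clusters keeps a faithful prototype for each mode, and in the limit of many small clusters this behaves like a nearest-neighbor classifier.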