A Universal Representation Transformer Layer for Few-Shot Image Classification

Lu Liu
Will Hamilton
Guodong Long
Jing Jiang
(2021)

Abstract

Few-shot classification aims to recognize unseen classes given only a few samples.
We consider the problem of multi-domain few-shot image classification, where unseen classes and examples come from diverse data sources. This problem has seen growing interest and has inspired the development of benchmarks such as Meta-Dataset. A key challenge in this multi-domain setting is effectively integrating the feature representations from the diverse set of training domains.
Here, we propose a Universal Representation Transformer (URT) layer that meta-learns to leverage universal features for few-shot classification by dynamically re-weighting and composing the most appropriate domain-specific representations.
In experiments, we show that URT sets a new state-of-the-art result on Meta-Dataset.
Specifically, it outperforms the best previous model on 3 data sources and matches it on the rest.
We analyze variants of URT and present a visualization of the attention score heatmaps that sheds light on how the model performs cross-domain generalization.
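The core idea described above — attention scores that re-weight and compose domain-specific features into a single universal representation — can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual implementation: the function name `urt_layer`, the dot-product scoring, and the use of a single query vector are all assumptions made for illustration.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def urt_layer(domain_feats, query, temperature=1.0):
    """Hypothetical sketch of a URT-style attention layer.

    domain_feats: (num_domains, dim) array, one feature vector per
                  domain-specific backbone for a given image.
    query:        (dim,) vector, e.g. derived from the task's support set.
    Returns an attention-weighted composition of the domain features.
    """
    scores = domain_feats @ query / temperature  # one score per domain
    weights = softmax(scores)                    # re-weighting across domains
    return weights @ domain_feats                # composed universal feature

# toy usage: 3 domain backbones producing 4-dim features
feats = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0]])
query = np.array([1.0, 0.0, 0.0, 0.0])  # most aligned with domain 0
rep = urt_layer(feats, query)
```

In this toy example the first domain receives the largest attention weight because its feature vector aligns with the query, so the composed representation is dominated by that domain.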