
Fast Context Adaptation via Meta-Learning

Luisa Zintgraf, Kyriacos Shiarlis, Vitaly Kurin, Katja Hofmann and Shimon Whiteson

Abstract

We propose CAVIA, a simple extension to MAML for meta-learning that is less prone to meta-overfitting, easier to parallelise, and more interpretable. CAVIA partitions the model parameters into two parts: context parameters that serve as additional input to the model and are adapted on individual tasks, and shared parameters that are meta-trained and shared across tasks. At test time, only the context parameters are updated, leading to a low-dimensional task representation. We show empirically that CAVIA outperforms MAML for regression, classification, and reinforcement learning. Our experiments also highlight weaknesses in current benchmarks, in that the amount of adaptation needed in some cases is small.
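To make the mechanism concrete, below is a minimal PyTorch sketch of CAVIA-style training on a toy sinusoid-regression family (a common few-shot regression testbed). This is an illustration under assumed settings, not the authors' implementation: the context dimensionality, network sizes, learning rates, and the ContextModel / adapt_context / sample_sine_task names are all illustrative choices. The key points it demonstrates are that the inner loop updates only the context parameters (reset per task), while the outer loop updates only the shared parameters.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

CTX_DIM = 5   # size of the per-task context vector (illustrative choice)


class ContextModel(nn.Module):
    """Shared network that takes the context parameters as extra input."""

    def __init__(self, x_dim=1, hidden=40):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + CTX_DIM, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, ctx):
        ctx = ctx.expand(x.shape[0], -1)             # broadcast over the batch
        return self.net(torch.cat([x, ctx], dim=-1))


def sample_sine_task():
    """Toy sinusoid regression family: amplitude and phase vary per task."""
    amp = torch.rand(1) * 4.9 + 0.1
    phase = torch.rand(1) * math.pi

    def draw(n=10):
        x = torch.rand(n, 1) * 10.0 - 5.0
        return x, amp * torch.sin(x + phase)

    return draw


def adapt_context(model, x, y, inner_lr=1.0, steps=1):
    """Inner loop: gradient steps on the context parameters only."""
    ctx = torch.zeros(1, CTX_DIM, requires_grad=True)   # reset for each task
    for _ in range(steps):
        loss = F.mse_loss(model(x, ctx), y)
        (grad,) = torch.autograd.grad(loss, ctx, create_graph=True)
        ctx = ctx - inner_lr * grad
    return ctx


model = ContextModel()
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for iteration in range(1000):                  # outer loop: shared parameters
    meta_opt.zero_grad()
    meta_loss = 0.0
    for _ in range(4):                         # tasks per meta-batch
        draw = sample_sine_task()
        ctx = adapt_context(model, *draw())    # adapt context on support set
        x_q, y_q = draw()                      # evaluate on a query set
        meta_loss = meta_loss + F.mse_loss(model(x_q, ctx), y_q)
    meta_loss.backward()                       # gradients flow through the
    meta_opt.step()                            # inner step into shared params

At test time, one would call adapt_context on a new task's data while keeping the shared parameters fixed; the resulting context vector is the low-dimensional task representation the abstract refers to.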

Book Title
ICML 2019: Proceedings of the Thirty-Sixth International Conference on Machine Learning
Month
June
Year
2019