Few-Shot Out-of-Domain Transfer Learning of Natural Language Explanations in a Label-Abundant Setup

Yordan Yordanov, Vid Kocijan, Thomas Lukasiewicz, and Oana-Maria Camburu

Abstract

Training a model to provide natural language explanations (NLEs) for its predictions usually requires the acquisition of task-specific NLEs, which is time- and resource-consuming. A potential solution is the few-shot out-of-domain transfer of NLEs from a parent task with many NLEs to a child task. In this work, we examine the setup in which the child task has few NLEs but abundant labels. We identify four few-shot transfer learning methods that cover the possible fine-tuning combinations of the labels and NLEs for the parent and child tasks. We transfer explainability from a large natural language inference dataset (e-SNLI) separately to two child tasks: (1) hard cases of pronoun resolution, where we introduce the small-e-WinoGrande dataset of NLEs on top of the WinoGrande dataset, and (2) commonsense validation (ComVE). Our results demonstrate that the parent task helps with NLE generation, and we establish the best methods for this setup.
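The sketch below illustrates the general recipe the abstract describes, not the authors' actual code: fine-tune a text-to-text model on a parent task with abundant NLEs (e-SNLI), then continue fine-tuning on a child task that has many labels but only a few NLEs. The model choice (T5), the input/output prompt formats, and the specific examples are illustrative assumptions; the paper's four methods differ in which combinations of labels and NLEs are used at each stage.

```python
# A minimal sketch of few-shot out-of-domain NLE transfer, assuming a
# T5-style text-to-text model. All prompt formats here are hypothetical,
# not the paper's exact formats.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

def fine_tune(examples, epochs=1):
    """One pass of supervised fine-tuning on (input text, target text) pairs."""
    model.train()
    for _ in range(epochs):
        for source, target in examples:
            inputs = tokenizer(source, return_tensors="pt", truncation=True)
            labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids
            loss = model(**inputs, labels=labels).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

# Stage 1: parent task (e-SNLI), where NLEs are abundant:
# the model learns to emit both a label and an explanation.
parent = [
    ("explain nli premise: A man plays a guitar. hypothesis: A man makes music.",
     "label: entailment explanation: Playing a guitar is a way of making music."),
]
fine_tune(parent)

# Stage 2: child task (e.g., ComVE), with abundant labels but few NLEs.
child_labels_only = [
    ("explain comve choice1: He put a horse in his car. choice2: He put a bag in his car.",
     "label: choice1"),
]
child_few_nles = [
    ("explain comve choice1: He put a horse in his car. choice2: He put a bag in his car.",
     "label: choice1 explanation: A horse is far too large to fit in a car."),
]
fine_tune(child_labels_only)
fine_tune(child_few_nles)
```

In this toy version, the label-only child examples exploit the abundant labels, while the handful of NLE-annotated examples adapt the explanation-generation ability transferred from the parent task.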

Book Title: Findings of EMNLP 2022
Month: December
Pages: 3486–3501
Publisher: Association for Computational Linguistics
Year: 2022