Knowledge Base Completion Meets Transfer Learning

Vid Kocijan and Thomas Lukasiewicz

Abstract

The aim of knowledge base completion is to predict unseen facts from existing facts in knowledge bases. In this work, we introduce the first approach for transfer of knowledge from one collection of facts to another without the need for entity or relation matching. The method works for both canonicalized knowledge bases and uncanonicalized or open knowledge bases, i.e., knowledge bases where more than one copy of a real-world entity or relation may exist. Such knowledge bases are a natural output of automated information extraction tools that extract structured data from unstructured text. Our main contribution is a method that can make use of large-scale pre-training on facts collected from unstructured text to improve predictions on structured data from a specific domain. The introduced method is most effective on small datasets such as ReVerb20K, where we obtained a 6% absolute increase in mean reciprocal rank and a 65% relative decrease in mean rank over the previously best method, despite not relying on large pre-trained models such as BERT.
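The abstract reports gains in mean reciprocal rank (MRR) and mean rank (MR). As a point of reference, the sketch below shows how these ranking metrics are typically computed in knowledge base completion evaluation; the ranking_metrics helper and the example ranks are illustrative placeholders, not taken from the paper.

# Illustrative sketch (not from the paper): for each test triple
# (head, relation, tail), a model scores every candidate entity, and the
# rank of the correct entity is recorded. Lower MR and higher MRR are better.

def ranking_metrics(ranks):
    """Compute mean rank (MR) and mean reciprocal rank (MRR)
    from the 1-based ranks of the correct entities."""
    n = len(ranks)
    mean_rank = sum(ranks) / n
    mrr = sum(1.0 / r for r in ranks) / n
    return mean_rank, mrr

# Hypothetical ranks of the correct entity for four test queries.
ranks = [1, 3, 2, 10]
mr, mrr = ranking_metrics(ranks)
print(f"MR = {mr:.2f}, MRR = {mrr:.3f}")  # MR = 4.00, MRR = 0.483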

Book Title
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Online and in the Barceló Bávaro Convention Centre, Punta Cana, Dominican Republic, November 7–11, 2021
Month
November
Year
2021