Frank Mtumbuka: Publications
-
[1]
Beyond Distributional Hypothesis: Let Language Models Learn Meaning-Text Correspondence
Myeongjun Jang, Frank Martin Mtumbuka and Thomas Lukasiewicz
In Findings of NAACL 2022, Seattle, Washington, USA, July 2022. Association for Computational Linguistics.
-
[2]
Systematic Comparison of Neural Architectures and Training Approaches for Open Information Extraction
Patrick Hohenecker, Frank Mtumbuka, Vid Kocijan and Thomas Lukasiewicz
In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, November 16–20, 2020. Association for Computational Linguistics.