
Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning

Abstract: Recent years have seen impressive performance of transformer-based models on a variety of natural language processing tasks. However, it remains unclear to what degree these models can reason about natural language. To shed light on this question, this survey paper discusses the performance of transformers on different reasoning tasks, including mathematical reasoning, commonsense reasoning, and logical reasoning. We point out successes and limitations, both empirical and theoretical.
Document type: Conference papers
Contributor: Fabian Suchanek
Submitted on: Wednesday, September 15, 2021 - 10:52:22 AM
Last modification on: Thursday, January 27, 2022 - 3:45:37 AM
Long-term archiving on: Thursday, December 16, 2021 - 6:34:13 PM




HAL Id: hal-03344668, version 1



Chadi Helwe, Chloé Clavel, Fabian Suchanek. Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning. International Conference on Automated Knowledge Base Construction (AKBC), 2021, online, United States. ⟨hal-03344668⟩


