
Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning

Abstract: Recent years have seen impressive performance of transformer-based models on different natural language processing tasks. However, it is not clear to what degree transformers can reason on natural language. To shed light on this question, this survey paper discusses the performance of transformers on different reasoning tasks, including mathematical reasoning, commonsense reasoning, and logical reasoning. We point out successes and limitations, both empirical and theoretical in nature.
Document type: Conference papers

https://hal-imt.archives-ouvertes.fr/hal-03344668
Contributor: Fabian Suchanek
Submitted on: Wednesday, September 15, 2021 - 10:52:22 AM
Last modified on: Tuesday, September 21, 2021 - 3:37:20 AM

File

akbc-2021-reasoning.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03344668, version 1

Citation

Chadi Helwe, Chloé Clavel, Fabian Suchanek. Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning. International Conference on Automated Knowledge Base Construction (AKBC), 2021, online, United States. ⟨hal-03344668⟩
