What Is Understood? Simulating Human-Scale Word Comprehension Using AI

Project Overview

Project

What Is Understood? Simulating Human-Scale Word Comprehension Using AI

Project members:

Dr Sacha Beniamine
Prof Erich Round

Period of award

April 2023 - March 2026

Funder:

Leverhulme Trust (Early Career Fellowship)

Human languages can express an infinity of meanings thanks to their "double articulation": meaningless sounds combine into meaningful words, which in turn combine into sentences. Moreover, we constantly produce and understand sentences and words that we have never heard before. A central question for linguistic theory is how this unique ability, and the knowledge supporting it, are organized in the brain. Over the past decades, linguists have developed theories of how speakers accomplish this cognitively demanding feat, yet the equally impressive role of listeners, who must process and comprehend this creativity in real time, has received comparatively little attention. Unfortunately, we cannot directly observe our minds in action. Instead, linguists build theories from language data, which can then be tested in experimental settings. In this Leverhulme Early Career Fellowship, I aim to develop a theory of how our comprehension of words might be represented in the mind by modelling this task with neural networks.
