Whamit!

The Weekly Newsletter of MIT Linguistics

BCS Special Language Seminar 10/12 - Vera Demberg

BCS SPECIAL LANGUAGE SEMINAR
TUESDAY, OCT. 12, 10:00 AM, 46-4062
Vera Demberg, University of Edinburgh
A Broad-Coverage Model of Prediction in Human Sentence Processing
Host: Ted Gibson

Recent psycholinguistic experiments have provided evidence for prediction in human language comprehension. However, none of the current sentence processing theories provides an explicit mechanism for modeling the prediction process. Furthermore, two previous theories of sentence processing, Dependency Locality Theory (DLT) and Surprisal, have been argued to capture different aspects of processing difficulty. In this talk, I propose a new theory of sentence processing which incorporates a mechanism for modeling the prediction and verification processes in human language understanding, and which integrates aspects of Surprisal and DLT integration cost into a unified framework.
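For orientation, the standard definitions of the two difficulty metrics from the literature are roughly as follows; this is a sketch of the textbook formulations, not necessarily the exact ones used in the talk. Surprisal measures how unexpected a word is given its preceding context:

\mathrm{Surprisal}(w_k) = -\log P(w_k \mid w_1, \ldots, w_{k-1})

DLT integration cost, in its usual formulation, counts the new discourse referents intervening between a word and the earlier element it is integrated with, so that long-distance attachments are predicted to be harder to process.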

The theory is implemented on the basis of a Psycholinguistically motivated Tree-Adjoining Grammar (PLTAG), a variant of TAG that allows for strictly incremental parsing. I will briefly discuss the design of PLTAG and the incremental parsing algorithm.
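As a rough illustration of why strict incrementality matters for this kind of theory (this is the standard prefix-probability route to Surprisal, assumed here for exposition; the talk's PLTAG-based metric additionally involves a verification component): an incremental parser that maintains the analyses of the sentence prefix can compute word-by-word difficulty as

\mathrm{Surprisal}(w_k) = -\log \frac{P(w_1 \cdots w_k)}{P(w_1 \cdots w_{k-1})}, \quad \text{where} \quad P(w_1 \cdots w_k) = \sum_{T \in \mathcal{T}(w_1 \cdots w_k)} P(T)

and \mathcal{T}(w_1 \cdots w_k) is the set of partial parses compatible with the words seen so far.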

I evaluate the validity of the sentence processing theory, in its PLTAG implementation, on a range of specific psycholinguistic phenomena and show that it captures aspects of processing difficulty which previous sentence processing theories could not capture simultaneously. A theory of human language processing should, however, not only work in experimentally designed environments, but should also have explanatory power for naturally occurring language. I therefore also evaluate the theory on eye-tracking records of newspaper text and show that it explains a significant amount of the variance in the eye-movement data, and that it does so better than either Surprisal or DLT integration cost.
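In evaluations of this kind, "explaining variance" is typically assessed by regressing per-word reading measures on a model's difficulty predictions alongside standard controls; the schematic form below is an assumed illustration of that setup, not the talk's actual statistical model:

\mathit{RT}_i = \beta_0 + \beta_1 \, \mathrm{Difficulty}_i + \beta_2 \, \log f(w_i) + \beta_3 \, \mathrm{len}(w_i) + \epsilon_i

where f(w_i) is the frequency of word w_i and len(w_i) its length.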