The Weekly Newsletter of MIT Linguistics

Issue of Monday, February 4th, 2019

The Spring 2019 edition of Whamit!

Welcome to the first edition of Whamit! for Spring 2019! After our winter hiatus, Whamit! is back to regular weekly editions during the semester.

Whamit! is the MIT Linguistics newsletter, published every Monday (Tuesday if Monday is a holiday). The editorial staff consists of Adam Albright, Kai von Fintel, David Pesetsky, Keny Chatain, Tracy Kelley, Elise Newman, and HyunJi Yoo.

To submit items for inclusion in Whamit! please send an email to whamit@mit.edu by Sunday 6pm.

Best wishes for the new year!

New Visiting Scholars and Visiting Students for Spring 2019

Visiting Scholar

Chen Zhao (Huazhong University of Science and Technology)

I’m currently teaching French and introductory linguistics courses to undergraduates majoring in French at Huazhong University of Science and Technology (China). My research is grounded in formal syntax, with a special interest in labeling theory, linearization, object shift, the syntax of Chinese, and typological studies of different languages within the generative framework. My favorite part of research is being able to shed new light on a linguistic phenomenon that has already been thoroughly studied. I am currently involved in an investigation of symmetry in syntax and the different strategies that grammars resort to in order to label symmetric structures.

Visiting Student

Gregor Williamson (University College London)

I’m from London, UK. I did an undergraduate degree in English Language teaching and an MA in theoretical linguistics at UCL, where I am currently doing a PhD. The majority of my research is concerned with the syntax-semantics of the clausal spine (VP, TP, CP). I am particularly interested in various types of clausal embedding (non-finite clauses, adverbial clauses, attitude reports). My interests outside of linguistics include playing the piano and climbing.

Phonology Circle 2/6: Geoffrey Schwarz (Adam Mickiewicz University)

Speaker: Geoffrey Schwarz (Adam Mickiewicz University)

Title: There is no such thing as [voice] - evidence from Polish and Polish-accented English

Date/Time: Wednesday (2/6), 5:00pm-6:30pm

Location: 32-D831

Abstract: available here


Course Announcements: Spring 2019

  • 24.942: Topics in the Grammar of a Less Familiar Language
  • 24.947: Language Disorders in Children
  • 24.954: Pragmatics in Linguistic Theory
  • 24.960: Syntactic Models
  • 24.964: Topics in Phonology
  • 24.966J: Laboratory on the Physiology, Acoustics, and Perception of Speech
  • 24.979: Topics in Semantics
  • 24.981: Topics in Computational Phonology

24.942 Topics in the Grammar of a Less Familiar Language
Kenstowicz, Richards
Students will work with a native speaker of Javanese, examining aspects of its syntax, semantics, and phonology. In the course of doing this, students will acquire techniques for gathering linguistic data from native speakers.

24.947 Language Disorders in Children
Reading and discussion of current linguistic theory, first language acquisition and language disorders in young children. Focus on development of a principled understanding of language disorders at the phonological, morphological and syntactic levels. Examines ways in which these disorders confront theories of language and acquisition.

24.954 Pragmatics in Linguistic Theory
Fox, Levy
Formal theories of context-dependency, presupposition, implicature, context-change, focus and topic. Special emphasis on the division of labor between semantics and pragmatics. Applications to the analysis of quantification, definiteness, presupposition projection, conditionals and modality, anaphora, questions and answers.

24.960 Syntactic Models
Comparison of different proposed architectures for the syntax module of grammar. Subject traces several themes across a wide variety of approaches, with emphasis on testable differences among models. Models discussed include ancient and medieval proposals, structuralism, early generative grammar, generative semantics, government-binding theory/minimalism, LFG, HPSG, TAG, functionalist perspectives and others.

24.964 Topics in Phonology
The goal of this class is to understand some of the results of linguistic research using the Artificial Grammar paradigm (Reber 1967, 1993). The AG studies we consider compare learning outcomes between subject groups exposed to artificial languages that are designed to differ just in whether they present ‘natural’ or ‘unnatural’ patterns. The definition of ‘naturalness’ varies, and understanding what should be meant by this term will be one focus of discussion. Naturalness is commonly defined in terms of attestation – with natural taken to mean well-attested – but this needs rethinking.
Most of the large body of phonological AG work now available dates from the last 15 years. The hypothesis commonly tested is that the naturalness of the target pattern is a determinant of learning success. When this hypothesis is supported, subjects succeed in learning the natural pattern and fail with the unnatural one. In many studies, however, null results are obtained: subjects acquire patterns identified as natural and unnatural to the same extent. We will begin the course by exploring one interpretation of these mixed results, due to Moreton and Pater 2012a,b, and Moreton 2008: the successfully learned natural patterns are not necessarily natural in a typological or phonetic sense, but they are simpler than the patterns that the subjects fail to learn. This idea comes with its own learning algorithm and with a distinct view of how phonological typology relates to competence, or doesn’t. It is a useful way to explore large parts of the AG literature, but we will see that it has limitations. A different idea to consider is that studies that have successfully shown that natural patterns are easier to learn than unnatural ones have focused on faithfulness, or can be reinterpreted in those terms. The null results are found in the markedness domain. If this interpretation continues to look promising, we’ll look for an explanation.
There are other potentially interesting results in the phonological AG work, and we may expand the reading list and consider syntax if participants express an interest. I can’t promise answers to any of the questions raised above, but the best of this body of AG literature tackles two essential questions in the field: Do aspects of a speaker’s grammatical knowledge reflect linguistic typology? What are the sources of this knowledge?

24.966J Laboratory on the Physiology, Acoustics, and Perception of Speech
Braida, Shattuck-Hufnagel, Choi
Experimental investigations of speech processes. Topics include computer-aided waveform analysis and spectral analysis of speech; synthesis of speech; perception and discrimination of speech-like sounds; speech prosody; models of speech recognition; speech development; analysis of atypical speech; and others. Recommended prerequisite: 6.002, 18.03, or 24.900.

24.979 Topics in Semantics
Fox, Hackl, Schwarzschild
The seminar will focus on topics in degree semantics. Issues to be addressed include:

  • how degree operators interact with individual and modal quantifiers 
  • interpretation of degree modifiers of noun phrases (e.g. at least)
  • analysis of relative clauses modifying nominalized degree predicates (height, weight)
  • whether degrees are primitive or derived 
  • analysis of degree constructions using vectors or directed segments
  • external and internal syntax of comparative clauses

Our starting point will be Schwarzschild 2019, which we will later try to use as the basis for the discussion of older literature. Some of the sessions will be led by registered students in collaboration with the instructors.

24.981 Topics in Computational Phonology
Computational modeling can usefully inform many aspects of phonological theory. Implementing a theory provides a more rigorous test of its applicability to different data sets, and requires a greater degree of formal precision than is found in purely expository presentations. By training learning models on realistic training samples, we can test whether a posited analysis can actually be discovered from representative data, and we can observe what proportion of the data is actually accounted for by that analysis. Modeling also provides a direct means of testing whether a proposed formal device facilitates the discovery of generalizations, or whether it hampers learning by greatly increasing the size of the search space. In the most interesting cases, computational modeling uncovers facts about the language that would have been difficult to discover by eye, and forces us to ask which facts are treated as linguistically significant by speakers.
Topics will include (subject to revision):
- Statistical “baseline” models (n-gram models, exemplar models)
- Algorithms for constraint ranking and weighting
- Algorithms for constraint discovery
- Integrating learned and innate constraints
- Learning in the midst of variation and exceptions, and discovery of gradient patterns
Requirements: readings, small regular problem sets, and a final project with presentation.
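To make the idea of a statistical “baseline” model concrete, here is a minimal sketch of a bigram phonotactic model of the kind the course description mentions. The toy lexicon and segment inventory are invented for illustration; a real study would train on a realistic lexicon and use smoothing rather than assigning zero probability to unseen transitions.

```python
# Minimal sketch of an n-gram phonotactic baseline: a bigram model over
# segments, trained on a toy lexicon (invented here for illustration).
from collections import Counter

def train_bigram(words):
    """Count bigram and preceding-unigram frequencies, with # as a word boundary."""
    bigrams, unigrams = Counter(), Counter()
    for w in words:
        segs = ["#"] + list(w) + ["#"]
        unigrams.update(segs[:-1])          # contexts (everything but final #)
        bigrams.update(zip(segs, segs[1:]))  # adjacent segment pairs
    return bigrams, unigrams

def score(word, bigrams, unigrams):
    """Product of conditional bigram probabilities; 0 for unseen transitions
    (an unsmoothed model -- real work would smooth these counts)."""
    segs = ["#"] + list(word) + ["#"]
    p = 1.0
    for a, b in zip(segs, segs[1:]):
        if unigrams[a] == 0:
            return 0.0
        p *= bigrams[(a, b)] / unigrams[a]
    return p

lexicon = ["pat", "tap", "pit", "tip"]  # toy training data
bg, ug = train_bigram(lexicon)
# A form built from attested transitions outscores one with unattested clusters:
print(score("pat", bg, ug) > score("tpa", bg, ug))  # prints True
```

Even this trivial model illustrates the course’s point: once implemented, an analysis can be run over data and its coverage measured, rather than argued for purely by exposition.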

MIT Colloquium (2/8) - Martina Wiltschko (UBC)

Speaker: Martina Wiltschko

Title: How to do things with nominals. Towards a syntax of nominal speech acts
Time: Friday, February 8th, 4pm-5:30pm

Room: 32-155

Abstract: Going back to Aristotle, classic grammatical description as well as current theories of grammar and the construction of meaning take the sentence to be the object of investigation. In his seminal work, Austin 1962 took a first step towards breaking with this tradition within philosophy of language. He argued that our understanding of meaning has to be informed by the fact that when we say things, we also do things. Different types of sentences give rise to different speech acts such as asserting, questioning, requesting, promising, etc., and over the past sixty years, evidence has accumulated suggesting that speech act meaning is part of sentences, and hence part of grammatical structure. However, when we do things with words, we do not always use sentences. For example, a content question may be answered with a nominal phrase only (e.g., Who wrote that essay? Penelope). And by uttering a nominal we are also doing something. Specifically, we are attempting to identify a referent in a manner that will enable our interlocutors to recognize them.

This leads us to postulate a novel hypothesis, namely that noun phrases (like sentences) may be full-fledged speech acts. We refer to this as the nominal speech act hypothesis. Specifically, we postulate that, just like clauses, nominals are dominated by a layer of structure which encodes pragmatic information contributing to their use-conditions. They differ in that the speech act layer in the clausal domain establishes a direct link between the speech-act participants and the proposition denoted by the clause, while the speech act layer in the nominal domain establishes a direct link between the speech-act participants and the referent denoted by the nominal.

In this talk I provide empirical evidence for a dedicated nominal speech act layer. Specifically, the evidence I discuss comes from i) cross-linguistic variation in pronominal paradigms; ii) properties of impersonal pronouns; iii) properties of formality distinctions in nominal expressions; iv) distinctions between spatial and discourse functions of demonstratives.

MIT @ LSA 2019

The Linguistic Society of America’s Annual Meeting for 2019 was held in New York in January. As per usual, MIT was well represented. The following department members presented talks and posters:

Alumni who presented or organized symposia include: Ezra Keshet, Michael Yoshitaka Erlewine, Jon Nissenbaum, Aron Hirsch, Michelle Yuan, Hadas Kotek, Coppe van Urk.