Whamit!

The Weekly Newsletter of MIT Linguistics

Issue of Monday, February 15th, 2021

Welcome to Spring 2021!

Welcome to the first edition of Whamit! for Spring 2021! After our winter hiatus, Whamit! is back to regular weekly editions during the semester.

Whamit! is the MIT Linguistics newsletter, published every Monday (Tuesday if Monday is a holiday). The editorial staff consists of Adam Albright, Kai von Fintel, David Pesetsky, Cater Fulang Chen, Eunsun Jou, and Margaret Wang.

To submit items for inclusion in Whamit!, please send an email to whamit@mit.edu by Sunday 6pm.

Pesetsky, Keyser, and alums give lectures at the Virtual New York Institute

Our faculty members and alums taught classes at the virtual version of the New York Institute (V-NYI), co-sponsored by Stony Brook University (USA) and the Herzen Pedagogical Institute (St. Petersburg, Russia). Below is the list of courses they taught (click on each course title to access its course introduction).

More information about the V-NYI, including its program, can be found at https://nyi.spb.ru/ .

Miyagawa’s new Linguistic Inquiry monograph in press

Miyagawa’s Syntax in the Treetops has been accepted as a Linguistic Inquiry Monograph by MIT Press and is now in press.

Keyser gives MITAC lecture on The Mental Life of Modernism

Jay Keyser gave a lecture hosted by the MIT Activities Committee on his recent book, The Mental Life of Modernism (2020, MIT Press). A recording of this lecture can be found at this URL: https://youtu.be/bXheskQqr44 

MIT @ LSA 2021

MIT Linguistics was well represented at the 2021 Virtual Annual Meeting of the Linguistic Society of America. Many of our current students, faculty, and visitors gave presentations, some in collaboration with alums.

  • Danfeng Wu (5th year): Evasion strategies save apparent island violations in stripping
  • Tanya Bondarenko (4th year), Colin Davis (PhD 2020): What cross-clausal scrambling in Balkar reveals about phase edges
  • Daniel Asherov (4th year), Danny Fox (faculty), Roni Katzir (PhD 2008): On the Irrelevance of contextually given states for the computation of Scalar Implicatures
  • Rafael Abramovitz (6th year): Deconstructing Inverse Case Attraction
  • Tanya Bondarenko (4th year): Inverse in Passamaquoddy as the spell-out of Feature Gluttony
  • Neil Banerjee (5th year): Two ways to form a portmanteau: Evidence from ellipsis
  • Daniel Goodhue (University of Maryland), Jad Wehbe (1st year), Valentine Hacquard (PhD 2006; University of Maryland), Jeffrey Lidz (University of Maryland): That’s a question? Preschoolers’ comprehension of illocutionary force, clause type and intonation
  • Suzanne Flynn (faculty), Barbara Lust (visiting professor; Cornell University), Janet Sherman (Harvard University), Charles R. Henderson (Cornell University): Binding and Coreference Dissociate in Mild Cognitive Impairment 
  • Colin Davis (PhD 2020), Patrick Elliott (postdoctoral associate): Radical successive cyclicity and the freedom of parasitic gaps

 

Our alums also presented at the conference:

  • Colin Davis (PhD 2020; University of Southern California): On parasitic gaps in relative clauses and extraction from NP
  • Kenyon Branan (PhD 2018; National University of Singapore), Michael Yoshitaka Erlewine (PhD 2014; National University of Singapore): Binding reconstruction and the types of traces
  • Sam Zukoff (PhD 2017; University of Leipzig): Deriving Arabic Verbal “Templates” without Templates
  • Joey Lim (National University of Singapore), Michael Yoshitaka Erlewine (PhD 2014; National University of Singapore): Word order and disambiguation in Pangasinan
  • Elsi Kaiser (University of Southern California), Patrick Georg Grosz (PhD 2011; University of Oslo): Anaphoricity in emoji: An experimental investigation of face and non-face emoji
  • Jeong Hwa Cho (University of Michigan), Ezra Keshet (PhD 2008; University of Michigan): Actuality and Counterfactual Implicatures in Korean Possibility and Necessity Modals
  • Tzu-Hsuan Yang (University of Kansas), Yueh-chin Chang (National Tsing Hua University), Feng-fan Hsieh (PhD 2007; National Tsing Hua University): Perceptually inconspicuous yet articulatorily distinct merger: A case study of Taiwanese Mandarin coda nasals
  • Ken Hiraiwa (PhD 2005; Meiji Gakuin University): Sluicing cannot Apply In-Situ in Japanese
  • Suyeon Im (Hanyang University), Jennifer Cole (PhD 1987; Northwestern University): Discourse meaning in perception and production of prosodic prominence
  • Deepak Alok (Rutgers University), Mark Baker (PhD 1985; Rutgers University): Person and Honorification: Features and Interactions in Magahi (talk presented during the symposium on the features of allocutivity, honorifics and social relation)

 

Course announcements: Spring 2021

Course announcements in this post:

  • 24.981: Topics in computational phonology
  • 24.964: Topics in Phonology: Generative Phonetics
  • 24.956: Topics in syntax and semantics
  • 24.960: Syntactic Models

 

24.981: Topics in computational phonology

This class does not presuppose any background in modeling or programming, but it does presuppose a basic knowledge of phonological theory (i.e., from 24.961 or 24.901). 

  • Description:

Computational modeling can usefully inform many aspects of phonological theory. Implementing a theory provides a more rigorous test of its applicability to different data sets, and requires a greater degree of formal precision than is found in purely expository presentations. By training learning models on realistic training samples, we can test whether a posited analysis can actually be discovered from representative data, and we can observe what proportion of the data is actually accounted for by that analysis. Modeling also provides a direct means of testing whether a proposed formal device facilitates the discovery of generalizations, or whether it hampers learning by greatly increasing the size of the search space. In the most interesting cases, computational modeling uncovers facts about the language that would have been difficult to discover by eye, and forces us to ask which facts are treated as linguistically significant by speakers.

Topics will include (subject to revision):

    • Statistical “baseline” models (n-gram models, exemplar models)
    • Algorithms for constraint ranking and weighting
    • Algorithms for constraint discovery
    • Integrating learned and innate constraints
    • Learning in the midst of variation and exceptions, and discovery of gradient patterns
  • Requirements:

Readings and regular small problem sets, plus a small final project and presentation.
*** This class can be used to satisfy the graduate acquisition requirement, with the appropriate choice of readings, exercises, and project. Please let the instructor know if you are planning on doing this.

 

24.964: Topics in Phonology: Generative Phonetics

It is well-established that languages differ systematically in matters of fine phonetic detail such as patterns of coarticulation and contextual variation in the durations of segments, so grammars must regulate these details. However, relatively little is known about the component of grammar responsible for phonetic realization. In this course we will investigate the nature of phonetic grammars, focusing on constraint-based approaches. We will cover both theoretical issues surrounding generative phonetics and the practical skills required to develop constraint-based analyses of phonetic data.

 

24.956: Topics in syntax and semantics

  • Instructor(s): Patrick Elliott, Kai von Fintel, Danny Fox, Sabine Iatridou, David Pesetsky
  • Time: Mondays and Thursdays 3-5 pm
  • Course site: https://canvas.mit.edu/courses/7282
  • Description:

Despite the assumed theoretical primacy of declarative sentences, questions have frequently played a central role in the literature spanning syntax, semantics, and pragmatics, informing issues ranging from structure-building and combinatorics to speech acts and their effect on the common ground. In this vein, we’ll be asking: what kinds of things are questions, how are they built, and what can they do? Specific topics we’ll cover include: question composition and pied-piping, embedded questions and question-embedding predicates, the dynamic pragmatics of questions qua illocutionary acts, and the external syntax of interrogative clauses.

You can find a preliminary syllabus on the canvas site.

  • Course requirements:

Active participation, weekly reading, weekly submission of questions and comments about the reading, final term paper on a relevant topic.

*** Students can receive credit for the advanced seminar requirement in either syntax or semantics, depending on their chosen topic for the final term paper. ***

 

24.960: Syntactic Models

The course has twin goals:

First, it gives a quick introduction to at least two “frameworks” for syntactic research that compete with the Government-Binding/Principles & Parameters/Minimalist tradition in the current syntax world:  HPSG and Lexical-Functional Grammar (LFG).  We work speedily through much of the HPSG textbook by Sag, Wasow and Bender, and also look at the LFG textbook by Bresnan, Asudeh, Toivonen and Wechsler.

Next, the class turns historical, tracing the development of generative syntax from Syntactic Structures (1957) up to the early 1980s, when HPSG and LFG first separated themselves off from the research program that became GB/P&P/Minimalism.   An overarching theme of the course is the issue of derivational vs. representational views of syntax — a theme that offers some surprising observations about who said what at various points in the history of the field, but also gives the course a focus relevant to the most current work.  

You can get a good sense of what the class will be like from its old Stellar pages — for example http://stellar.mit.edu/S/course/24/sp19/24.960.  I plan to follow essentially the same structure — but perhaps with a twist or two for the Zoom era. 

New for 2021:  We may have a guest lecture or two. One or another of these might be at a non-canonical time, and might involve one or two commitments on top of the regularly scheduled classes. These will obviously be optional if they happen at non-standard times, but recommended. Just letting you know in advance.

  • Course requirements:

As you may have heard, the sole requirements for the class are:

    1. regular attendance and participation;
    2. a few straightforward problem sets (finger exercises) in the first half of the class; and
    3. three class presentations or co-presentations (depending on numbers): of an HPSG paper, an LFG paper, and a paper from the period of generative semantics/interpretive semantics debates.  In some years, the HPSG and LFG presentations have been done together.  That will depend on what the calendar looks like when we get to that point in the semester.

There is no paper required! (A major attraction in the past.)  If you want to write a paper, in order to satisfy a program requirement, you can talk with the instructor to arrange that. Many students have reported finding this class both fun and enlightening (and not just because there is no required paper). Ask some of your predecessors for their reviews.

  • Reading material:

The most important book to order right now is the following one:

Sag, Wasow and Bender, Syntactic Theory — second edition (this is crucial).  Here are some links so you can buy it now:

https://amzn.to/2RyMEcJ

http://press.uchicago.edu/ucp/books/book/distributed/S/bo3633025.html

Please start reading it in advance of the first class.  Get as far as you can in it, so you come to the first class already somewhat prepared. This book is intended as an introduction to syntax for undergraduates, so you will find the early chapters go quickly.  But the syntax it introduces is HPSG, so fairly soon you will be learning new things and tripping over unfamiliar notations.

The books we will be using later in the semester are:

    • Bresnan et al., Lexical-Functional Grammar — please note that this too is a second edition.
    • Chomsky, Syntactic Structures

Other readings (papers and excerpts from books) will be downloadable from Canvas.