Whamit!

The Weekly Newsletter of MIT Linguistics

Course announcements: Spring 2023

Course announcements in this post:

  • Topics in Syntax (24.956)
  • Topics in Semantics (24.979)
  • Topics in Computational Phonology (24.981)
  • Topics in Computational Linguistics (24.982)
  • Special topics: Linguistics in K-12 education (24.S95)

24.956: Topics in Syntax

Subject is one of the most fundamental and most frequently appealed-to notions in the discussion of argument asymmetries cross-linguistically. Subjects are taken to display a cluster of properties which, in tree-geometric terms, are associated with being the structurally highest argument in the clause. Properties typically associated with subjects include: (i) unmarked (nominative) case; (ii) the ability to control verbal agreement; (iii) the ability to bind anaphors; (iv) the ability to be PRO and to participate in raising; (v) agentivity and thematic prominence; (vi) topicality; (vii) accessibility for wh-movement. In modern Minimalism these properties are distributed across several positions in the clause, but they tend to converge on a single nominal due to standard constraints on locality and movement.

In this seminar we will explore phenomena that challenge a universally homogeneous notion of subjecthood, focusing on cases where the subject displays only a subset of the typical subjecthood properties, or where subjecthood properties are distributed across more than one argument in the clause. We will discuss both the empirical landscape of research on subjecthood and the implications of that research for syntactic theory and for our understanding of locality, intervention, licensing, case, agreement, thematic and structural prominence, etc.


24.979: Topics in Semantics 

This semester, we will explore the philosophy of natural language semantics, or meta-semantics. The overarching question will be: what has to be the case for a prominent branch of formal semantics, often referred to as Heim-and-Kratzer semantics, and the various specific proposals made within it over the years, to make any sense? The course will be split into two parts. In part one, our attention will be focused on what makes formal semantics formal: the emphasis on entailment and contradiction. These concepts seem to play important and diverse roles in semantic theorizing and, by extension, linguistic theory. What does this tell us about language and, by extension, how the mind works? In part two, we will discuss a series of issues, some of which may already be raised in part one. Possible topics include: internalism vs. externalism; the prospects for referential semantics; the idea of natural language metaphysics/ontology; the position of semantics vis-à-vis cognitive science and/or philosophy; questions of expressive power and type economy; issues of modularity; the connection between language and thought; and critiques of mainstream formal semantics from authors like Chomsky, Jackendoff, and Pietroski.

We concur with the following from Bob Stalnaker’s seminar description: “The schedule will be flexible and open ended, following the discussion where it leads, and spending as much time on each topic as it seems to need. That is, we will make it up as we go along.”

As usual, to receive credit, we expect active participation in seminar meetings, weekly emailed questions and comments, and a final paper.


24.981: Topics in Computational Phonology

This year’s installment of 24.981 (Topics in Computational Phonology) will focus on categorical phonology and will address the perennial issue of OT’s strict domination versus HG’s constraint weighting, and the additive (or cumulative, or ‘gang’) effects that the former precludes but the latter allows. The tentative plan includes the following issues.

[1] Defining additive effects.
I will put forward a new, purely extensional definition of ‘additive’ effects. We will then discuss it through a couple of concrete examples that have figured prominently in the literature on additive effects. If the definition makes sense, it affords a way to talk about additive effects (and thus to compare OT and HG) that makes no reference to constraints, weights, or rankings.
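
For concreteness before the definition is on the table, here is a minimal sketch of the kind of additive effect at issue (illustrative only, not course material; the weights and violation counts are invented): two constraints that are each individually weaker than a third can nonetheless ‘gang up’ on it in HG, while OT’s strict domination precludes exactly this.

  def harmony(violations, weights):
      """HG harmony: the negative weighted sum of constraint violations."""
      return -sum(w * v for w, v in zip(weights, violations))

  weights = [3.0, 2.0, 2.0]   # C1 outweighs C2 and C3 individually...
  cand_a = [1, 0, 0]          # violates only the heavy constraint C1
  cand_b = [0, 1, 1]          # violates both lighter constraints C2 and C3

  # ...but C2 and C3 gang up (2 + 2 = 4 > 3), so HG prefers cand_a,
  # whereas OT with the ranking C1 >> C2 >> C3 would prefer cand_b.
  assert harmony(cand_a, weights) > harmony(cand_b, weights)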

[2] OT and HG as two sides of the same coin.
I will introduce the OT and HG implementations of constraint-based phonology from scratch, trying to formally deduce both of them from the same axiom on additive effects in one fell swoop. This suggests that, when OT and HG are construed within the huge space of all logically possible implementations of constraint-based phonology, they are quite similar in terms of additive effects: neither of them yields many.

[3] Additive effects in OT and HG.
We will then focus on additive effects in HG and OT, trying to understand which additive effects are indeed within HG’s reach and which instead require something like ‘constraint conjunction’ in both OT and HG.

[4] OT as ‘margin-free’ HG.
We will then switch to learnability, focusing on ‘online’ or ‘error-driven’ learners (that is, learners that discard each piece of data after having encountered it, rather than storing it). I will review the classical EDCD/GLA theory of online learning in OT. We will then discuss the ideas of ‘margin’ and ‘kernel trick’ in the context of HG, and I will try to conclude that HG, unlike OT, comes with no good online learners.
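
To give a feel for what ‘error-driven’ means here, the following is a minimal sketch of a perceptron-style online HG update (illustrative only, not course material; the constraint set, violation counts, and learning rate are invented): on an error, weight shifts toward the constraints that favor the intended winner and away from those that favor the learner’s erroneous output.

  def harmony(violations, weights):
      """HG harmony: the negative weighted sum of constraint violations."""
      return -sum(w * v for w, v in zip(weights, violations))

  def hg_update(weights, winner_viols, loser_viols, rate=0.1):
      """One online update from a single (winner, loser) pair: promote
      constraints the loser violates more than the winner, demote
      constraints the winner violates more than the loser."""
      return [w + rate * (lv - wv)
              for w, wv, lv in zip(weights, winner_viols, loser_viols)]

  weights = [1.0, 1.0]   # two constraints with equal initial weights
  winner = [0, 2]        # violation counts of the intended winner
  loser = [1, 0]         # violation counts of the learner's erroneous output

  # Update only on an error, i.e. when the current weights fail to rank
  # the intended winner strictly above the loser; the data point is then
  # discarded, as befits an online learner.
  if harmony(loser, weights) >= harmony(winner, weights):
      weights = hg_update(weights, winner, loser)
  print(weights)   # -> [1.1, 0.8]: constraint 1 promoted, constraint 2 demoted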

[5] Strict domination and exponential update rules.
OT’s strict domination can be mimicked by HG weighting, as long as the weights decay exponentially fast (relative to the size of the constraints). Building on this observation, we will investigate whether learning algorithms with an exponential update rule can be rebooted as OT learning algorithms. We will focus on two cases: ‘AdaBoost’, a batch iterative algorithm; and Winnow, an online algorithm.
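
The observation can be checked on a toy example (illustrative only; the numbers are invented, and V_MAX is an assumed bound on violation counts): if each weight exceeds V_MAX times the sum of all lower weights, no gang of lower-weighted constraints can overturn a higher-weighted one, so the weighted comparison reproduces strict domination.

  def harmony(violations, weights):
      """HG harmony: the negative weighted sum of constraint violations."""
      return -sum(w * v for w, v in zip(weights, violations))

  def ot_winner(a, b):
      """OT comparison: the highest-ranked constraint on which the two
      candidates differ decides, and fewer violations wins."""
      for va, vb in zip(a, b):
          if va != vb:
              return 0 if va < vb else 1
      return 0  # no constraint distinguishes the candidates

  V_MAX = 5   # assumed bound on violation counts
  n = 3       # three constraints, ranked C1 >> C2 >> C3
  weights = [(V_MAX + 1) ** (n - 1 - k) for k in range(n)]   # 36, 6, 1

  cand_a = [1, 0, 0]   # a single violation of top-ranked C1
  cand_b = [0, 5, 5]   # maximal violations of lower-ranked C2 and C3

  # Each weight exceeds V_MAX times the sum of all lower weights
  # (36 > 5 * (6 + 1)), so HG and OT agree on every comparison.
  hg = 0 if harmony(cand_a, weights) > harmony(cand_b, weights) else 1
  assert hg == ot_winner(cand_a, cand_b) == 1   # cand_b wins under both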

Quoting from Amir and Kai’s quote of Bob’s seminar description: “The schedule will be flexible and open ended, following the discussion where it leads, and spending as much time on each topic as it seems to need. That is, we will make it up as we go along.” As for requirements, I propose the following three: (1) a squib to turn in at the end of the course; (2) four simple problem sets (one every two weeks, during the first two months of the course); (3) taking turns at transcribing classes (that will be taught out of a lean handout, mostly at the blackboard). This course can be used to satisfy the program’s acquisition requirement with a suitable choice of the topic for the final project. Please consult me at the beginning of the semester if you are planning to do so.


24.982: Topics in Computational Linguistics

We will be exploring the relationship between computational models and linguistic theory, with a particular focus on neural models of language (e.g., GPT-3). The main theme of the course will be how neural models should relate to a theory of language. As a means of orienting our initial discussions, we will focus on three papers:

  1. Wilcox, Futrell, and Levy. (2022). Using Computational Models to Test Syntactic Learnability. Linguistic Inquiry.
  2. Baroni. (2022). On the proper role of linguistically-oriented deep net analysis in linguistic theorizing. In Algebraic Structures in Natural Language.
  3. Steinert-Threlkeld and Szymanik. (2019). Learnability and semantic universals. Semantics and Pragmatics.

Taking these papers in turn, we will touch on a variety of topics, including the relationship (or lack thereof) between grammaticality and probability, acceptability judgments, learnability, and the poverty of the stimulus. Background material will be supplemented as needed and topics expanded depending on interest. Possible additional directions include probing models for syntactic trees (e.g., Hewitt and Manning, 2019, A Structural Probe for Finding Syntax in Word Representations), superficialism (e.g., Rey, 2020, Representation of Language), and/or meta-learning for adding linguistic knowledge to models (e.g., McCoy et al., 2020, Universal linguistic inductive biases via meta-learning).
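
As a concrete illustration of the grammaticality/probability question, here is a minimal sketch of the kind of minimal-pair comparison used in this literature (a sketch under stated assumptions, not course material: it assumes the Hugging Face transformers package, uses GPT-2 as a freely available stand-in for GPT-3, and the log_prob helper and example pair are invented for illustration):

  import torch
  from transformers import GPT2LMHeadModel, GPT2TokenizerFast

  tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
  model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

  def log_prob(sentence: str) -> float:
      """Summed log probability of the sentence's tokens (after the
      first) under the model; higher means more probable."""
      ids = tokenizer(sentence, return_tensors="pt").input_ids
      with torch.no_grad():
          # .loss is the mean negative log-likelihood per predicted token
          loss = model(ids, labels=ids).loss
      return -loss.item() * (ids.shape[1] - 1)

  # A subject-verb agreement minimal pair: if the model tracks the
  # dependency across the intervening PP, the grammatical variant
  # should receive the higher score.
  good = "The keys to the cabinet are on the table."
  bad = "The keys to the cabinet is on the table."
  print(log_prob(good) > log_prob(bad))   # expected: True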

No programming will be necessary for this course. Instead, the goal is to bring linguists from a variety of backgrounds into conversation with recent developments in computational modeling (and the excitement around their ‘abilities’). Supplemental interactive code notebooks may be circulated for those interested in engaging more deeply with the computational experiments highlighted in the course.

For those enrolled, there are three requirements: (i) active participation in class discussions; (ii) posting comments/questions/thoughts on readings via Canvas; and (iii) a squib to be submitted at the end of the course. Visitors are welcome – either regularly or sporadically! Please send me your email address if you are not registered so that I can add you to the Canvas site.

Quoting from Giorgio’s quote from Amir and Kai’s quote of Bob’s seminar description: “The schedule will be flexible and open ended, following the discussion where it leads, and spending as much time on each topic as it seems to need. That is, we will make it up as we go along.”


24.S95: Special topics: Linguistics in K-12 education

  • Instructor: Maya Honda
  • Wednesdays, 2-5pm
  • Room: 26-142

In this seminar, we will explore the idea that the study of language in K-12 (kindergarten-grade 12) education can be a means to develop young people’s understanding of scientific inquiry as well as their understanding of the nature of language. We will examine the view that the native language knowledge that each student brings to the classroom comprises a rich, accessible database, which can be used to give students the opportunity to become familiar with the methods, concepts, and attitudes of scientific inquiry. We will probe past and current efforts to engage young learners in linguistic inquiry and consider how to advance this work.

The challenge of this seminar is to create pedagogical materials and methods that will motivate learners of all ages to be inquisitive about their native language and about language in general, with a primary focus on secondary school students (grades 6-12). Seminar participants will work with one another and in partnership with K-12 teachers whenever possible.

There are two prerequisites for the seminar: the first is that you come motivated to make linguistic inquiry accessible to all, and the second is that you come committed to collaborating with others in this work. Previous experience teaching linguistics at any level is welcome, but not required. Graduate students from other departments and undergraduates are also welcome if they have taken a linguistics course or if they have the instructor’s approval.