The Weekly Newsletter of MIT Linguistics

Issue of Monday, April 15th, 2019

LF Reading Group 4/17 - Qi Hao (Harvard/Peking University)

Speaker: Qi Hao (Harvard/Peking University)
Title: The Syntax/Semantics of Numeral Classifiers in Mandarin Chinese and Numeral Mapping Parameter
Time: Wednesday, April 17th, 1-2PM
Location: 32-D461

It is a well-known fact that classifiers are needed for grammatical counting in classifier languages such as Mandarin Chinese.

(1) a. yi ge xiaohai
one CL_GENERAL child

b. san ben shu
three CL_VOLUME book

c. liang ping shui
two CL_BOTTLE water

There are three basic and related questions we want to ask about the Num-Cl-N Construction:

A. What is the internal constituency, [Num-Cl]-N or Num-[Cl-N]?
B. What is the function/semantics of classifiers?
C. How to account for the presence/absence of the classifier system among languages?

The mainstream analysis in the generative framework is that classifiers are functional morphemes on the extended projection of nominals, functioning as divider heads (Borer 2005), as type shifters from kinds to predicates (Chierchia 1998, Jiang 2012), or as the syntactic counterpart of the COUNT operation (Rothstein 2010, X.P. Li 2013), making mass/kind-denoting nouns countable with numerals. That is to say, classifiers take nouns as complements, and the Cl-N constituents further combine with numerals, yielding the structure [Num-[Cl-N]]. The presence of a classifier system is also correlated with the absence of plural morphology as well as the absence of articles (see Chierchia 1998). However, analyses within this tradition face difficulties with languages like Old Chinese, which lacks both plural morphology and articles, yet shows no classifiers, as in (2).

(2) san ren xing, bi you wo shi yan. (The Analects)
three person walk must have my teacher at-there
‘If three people walk together, there must be a teacher of mine among them.’

Furthermore, it would be very hard for the former treatments to explain the fact that numerals can stand alone in predicate and argument positions in English (as in (3)) as well as in Old Chinese (as in (5)), but not in Modern Mandarin (as in (4)).

(3) a. Apples on the table are three.
b. As for apples, I bought three.

(4) a. Zhuozi-shang de pingguo shi *san/san-ge.
Table-LOCTOP MOD apple COP three/three-CL_GEN (same intended meaning as in 3a)
b. Pingguo ma, wo mai-le *san/san-ge.
Apple PART, I buy-ASPT three/three-CL_GEN (same intended meaning as in 3b)

(5) a. Shi you bu ke zhi zhe san. (Records of the Grand Historian)
Thing have not can know NOM three
‘Things which cannot be known are three.’

b. Zheng yue, zuo san jun, san fen gongshi er ge you qi yi. (Zuo Zhuan)
First month, form three army, three divide country CONJ each have its one
“In the first month, (Jiwuzi) formed three armies, and divided the country into three parts and then each (of the three people) had one of them.”

Following the spirit of the Semantic Parameter in Chierchia (1998), I propose that a parameter governing the semantics of numerals is required to account for the above facts, which can be called the "Numeral Mapping Parameter". It can be represented as follows:

(6) a. Numerals map to type ⟨e,t⟩ in non-classifier languages such as English and Old Chinese.
b. Numerals map to type n (numbers/cardinals) in classifier languages such as Mandarin Chinese.

More precisely, we should say that numerals in all languages start as individual expressions of type n, denoting numbers. The et-type use is then derived from the number expression by the following processes (Landman 2004, Rothstein 2009):

(7) a. three: 3 (type n)
b. IDENT(three): λn. n = 3 (raised to a predicate by IDENT)
c. function composition: (λn. n = 3) ∘ | | = λx. |x| = 3 (composed with the cardinality function | |)
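The derivation in (7) can be mimicked with ordinary higher-order functions. Below is a toy Python sketch, assuming pluralities are modeled as sets of atoms and the cardinality function as len(); the names ident, card, and compose are illustrative, not from the talk handout.

```python
# Toy model of derivation (7): numeral -> predicate over pluralities.

def ident(n):
    """(7b) IDENT: raise a number n to the predicate λm. m = n."""
    return lambda m: m == n

def card(x):
    """| | : the cardinality function, from pluralities to numbers."""
    return len(x)

def compose(p, f):
    """(7c) function composition: λx. p(f(x))."""
    return lambda x: p(f(x))

three = 3                                   # (7a) the numeral, type n
three_pred = compose(ident(three), card)    # λx. |x| = 3, type ⟨e,t⟩

apples = {"a1", "a2", "a3"}
print(three_pred(apples))                   # True: a plurality of three atoms
print(three_pred({"a1"}))                   # False
```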

The derivation in (7) is a lexical rule in English, so English numerals are of type et as syntactic primitives. Mandarin, however, lacks such a rule in the lexicon, and hence its numerals are of type n when they enter the syntax. This semantic deficiency of numerals is the genuine reason why we see classifiers in Mandarin: classifiers are of a type that turns n-type numerals into et-type expressions. The semantics of classifiers can be represented as follows:

(8) a. [[Cl]] = λP λn λx [P_UNIT(x) & |x|_P = n] if P_UNIT(x) is defined, else
[[Cl]] = λP λn λx [|x|_P = n] (the case of kilos, liters)
(P stands for classifier roots)

b. [[san-Cl_ben]] = λx [BEN_UNIT(x) & |x|_BEN = 3]
(paraphrase: x comes in natural units of volume, and the cardinality of x, measured in units of volume, is three)
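The classifier denotation in (8) can likewise be sketched as a curried function. In this toy Python model (the unit check and all names are invented for illustration), a classifier root supplies a unit predicate and a measure, and the result, once saturated by a number, is a predicate over pluralities:

```python
# Toy sketch of classifier semantics (8): λP λn λx [P_UNIT(x) & |x|_P = n].

def CL(p_unit, measure):
    """Takes a classifier root (unit predicate + measure), then a
    number n, and returns a predicate over entities.  When the unit
    predicate is undefined (kilos, liters), only the measure conjunct
    remains, as in the second clause of (8a)."""
    if p_unit is None:                       # the pure measure-word case
        return lambda n: (lambda x: measure(x) == n)
    return lambda n: (lambda x: p_unit(x) and measure(x) == n)

# 'ben' (volume classifier): x comes in natural volume units (books),
# and |x|_BEN counts those units.  The unit check here is a stand-in.
ben_unit = lambda x: all(isinstance(v, str) for v in x)
ben_measure = len

san_ben = CL(ben_unit, ben_measure)(3)       # (8b): λx. BEN_UNIT(x) & |x|_BEN = 3
print(san_ben({"book1", "book2", "book3"}))  # True
```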

Then we can say that the "classifier" is built into the semantics of English numerals (with the very abstract meaning 'object unit'), while this category must be independently encoded in Mandarin syntax due to the Numeral Mapping Parameter. The internal structure of the Num-Cl-N sequence is therefore [Num-Cl]-N, contrary to most former treatments. Further evidence will be given to support this.

We will also reply to X.P. Li's (2013) structural ambiguity analysis of the individuating and measuring readings of the classifier construction, and argue that the two readings can be derived from a unified syntax. Finally, we will make a preliminary attempt to answer the question of whether the mass-count distinction really exists in classifier languages such as Mandarin.

Phonology Circle 4/17 - Anton Kukhto (MIT) presents Shih & Zuraw (2017)

In this week’s meeting of Phonology Circle, Anton Kukhto (MIT) will lead a discussion of Shih & Zuraw’s 2017 paper: Phonological conditions on variable adjective and noun word order in Tagalog. You can download the paper here.

Discussion leader: Anton Kukhto (MIT)
Paper: Shih, S. S., & Zuraw, K. (2017). Phonological conditions on variable adjective and noun word order in Tagalog. Language, 93(4), 317-352 (available here).
Time: Wednesday (4/17), 5:00-6:30pm
Location: 32-D831

Tagalog adjectives and nouns variably occur in two word orders, separated by an intermediary linker: adjective-linker-noun versus noun-linker-adjective. The linker has two phonologically conditioned surface forms, -ng and na. This article presents a large-scale corpus study of adjective/noun order variation in Tagalog, focusing in particular on phonological conditions. Results show that word-order variation in adjective/noun pairs optimizes for phonological structure, abiding by phonotactic, syllabic, and morphophonological well-formedness preferences that are also found elsewhere in Tagalog grammar. The results indicate that surface phonological information is accessible for word-order choice.

Ling-Lunch 4/18 - Conor McDonough Quinn (University of Southern Maine)

Speaker: Conor McDonough Quinn (University of Southern Maine)
Title: Animacy, obviation, inverse, and delightful phonological subtleties: a call to look more at Passamaquoddy-Wolastoqew(Maliseet) and relatives, and helps for formal theory from endangered language pedagogy
Time: Thursday, 4/18, 12:30-1:50pm
Location: 32-D461


Starting with core discussions of three traditionally thorny phenomena in Algonquian morphosyntax:

  • Algonquian grammatical animate status as formal “mission creep” from semantic animacy, via highly constrained analogical “families” that shift over time and space, but are reliably productive for any given speaker.
  • Proximate-obviative as marking referential entailment dependency: in 3rd+3rd Goal-Theme configurations (verbal-ditransitive or nominal-possessive), the Theme cannot be proximate relative to the obviative Goal; only the reverse. E.g. in [HER MOTHER], [MOTHER] must always be obviative. This restriction seems tied crucially to the observation that knowing the complete reference of [MOTHER] in [HER MOTHER] entails knowing the reference of HER.
  • Inverse for 3→{1,2} configurations in Algonquian languages being only strictly required for clause-types realized morphosyntactically as possessed nominals: in them, core transitive arguments are introduced via a Goal-Theme (Possessor-Possessee) configuration, and so are subject to a Person-Case Constraint effect.

We then briefly note three other features in desperate need of in-depth research: (a) head-marking for oblique (spatial, manner, temporal, etc.) arguments; (b) nominal tense; (c) standalone Secondary Objects (= morphosyntactically same as ditransitive Themes, but with no overt/interpreted Goal argument). Also briefly sketched are three under-researched phonetic-phonological phenomena: (a) iambic weak/strong-schwa alternations + related rich initial/final consonant clusters; (b) contrastive pitch-accent + final vowel deletion alternations; (c) preaspiration and gemination in a voicing-noncontrasting phonational system. Finally, we observe how being “reduced to” minimalist, non-technical, and pragmatic-communicatively-grounded presentations of these kinds of unfamiliar linguistic phenomena—namely, doing what it takes to teach them genuinely effectively to adult learners—creates fertile ground for innovative rethinking.

A longer version of the abstract can be found here.

CompLang 4/18 - Tal Linzen (Johns Hopkins University)

Speaker: Tal Linzen (Johns Hopkins University)
Title: Linguistics in the age of deep learning
Time: Thursday, 4/18, 5-6pm
Location: 32-141


Deep learning systems with minimal or no explicit linguistic structure have recently proved to be surprisingly successful in language technologies. What, then, is the role of linguistics in language technologies in the deep learning age? I will argue that the widespread use of these “black box” models provides an opportunity for a new type of contribution: characterizing the desired behavior of the system along interpretable axes of generalization from the training set, and identifying the areas in which the system falls short of that standard.

I will illustrate this approach in word prediction (language models) and natural language inference. I will show that recurrent neural network language models are able to process many syntactic dependencies in typical sentences with considerable success, but when evaluated on carefully controlled materials, their error rate increases sharply. Perhaps more strikingly, neural inference systems (including ones based on the widely popular BERT model), which appear to be quite accurate according to the standard evaluation criteria used in the NLP community, perform very poorly in controlled experiments; for example, they universally infer from “the judge chastised the lawyer” that “the lawyer chastised the judge”. Finally, if time permits, I will show how neural network models can be used to address classic questions in linguistics, in particular by providing a platform for testing for the necessity and sufficiency of explicit structural biases in the acquisition of syntactic transformations.

MIT Colloquium 4/19 - Dan Lassiter (Stanford)

Speaker: Dan Lassiter (Stanford University)
Title: Mathematical Counterfactuals
Time and Place: Friday, April 19, 3:30-5:00pm, room 32-141

Counterfactual reasoning about mathematical truths (“If 7 + 5 were 11, I’d have gotten a perfect score on the math test”) presents an important challenge to standard accounts of the semantics of conditionals. I describe a semantics based on Pearl-style interventions on generative models and show that it provides a simple account of mathematical counterfactuals that also coheres well with research on mathematical cognition. The approach is related to the possible worlds theory in that the models are recipes for generating descriptions of possible worlds, but their procedural character is crucial in supporting interventions and the kind of partiality that I argue we need to render mathematical counterfactuals meaningful.
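As a rough illustration of the kind of machinery the abstract alludes to, here is a minimal Python sketch of a Pearl-style intervention (the do-operator) on a tiny generative model. The model, its variables, and the run/do helpers are invented for illustration and make no claim about Lassiter's actual proposal:

```python
# A tiny generative model as an ordered dict of mechanisms, each a
# function of the values computed so far, plus a do-operator that
# severs one mechanism and clamps its variable to a fixed value.

def run(model):
    """Evaluate the model top-down, returning the values of all variables."""
    values = {}
    for name, f in model.items():
        values[name] = f(values)
    return values

def do(model, var, value):
    """Pearl-style intervention: replace var's mechanism with a constant."""
    new = dict(model)
    new[var] = lambda v: value
    return new

model = {
    "sum": lambda v: 7 + 5,            # the arithmetic mechanism
    "answer": lambda v: 11,            # what I wrote on the test
    "score": lambda v: "perfect" if v["answer"] == v["sum"] else "imperfect",
}

print(run(model)["score"])                 # imperfect: 7 + 5 is 12
print(run(do(model, "sum", 11))["score"])  # perfect, under do(sum := 11)
```

Intervening on "sum" rather than conditioning on it is what lets the counterfactual antecedent ("if 7 + 5 were 11") be entertained without contradiction elsewhere in the model.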