Speaker: Aleksei Nazarov (Harvard)
Title: Learning parametric stress without domain-specific mechanisms
Date/Time: Monday, October 3, 5:00-6:30
Location: 32-D831
(Joint work with Gaja Jarosz (UMass))
A parametric approach to the acquisition of stress (Dresher and Kaye 1990, Hayes 1995) is attractive because it defines a small learning space. However, previous approaches (Dresher and Kaye 1990, Pearl 2007, 2011) have argued that domain-general learners, such as the Naïve Parameter Learner (NPL; Yang 2002), are not sufficient for learning stress parameters, and that UG must therefore contain domain-specific mechanisms for individual parameters: substantive “cues” as well as a parameter acquisition order. We argue that these conclusions are premature and instead propose modifying the non-selective way in which the NPL updates parameters.
Our proposed Expectation Driven Parameter Learner (EDPL) augments the NPL with a (linear-time) Expectation Maximization component along the lines of Jarosz (2015). We show that the EDPL, without any domain-specific mechanisms, performs very well (96% accuracy) on a representative subset of the typology defined by Dresher and Kaye (1990), while the NPL performs very poorly (4.3% accuracy). This suggests that UG can be kept simpler (parameters only, instead of parameters + cues + order) if the learner is allowed to process individual data points more thoroughly.
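The contrast between the two update rules can be illustrated with a toy sketch. This is a minimal, hypothetical two-parameter stress system invented for illustration (not the Dresher and Kaye parameter space used in the talk, and the function names and update details are assumptions): the NPL-style rule samples one full grammar and rewards or punishes all parameters together, whereas the expectation-driven rule moves each parameter toward its posterior probability given the datum, computed by marginalizing over the other parameter, in the spirit of Jarosz (2015).

```python
import random

# Toy parametric stress system (illustrative only).
# Parameter 0 ("edge"): 0 = main stress at the left edge, 1 = right edge.
# Parameter 1 ("alt"):  0 = no secondary stress, 1 = alternating stress.

def generate(edge, alt, n):
    """Stress pattern (1 = stressed) for a word of n syllables."""
    pattern = [0] * n
    if alt:
        positions = range(n - 1, -1, -2) if edge else range(0, n, 2)
        for i in positions:
            pattern[i] = 1
    else:
        pattern[n - 1 if edge else 0] = 1
    return pattern

TARGET = (1, 1)  # target grammar: right-edge, alternating
DATA = [(n, generate(*TARGET, n)) for n in (2, 3, 4)]

def parses(grammar, datum):
    n, pattern = datum
    return generate(grammar[0], grammar[1], n) == pattern

def likelihood(p, i, v, datum):
    """P(datum is parsed | parameter i fixed to value v), marginalizing
    over the other parameter under the current weights p (exact here,
    since the toy space has only four grammars)."""
    j = 1 - i
    total = 0.0
    for w in (0, 1):
        grammar = [0, 0]
        grammar[i], grammar[j] = v, w
        if parses(grammar, datum):
            total += p[j] if w else 1 - p[j]
    return total

def edpl_update(p, datum, gamma=0.1):
    """Expectation-driven update: move each parameter toward its
    posterior probability of being 1, given that the datum is parsed."""
    posteriors = []
    for i in (0, 1):
        l1 = p[i] * likelihood(p, i, 1, datum)
        l0 = (1 - p[i]) * likelihood(p, i, 0, datum)
        posteriors.append(l1 / (l1 + l0) if l1 + l0 > 0 else p[i])
    for i in (0, 1):
        p[i] += gamma * (posteriors[i] - p[i])

def npl_update(p, datum, gamma=0.1):
    """NPL-style update: sample one full grammar; on success reward ALL
    sampled parameter values, on failure punish them all, regardless of
    which parameters actually mattered for this datum."""
    sampled = tuple(1 if random.random() < p[i] else 0 for i in (0, 1))
    success = parses(sampled, datum)
    for i in (0, 1):
        target = sampled[i] if success else 1 - sampled[i]
        p[i] += gamma * (target - p[i])

p_edpl = [0.5, 0.5]
for _ in range(100):
    for datum in DATA:
        edpl_update(p_edpl, datum)
print(p_edpl)  # both weights approach 1.0, i.e. the target grammar
```

In this tiny space the credit-assignment problem is mild, so the toy is only meant to show the shape of the two rules: the expectation-driven learner updates each parameter only to the extent that the datum is informative about it, while the NPL's non-selective updates can reward or punish parameters that played no role in the outcome.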