Error-driven learning in Optimality Theory and Harmonic Grammar: a comparison*

Giorgio Magri
Abstract

OT error-driven learning admits guarantees of efficiency, stochastic tolerance and noise robustness which hold independently of any substantive assumptions on the constraints. This paper shows that the HG learner used in the current literature does not admit such constraint-independent guarantees. The HG theory of error-driven learning thus needs to be substantially restricted to specific constraint sets.
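The error-driven updates at issue can be illustrated with a minimal perceptron-style HG learner: a candidate's harmony is the negated weighted sum of its constraint violations, and the learner adjusts weights only when the current weights fail to prefer the intended winner. The sketch below is purely illustrative, with invented constraint names and violation profiles; it is not the paper's formal construction.

```python
# Hypothetical sketch of a perceptron-style error-driven HG update
# (illustrative only; not the paper's formal construction).

def harmony(weights, violations):
    """Harmony of a candidate: the negated weighted sum of its constraint violations."""
    return -sum(w * v for w, v in zip(weights, violations))

def hg_update(weights, winner_viols, loser_viols, rate=1.0):
    """One error-driven step: if the current weights fail to strictly prefer
    the intended winner, nudge each weight by the violation difference
    (increase weights of constraints the loser violates more, decrease
    weights of constraints the winner violates more)."""
    if harmony(weights, loser_viols) >= harmony(weights, winner_viols):
        weights = [w + rate * (lv - wv)
                   for w, wv, lv in zip(weights, winner_viols, loser_viols)]
    return weights

# Invented example with two constraints: the winner violates C1 once,
# the loser violates C2 once. Equal weights tie, triggering an update.
print(hg_update([1.0, 1.0], winner_viols=[1, 0], loser_viols=[0, 1]))  # → [0.0, 2.0]
```

Note that unconstrained updates of this kind can drive weights negative, one of the points on which HG error-driven learners differ from OT rankers over strict constraint hierarchies.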

Corresponding author
E-mail: magrigrg@gmail.com.
Footnotes
*

Parts of this paper were presented at the 21st Manchester Phonology Meeting in 2013 and at the 11th Old World Conference in Phonology in 2014. I wish to thank Paul Boersma and Joe Pater for useful discussion. Three anonymous reviewers and the associate editor of the journal also provided me with detailed and valuable suggestions. The research reported in this paper was supported by a grant from the Fyssen Research Foundation, as well as by a Marie Curie Intra European Fellowship within the 7th European Community Framework Programme.

Appendices providing more technical details and simulation results can be found in supplementary online materials at https://doi.org/10.1017/S0952675716000221.


Phonology · ISSN 0952-6757 · EISSN 1469-8188
Supplementary materials: Magri supplementary material (3.1 MB).
