6 - Tree adjoining grammars: How much context-sensitivity is required to provide reasonable structural descriptions?
  • Print publication year: 1985
  • Online publication date: January 2010

Summary

Since the late 1970s there has been vigorous activity in constructing highly constrained grammatical systems by eliminating the transformational component either totally or partially. There is increasing recognition of the fact that the entire range of dependencies that transformational grammars in their various incarnations have tried to account for can be captured satisfactorily by classes of rules that are nontransformational and at the same time highly constrained in terms of the classes of grammars and languages they define.

Two types of dependencies are especially important: subcategorization and filler-gap dependencies. Moreover, these dependencies can be unbounded, and accounting for unbounded dependencies was one of the motivations for transformations. The nontransformational grammars account for unbounded dependencies in different ways. In a tree adjoining grammar (TAG), unboundedness is achieved by factoring the dependencies and recursion in a novel and linguistically interesting manner. All dependencies are defined on a finite set of basic structures (trees), which are bounded; unboundedness is then a corollary of a particular composition operation called adjoining. In this sense there are thus no unbounded dependencies.
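
The chapter gives no code, but a rough sketch may help make the adjoining operation concrete. The sketch below is an illustrative assumption, not taken from the text: the Node, adjoin, find_foot, and show names and the example trees are invented for this purpose. An auxiliary tree whose root and foot node carry the same label is spliced into a tree at a matching interior node; the material previously below that node is reattached under the foot, so repeated adjoining produces unbounded embedding from a finite set of bounded elementary trees.

```python
# Minimal sketch of TAG adjoining; all names here are illustrative assumptions.
import copy
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)
    foot: bool = False  # marks the foot node of an auxiliary tree


def find_foot(node: Node) -> Optional[Node]:
    """Return the (unique) foot node in the subtree rooted at `node`, if any."""
    if node.foot:
        return node
    for child in node.children:
        found = find_foot(child)
        if found is not None:
            return found
    return None


def adjoin(site: Node, aux_root: Node) -> None:
    """Adjoin a copy of the auxiliary tree `aux_root` at the interior node `site`.

    The subtree below `site` is excised, the auxiliary tree is spliced in at
    `site`, and the excised material is reattached under the foot node.
    """
    assert site.label == aux_root.label, "adjunction site and auxiliary root must match"
    aux = copy.deepcopy(aux_root)   # copy so the elementary tree stays reusable
    excised = site.children         # material currently below the adjunction site
    site.children = aux.children    # the auxiliary tree takes its place
    foot = find_foot(site)
    assert foot is not None, "an auxiliary tree must contain a foot node"
    foot.children = excised         # excised material hangs from the foot node
    foot.foot = False


def show(node: Node) -> str:
    """Labelled-bracket rendering of a tree."""
    if not node.children:
        return node.label
    return "[" + node.label + " " + " ".join(show(c) for c in node.children) + "]"


# Initial tree: [S [NP Harry] [VP [V likes] [NP peanuts]]]
initial = Node("S", [
    Node("NP", [Node("Harry")]),
    Node("VP", [Node("V", [Node("likes")]), Node("NP", [Node("peanuts")])]),
])

# Auxiliary tree: [S [NP Bill] [VP [V thinks] S*]]  (S* is the foot node)
aux = Node("S", [
    Node("NP", [Node("Bill")]),
    Node("VP", [Node("V", [Node("thinks")]), Node("S", foot=True)]),
])

adjoin(initial, aux)   # "Bill thinks Harry likes peanuts"
adjoin(initial, aux)   # "Bill thinks Bill thinks Harry likes peanuts"
print(show(initial))
```

Each elementary tree in this toy example is bounded, yet adjoining the auxiliary tree repeatedly deepens the embedding without limit, which is the sense in which unboundedness falls out of the composition operation rather than being stated in the dependencies themselves.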

This factoring of recursion and dependencies contrasts with transformational grammars (TGs), where recursion is defined in the base and the transformations essentially carry out the checking of the dependencies. Phrase linking grammars (PLGs) (Peters and Ritchie, 1982) and lexical functional grammars (LFGs) (Kaplan and Bresnan, 1983) share this aspect of TGs; that is, recursion builds up a set of structures, some of which are then filtered out by transformations in a TG, by the constraints on linking in a PLG, and by the constraints introduced via the functional structures in an LFG.

Natural Language Parsing
  • Online ISBN: 9780511597855
  • Book DOI: https://doi.org/10.1017/CBO9780511597855