Unsupervised dependency parsing without training

  • Anders Søgaard
Abstract

Unsupervised dependency parsers usually try to optimize the probability of a corpus by revising the dependency model assumed to have generated it. In this paper we explore a different view, in which a dependency structure is, among other things, a partial order on the nodes in terms of centrality or saliency. Under this assumption we model centrality directly and derive dependency trees from the resulting ordering of words. The result is an approach to unsupervised dependency parsing that differs markedly from standard ones in that it requires no training data: the input words are ordered by centrality, and a parse is derived from the ranking by a simple deterministic parsing algorithm that relies on the universal dependency rules of Naseem et al. (2010). Our approach is evaluated on data from twelve different languages and is remarkably competitive.
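The deterministic parsing step sketched in the abstract can be illustrated as follows. This is a minimal sketch, not the paper's implementation: the centrality ranking (e.g. produced by a PageRank-style algorithm over a word graph) and the universal POS rules of Naseem et al. (2010) are assumed as given inputs, and all function and variable names are hypothetical.

```python
def parse_from_ranking(words, rank, allowed):
    """Derive a dependency tree from a centrality ranking (illustrative sketch).

    words:   list of (token, pos) pairs in sentence order
    rank:    list of word indices, most central first
    allowed: set of (head_pos, dep_pos) pairs, standing in for the
             universal dependency rules of Naseem et al. (2010)
    Returns a list of head indices, one per word (-1 marks the root).
    """
    heads = [None] * len(words)
    heads[rank[0]] = -1          # the most central word becomes the root
    attached = [rank[0]]
    for i in rank[1:]:
        # attach word i to the nearest already-attached word that the
        # rule set licenses as its head; fall back to the nearest
        # attached word if no rule applies
        candidates = [j for j in attached
                      if (words[j][1], words[i][1]) in allowed]
        pool = candidates or attached
        heads[i] = min(pool, key=lambda j: abs(j - i))
        attached.append(i)
    return heads

# toy example with a hypothetical rule set
words = [("the", "DET"), ("dog", "NOUN"), ("barks", "VERB")]
rank = [2, 1, 0]                                   # "barks" ranked most central
allowed = {("VERB", "NOUN"), ("NOUN", "DET")}
print(parse_from_ranking(words, rank, allowed))    # → [1, 2, -1]
```

Because every word attaches to an already-placed, more central word, the procedure always yields a single rooted tree; no probabilities are estimated and no training corpus is consulted.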

References
Agirre, E., and Soroa, A. 2009. Personalizing PageRank for word sense disambiguation. In Proceedings of the European Chapter of the Association for Computational Linguistics, Athens, Greece, pp. 33–41.
Brin, S., and Page, L. 1998. The anatomy of a large-scale hypertextual web search engine. In Proceedings of the International Web Conference, Brisbane, Australia, pp. 107–17.
Brody, S. 2010. It depends on the translation: unsupervised dependency parsing via word alignment. In Proceedings of Empirical Methods in Natural Language Processing, Boston, MA, USA, pp. 1214–22.
Buchholz, S., and Marsi, E. 2006. CoNLL-X shared task on multilingual dependency parsing. In Proceedings of Computational Natural Language Learning, New York City, NY, USA, pp. 149–64.
Cohen, S., Das, D., and Smith, N. 2011a. Unsupervised structure prediction with non-parallel multilingual guidance. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 50–61.
Cohen, S., Gimpel, K., and Smith, N. 2008. Logistic normal priors for unsupervised probabilistic grammar induction. In Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems (NIPS), Vancouver, BC, Canada, December 8–11, pp. 321–8.
Cohen, S., Rodriguez, C., and Satta, G. 2011b. Exact inference for generative probabilistic non-projective dependency parsing. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 1234–45.
Druck, G., Mann, G., and McCallum, A. 2009. Semi-supervised learning of dependency parsers using generalized expectation criteria. In Proceedings of Association for Computational Linguistics, Singapore, pp. 360–8.
Eisner, J., and Smith, N. A. 2005. Parsing with soft and hard constraints on dependency length. In Proceedings of International Conference on Parsing Technologies, Vancouver, BC, Canada, pp. 30–41.
Ganesan, K., Zhai, C., and Han, J. 2010. Opinosis: a graph-based approach to abstractive summarization of highly redundant opinions. In Proceedings of International Conference on Computational Linguistics, Beijing, China, pp. 340–8.
Gillenwater, J., Ganchev, K., Graca, J., Pereira, F., and Taskar, B. 2010. Sparsity in dependency grammar induction. In Proceedings of Association for Computational Linguistics, Uppsala, Sweden, pp. 194–9.
Headden, W., Johnson, M., and McClosky, D. 2009. Improving unsupervised dependency parsing with richer contexts and smoothing. In Proceedings of North American Chapter of the Association for Computational Linguistics, Boulder, CO, USA, pp. 101–9.
Klein, D., and Manning, C. 2004. Corpus-based induction of syntactic structure: models of dependency and constituency. In Proceedings of Association for Computational Linguistics, Barcelona, Spain, pp. 478–85.
Maier, W., and Søgaard, A. 2008. Treebanks and mild context-sensitivity. In Proceedings of Formal Grammar, Hamburg, Germany, pp. 61–76.
Marcus, M., Marcinkiewicz, M., and Santorini, B. 1993. Building a large annotated corpus of English: the Penn Treebank. Computational Linguistics 19 (2): 313–30.
McDonald, R., Pereira, F., Ribarov, K., and Hajič, J. 2005. Non-projective dependency parsing using spanning tree algorithms. In Proceedings of Empirical Methods in Natural Language Processing, Vancouver, BC, Canada, pp. 523–30.
McDonald, R., Petrov, S., and Hall, K. 2011. Multi-source transfer of delexicalized dependency parsers. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 62–72.
McDonald, R., and Satta, G. 2007. On the complexity of non-projective data-driven dependency parsing. In Proceedings of International Conference on Parsing Technologies, Prague, Czech Republic, pp. 121–32.
Mihalcea, R., and Tarau, P. 2004. TextRank: bringing order into texts. In Proceedings of Empirical Methods in Natural Language Processing, Barcelona, Spain, pp. 404–11.
Naseem, T., Chen, H., Barzilay, R., and Johnson, M. 2010. Using universal linguistic knowledge to guide grammar induction. In Proceedings of Empirical Methods in Natural Language Processing, Boston, MA, USA, pp. 1234–44.
Nivre, J. 2009. Non-projective dependency parsing in expected linear time. In Proceedings of Association for Computational Linguistics, Singapore, pp. 351–9.
Petrov, S., Das, D., and McDonald, R. 2011. A universal part-of-speech tagset. CoRR abs/1104.2086.
Schwartz, R., Abend, O., Reichart, R., and Rappoport, A. 2011. Neutralizing linguistically problematic annotations in unsupervised dependency parsing evaluation. In Proceedings of Association for Computational Linguistics, Portland, OR, USA, pp. 663–72.
Seginer, Y. 2007. Fast unsupervised incremental parsing. In Proceedings of Association for Computational Linguistics, Prague, Czech Republic, pp. 384–91.
Smith, N., and Eisner, J. 2005. Contrastive estimation: training log-linear models on unlabeled data. In Proceedings of Association for Computational Linguistics, Ann Arbor, MI, pp. 354–62.
Smith, N., and Eisner, J. 2006. Annealing structural bias in multilingual weighted grammar induction. In Proceedings of Association for Computational Linguistics, Sydney, Australia, pp. 569–76.
Smith, D., and Eisner, J. 2009. Parser adaptation and projection with quasi-synchronous grammar features. In Proceedings of Empirical Methods in Natural Language Processing, Singapore, pp. 822–31.
Søgaard, A. 2011. Data point selection for cross-language adaptation of dependency parsers. In Proceedings of Association for Computational Linguistics, Portland, OR, USA, pp. 682–6.
Spitkovsky, V., Alshawi, H., Chang, A., and Jurafsky, D. 2011a. Unsupervised dependency parsing without gold part-of-speech tags. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 1281–90.
Spitkovsky, V., Alshawi, H., and Jurafsky, D. 2009. Baby steps: how “less is more” in unsupervised dependency parsing. In Proceedings of NIPS Workshop on Grammar Induction, Representation of Language and Language Learning, Whistler, BC, Canada, pp. 1–9.
Spitkovsky, V., Alshawi, H., and Jurafsky, D. 2011b. Lateen EM: unsupervised training with multiple objectives applied to dependency grammar induction. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 1269–80.
Spreyer, K., and Kuhn, J. 2009. Data-driven dependency parsing of new languages using incomplete and noisy training data. In Proceedings of Computational Natural Language Learning, Boulder, CO, USA, pp. 12–20.
Natural Language Engineering
  • ISSN: 1351-3249
  • EISSN: 1469-8110