Unsupervised dependency parsing without training

Published online by Cambridge University Press: 14 March 2012

ANDERS SØGAARD*
Affiliation:
Center for Language Technology, University of Copenhagen, Njalsgade 142, DK-2300 Copenhagen S, Denmark. E-mail: soegaard@hum.ku.dk

Abstract

Usually unsupervised dependency parsers try to optimize the probability of a corpus by revising the dependency model that is assumed to have generated the corpus. In this paper we explore a different view in which a dependency structure is, among other things, a partial order on the nodes in terms of centrality or saliency. Under this assumption we directly model centrality and derive dependency trees from the ordering of words. The result is an approach to unsupervised dependency parsing that is very different from standard ones in that it requires no training data. The input words are ordered by centrality, and a parse is derived from the ranking using a simple deterministic parsing algorithm, relying on the universal dependency rules defined by Naseem et al. (Naseem, T., Chen, H., Barzilay, R., Johnson, M. 2010. Using universal linguistic knowledge to guide grammar induction. In Proceedings of Empirical Methods in Natural Language Processing, Boston, MA, USA, pp. 1234–44.). Our approach is evaluated on data from twelve different languages and is remarkably competitive.
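
The abstract describes the approach only at a high level, so the Python sketch below is a hypothetical illustration of the general idea rather than the paper's algorithm: it ranks word positions with PageRank over an assumed co-occurrence graph (a stand-in for the centrality model) and then deterministically attaches each word to the nearest already-attached, more central word. The graph construction, the window size, and the attachment rule are illustrative assumptions, and the universal dependency rules of Naseem et al. (2010) that constrain the actual parser are not modelled here.

# Hypothetical sketch of "rank by centrality, then parse deterministically".
# Assumptions (not from the paper): a co-occurrence graph over word positions,
# PageRank as the centrality measure, and nearest-more-central-word attachment.
import networkx as nx

def rank_by_centrality(words, window=2):
    # Build an undirected graph linking each position to its neighbours
    # within the window, then order positions by PageRank score.
    graph = nx.Graph()
    graph.add_nodes_from(range(len(words)))
    for i in range(len(words)):
        for j in range(i + 1, min(i + 1 + window, len(words))):
            graph.add_edge(i, j)
    scores = nx.pagerank(graph)
    # Most central (salient) position first.
    return sorted(range(len(words)), key=lambda i: -scores[i])

def parse_from_ranking(words, ranking):
    # The most central word becomes the root; every other word attaches to
    # the closest word that is ranked above it (greedy, deterministic).
    heads = {ranking[0]: -1}
    for position in ranking[1:]:
        more_central = list(heads)
        heads[position] = min(more_central, key=lambda c: abs(c - position))
    return [heads[i] for i in range(len(words))]

words = "the cat chased the mouse".split()
print(parse_from_ranking(words, rank_by_centrality(words)))  # head index per word; -1 marks the root

In the paper itself, the deterministic attachment step is guided by the part-of-speech-based universal dependency rules; the greedy distance heuristic above is only a placeholder for that step.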

Type
Articles
Copyright
Copyright © Cambridge University Press 2012

References

Agirre, E., and Soroa, A. 2009. Personalizing PageRank for word sense disambiguation. In Proceedings of the European Chapter of the Association for Computational Linguistics, Athens, Greece, pp. 33–41.
Brin, S., and Page, L. 1998. The anatomy of a large-scale hypertextual web search engine. In Proceedings of the International Web Conference, Brisbane, Australia, pp. 107–17.
Brody, S. 2010. It depends on the translation: unsupervised dependency parsing via word alignment. In Proceedings of Empirical Methods in Natural Language Processing, Boston, MA, USA, pp. 1214–22.
Buchholz, S., and Marsi, E. 2006. CoNLL-X shared task on multilingual dependency parsing. In Proceedings of Computational Natural Language Learning, New York City, NY, USA, pp. 149–64.
Cohen, S., Das, D., and Smith, N. 2011a. Unsupervised structure prediction with non-parallel multilingual guidance. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 50–61.
Cohen, S., Gimpel, K., and Smith, N. 2008. Logistic normal priors for unsupervised probabilistic grammar induction. In Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems (NIPS), Vancouver, BC, Canada, December 8–11, pp. 321–8.
Cohen, S., Rodriguez, C., and Satta, G. 2011b. Exact inference for generative probabilistic non-projective dependency parsing. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 1234–45.
Druck, G., Mann, G., and McCallum, A. 2009. Semi-supervised learning of dependency parsers using generalized expectation criteria. In Proceedings of Association for Computational Linguistics, Singapore, pp. 360–8.
Eisner, J., and Smith, N. A. 2005. Parsing with soft and hard constraints on dependency length. In Proceedings of International Conference on Parsing Technologies, Vancouver, BC, Canada, pp. 30–41.
Ganesan, K., Zhai, C., and Han, J. 2010. Opinosis: a graph-based approach to abstractive summarization of highly redundant opinions. In Proceedings of International Conference on Computational Linguistics, Beijing, China, pp. 340–8.
Gillenwater, J., Ganchev, K., Graca, J., Pereira, F., and Taskar, B. 2010. Sparsity in dependency grammar induction. In Proceedings of Association for Computational Linguistics, Uppsala, Sweden, pp. 194–9.
Headden, W., Johnson, M., and McClosky, D. 2009. Improving unsupervised dependency parsing with richer contexts and smoothing. In Proceedings of North American Chapter of the Association for Computational Linguistics, Boulder, CO, USA, pp. 101–9.
Klein, D., and Manning, C. 2004. Corpus-based induction of syntactic structure: models of dependency and constituency. In Proceedings of Association for Computational Linguistics, Barcelona, Spain, pp. 478–85.
Maier, W., and Søgaard, A. 2008. Treebanks and mild context-sensitivity. In Formal Grammar, Hamburg, Germany, pp. 61–76.
Marcus, M., Marcinkiewicz, M., and Santorini, B. 1993. Building a large annotated corpus of English: the Penn Treebank. Computational Linguistics 19 (2): 313–30.
McDonald, R., Pereira, F., Ribarov, K., and Hajič, J. 2005. Non-projective dependency parsing using spanning tree algorithms. In Proceedings of Empirical Methods in Natural Language Processing, Vancouver, BC, Canada, pp. 523–30.
McDonald, R., Petrov, S., and Hall, K. 2011. Multi-source transfer of delexicalized dependency parsers. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 62–72.
McDonald, R., and Satta, G. 2007. On the complexity of non-projective data-driven dependency parsing. In Proceedings of International Conference on Parsing Technologies, Prague, Czech Republic, pp. 121–32.
Mihalcea, R., and Tarau, P. 2004. TextRank: bringing order into texts. In Proceedings of Empirical Methods in Natural Language Processing, Barcelona, Spain, pp. 404–11.
Naseem, T., Chen, H., Barzilay, R., and Johnson, M. 2010. Using universal linguistic knowledge to guide grammar induction. In Proceedings of Empirical Methods in Natural Language Processing, Boston, MA, USA, pp. 1234–44.
Nivre, J. 2009. Non-projective dependency parsing in expected linear time. In Proceedings of Association for Computational Linguistics, Singapore, pp. 351–9.
Petrov, S., Das, D., and McDonald, R. 2011. A universal part-of-speech tagset. CoRR abs/1104.2086.
Schwartz, R., Abend, O., Reichart, R., and Rappoport, A. 2011. Neutralizing linguistically problematic annotations in unsupervised dependency parsing evaluation. In Proceedings of Association for Computational Linguistics, Portland, OR, USA, pp. 663–72.
Seginer, Y. 2007. Fast unsupervised incremental parsing. In Proceedings of Association for Computational Linguistics, Prague, Czech Republic, pp. 384–91.
Smith, N., and Eisner, J. 2005. Contrastive estimation: training log-linear models on unlabeled data. In Proceedings of Association for Computational Linguistics, Ann Arbor, MI, USA, pp. 354–62.
Smith, N., and Eisner, J. 2006. Annealing structural bias in multilingual weighted grammar induction. In Proceedings of Association for Computational Linguistics, Sydney, Australia, pp. 569–76.
Smith, D., and Eisner, J. 2009. Parser adaptation and projection with quasi-synchronous grammar features. In Proceedings of Empirical Methods in Natural Language Processing, Singapore, pp. 822–31.
Søgaard, A. 2011. Data point selection for cross-language adaptation of dependency parsers. In Proceedings of Association for Computational Linguistics, Portland, OR, USA, pp. 682–6.
Spitkovsky, V., Alshawi, H., Chang, A., and Jurafsky, D. 2011a. Unsupervised dependency parsing without gold part-of-speech tags. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 1281–90.
Spitkovsky, V., Alshawi, H., and Jurafsky, D. 2009. Baby steps: how “less is more” in unsupervised dependency parsing. In Proceedings of the NIPS Workshop on Grammar Induction, Representation of Language and Language Learning, Whistler, BC, Canada, pp. 1–9.
Spitkovsky, V., Alshawi, H., and Jurafsky, D. 2011b. Lateen EM: unsupervised training with multiple objectives applied to dependency grammar induction. In Proceedings of Empirical Methods in Natural Language Processing, Edinburgh, Scotland, UK, pp. 1269–80.
Spreyer, K., and Kuhn, J. 2009. Data-driven dependency parsing of new languages using incomplete and noisy training data. In Proceedings of Computational Natural Language Learning, Boulder, CO, USA, pp. 12–20.