Use a HeadFinder (if you're parsing English, typically the CollinsHeadFinder) to retrieve the head word or head constituent at each node of a phrase-structure tree. You can see an example of this kind of work in the TreeAnnotator class within the parser, and you can build your own tree transformations using the TreeTransformer interface. A TregexPattern can also be created from a tregex string together with a headFinder and a basicCat function, so that tregex searches are head-aware. To adapt the Stanford parser to another language, a head-finder table can be supplied; one is provided for Bulgarian, for example.

For a given constituent, the head-rule lookup works as follows (for a 'left' or 'right' rule): for each categoryList in the rule's categoryLists, scan the daughters from index 1 to n (or from n to 1 for right-to-left rules); for each category in the categoryList, if the category equals the label of daughter[index], choose that daughter as the head.
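The scan above can be sketched in a few lines of Python. This is a minimal illustration, not the Stanford parser's actual implementation: the `Tree` class, the `HEAD_RULES` table, and the category names are assumptions chosen for the example, loosely modeled on Collins-style head rules.

```python
class Tree:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

# Each rule: a direction ('left' or 'right') and an ordered list of
# category lists, tried in priority order. Illustrative entries only.
HEAD_RULES = {
    "VP": ("left", [["VBD", "VBN", "VBZ", "VB", "VBP"], ["VP"], ["NP"]]),
    "NP": ("right", [["NN", "NNS", "NNP"], ["NP"], ["JJ"]]),
    "S":  ("left", [["VP"], ["S"], ["NP"]]),
}

def find_head(node):
    """Return the head daughter of `node` using the scan described above."""
    direction, category_lists = HEAD_RULES[node.label]
    # Scan daughters left-to-right, or right-to-left for 'right' rules.
    daughters = node.children if direction == "left" else node.children[::-1]
    for category_list in category_lists:         # outer loop: priority order
        for daughter in daughters:               # inner loop: scan daughters
            if daughter.label in category_list:  # first match wins
                return daughter
    # Fallback: first daughter in scan order if no rule matched.
    return daughters[0]

vp = Tree("VP", [Tree("VBD"), Tree("NP"), Tree("PP")])
print(find_head(vp).label)  # → VBD
```

Note the loop nesting: the priority list is the outer loop, so a high-priority category anywhere among the daughters beats a lower-priority one nearer the scan start.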
The parser also provides a base class for a HeadFinder similar to the one described in Michael Collins' 1999 thesis.

Head finder algorithms are used by supervised parsers during their training phase to transform phrase-structure trees into dependency trees. For the same phrase-structure tree, different head finders produce different dependency trees. Head finders have usually been linguistically inspired, and parsers have used them as given. In this paper, we present an optimization set-up that tries to produce a head-finder algorithm that is optimal for parsing, together with a series of experiments with random head finders. We conclude that, although we obtain some statistically significant improvements using the optimal head finder, the experiments with random head finders show that random changes in head-finder algorithms do not dramatically impact the performance of parsers.
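The claim that different head finders yield different dependency trees for the same phrase-structure tree can be illustrated with a small sketch. The `Tree` class and the two rule tables below are assumptions made for the example, not the paper's actual set-up; the point is only that swapping the NP rule flips the arcs.

```python
class Tree:
    def __init__(self, label, children=(), word=None):
        self.label, self.children, self.word = label, list(children), word

def head_word(node, rules):
    """Recursively find the lexical head of `node` under a head-rule table."""
    if node.word is not None:           # preterminal: its word is the head
        return node.word
    direction, cats = rules[node.label]
    kids = node.children if direction == "left" else node.children[::-1]
    for cat_list in cats:
        for kid in kids:
            if kid.label in cat_list:
                return head_word(kid, rules)
    return head_word(kids[0], rules)    # fallback: first in scan order

def dependencies(node, rules, deps=None):
    """Collect (head, dependent) arcs: each non-head daughter's lexical
    head is attached to the head daughter's lexical head."""
    if deps is None:
        deps = []
    if node.word is not None:
        return deps
    h = head_word(node, rules)
    for kid in node.children:
        k = head_word(kid, rules)
        if k != h:
            deps.append((h, k))
        dependencies(kid, rules, deps)
    return deps

# "saw the dog": VP -> VBD NP, NP -> DT NN
tree = Tree("VP", [Tree("VBD", word="saw"),
                   Tree("NP", [Tree("DT", word="the"),
                               Tree("NN", word="dog")])])

# Two head tables that disagree only about NP: noun- vs. determiner-headed.
NOUN_HEADED = {"VP": ("left", [["VBD"]]), "NP": ("right", [["NN"]])}
DET_HEADED  = {"VP": ("left", [["VBD"]]), "NP": ("left", [["DT"]])}

print(dependencies(tree, NOUN_HEADED))  # [('saw', 'dog'), ('dog', 'the')]
print(dependencies(tree, DET_HEADED))   # [('saw', 'the'), ('the', 'dog')]
```

One phrase-structure tree, two head tables, two different dependency trees: this is exactly the degree of freedom the optimization and random-head-finder experiments explore.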