DeSR Dependency Parser
Among its notable features:
Dependency structures are built by scanning the input from left to right, deciding at each step whether to perform a shift or to create a dependency between two adjacent tokens.
DeSR, however, uses a different set of rules, including additional rules for handling non-projective dependencies, which allow parsing to be performed deterministically in a single pass. The algorithm also produces fully labeled dependency trees.
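As a rough illustration of the shift/reduce scheme, here is a hypothetical Python sketch (not DeSR's actual implementation); the `decide` callback is a stand-in for the trained classifier, and the scripted oracle below is invented for the example:

```python
def parse(tokens, decide):
    """Left-to-right shift/reduce parsing sketch (hypothetical, not DeSR code).

    tokens: list of word forms; decide: callback choosing the next action,
    standing in for a trained classifier. Returns labeled arcs as
    (head_index, dependent_index, label) tuples.
    """
    stack, arcs = [], []
    buffer = list(range(len(tokens)))
    while buffer or len(stack) > 1:
        action = decide(stack, buffer)
        if action == "shift":
            stack.append(buffer.pop(0))        # consume the next input token
        elif action[0] == "left":              # second-top depends on top
            arcs.append((stack[-1], stack[-2], action[1]))
            del stack[-2]
        else:                                  # "right": top depends on second-top
            arcs.append((stack[-2], stack[-1], action[1]))
            stack.pop()
    return arcs

# Scripted oracle for "John saw Mary": "saw" heads both "John" (SBJ) and "Mary" (OBJ).
script = iter(["shift", "shift", ("left", "SBJ"), "shift", ("right", "OBJ")])
arcs = parse(["John", "saw", "Mary"], lambda s, b: next(script))
```

In a real transition-based parser the oracle is replaced by a classifier that inspects features of the stack and remaining input at each step.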
A classifier is used to learn and predict the proper parsing action. The parser is configurable: one can select among several learning algorithms (Averaged Perceptron, Maximum Entropy, memory-based learning using TiMBL, support vector machines using libSVM), supply user-defined feature models, and choose input/output formats (including the CoNLL shared task format).
To train the parser, type:

desr -t -m modelFile trainFile

This produces a model from a training corpus in CoNLL format.
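The CoNLL shared task format referred to here is the tab-separated, ten-column CoNLL-X layout, one token per line. A short sketch of one token line (the word and labels are made-up examples):

```python
# The ten columns of the CoNLL-X shared task format.
COLUMNS = ["ID", "FORM", "LEMMA", "CPOSTAG", "POSTAG",
           "FEATS", "HEAD", "DEPREL", "PHEAD", "PDEPREL"]

# Example token line: word "John", token 1, headed by token 2 with relation SBJ.
line = "1\tJohn\tjohn\tNNP\tNNP\t_\t2\tSBJ\t_\t_"
token = dict(zip(COLUMNS, line.split("\t")))
```

Unused fields are conventionally filled with an underscore, and sentences are separated by blank lines.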
Be careful when using the SecondOrder option, since it may considerably increase the model size.
To parse a file, type:

desr -m modelFile parseFile > parsedFile
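The parsed output is again one token per line with blank lines between sentences. A hedged sketch of reading the head/label pairs back out, assuming the ten-column CoNLL-X layout (FORM, HEAD, and DEPREL in columns 2, 7, and 8):

```python
def read_arcs(conll_text):
    """Yield one list of (form, head_id, deprel) tuples per sentence.

    Assumes CoNLL-X column order; a sketch, not part of DeSR itself.
    """
    sentence = []
    for raw in conll_text.splitlines():
        if raw.strip():
            cols = raw.split("\t")
            sentence.append((cols[1], int(cols[6]), cols[7]))
        elif sentence:                  # blank line ends a sentence
            yield sentence
            sentence = []
    if sentence:                        # flush a final unterminated sentence
        yield sentence

sample = ("1\tJohn\t_\t_\t_\t_\t2\tSBJ\t_\t_\n"
          "2\tsaw\t_\t_\t_\t_\t0\tROOT\t_\t_\n")
sentences = list(read_arcs(sample))
```

A HEAD value of 0 conventionally marks the root of the sentence.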
If you plan to use the downloaded model file, first gunzip it.
For a full list of options, type:
desr -h