Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training
Xinyu Wang, Kewei Tu

Abstract
In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers while being significantly faster in both training and testing. We also empirically show the advantage of second-order parsing over first-order parsing, and observe that the usefulness of the head-selection structured constraint vanishes when BERT embeddings are used.
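The abstract does not spell out the message-passing procedure, but the MFVI entries in the benchmark table below indicate mean-field variational inference. As a rough, hedged illustration of that idea, the sketch below unrolls a few MFVI iterations over first-order arc scores and second-order sibling scores; the function name, tensor shapes, sibling-only factors, and fixed iteration count are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def mfvi_second_order(s_arc, s_sib, num_iters=3):
    """Unrolled mean-field variational inference for second-order parsing (sketch).

    s_arc: [batch, n, n] first-order scores, s_arc[b, h, m] scores the arc h -> m.
    s_sib: [batch, n, n, n] second-order scores, s_sib[b, h, m, s] scores the
           co-occurrence of sibling arcs (h -> m) and (h -> s).
    Returns log-posteriors over candidate heads h for each modifier m.
    """
    # Initialise arc posteriors from first-order scores alone:
    # q[b, h, m] = Q(head of m is h), normalised over candidate heads h.
    q = F.softmax(s_arc, dim=1)
    for _ in range(num_iters):
        # Message to arc (h, m): expected sibling score under the current Q,
        #   msg[b, h, m] = sum_s q[b, h, s] * s_sib[b, h, m, s].
        # A full implementation would also mask the s == m diagonal,
        # padded tokens, and invalid root attachments.
        msg = torch.einsum('bhs,bhms->bhm', q, s_sib)
        q = F.softmax(s_arc + msg, dim=1)
    return torch.log(q + 1e-12)

# Toy usage with random scores for a 5-token sentence (batch of 1).
s_arc = torch.randn(1, 5, 5)
s_sib = torch.randn(1, 5, 5, 5)
log_q = mfvi_second_order(s_arc, s_sib)
heads = log_q.argmax(dim=1)  # predicted head for each modifier position
```

Because each iteration is just a softmax and a tensor contraction, the whole procedure is differentiable and can be unrolled inside the network, which is what makes end-to-end training with ordinary backpropagation straightforward.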
Benchmarks
| Benchmark | Method | UAS | LAS |
|---|---|---|---|
| Dependency Parsing on Chinese Treebank | MFVI | 92.78 | 91.69 |
| Dependency Parsing on Penn Treebank | MFVI | 96.91 | 95.34 |