Published in

Association for Computing Machinery (ACM), ACM Transactions on Asian and Low-Resource Language Information Processing, 17(2), pp. 1-21, 2018

DOI: 10.1145/3152537

Improved Discourse Parsing with Two-Step Neural Transition-Based Model

Journal article published in 2018 by Yanyan Jia, Yansong Feng, Yuan Ye, Chao Lv, Chongde Shi, Dongyan Zhao
This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Discourse parsing aims to identify the structures and relationships between different discourse units. Most existing approaches analyze a whole discourse at once, which often fails to distinguish long-span relations and to properly represent discourse units. In this article, we propose a novel parsing model that analyzes discourse in a two-step fashion, using different feature representations to characterize intra-sentence and inter-sentence discourse structures, respectively. Our model works in a transition-based framework and benefits from a stack long short-term memory (stack LSTM) neural network. Experiments on benchmark treebanks show that our method outperforms traditional one-step parsing methods in both English and Chinese.
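To make the two-step, transition-based idea concrete, the following is a minimal Python sketch of the control flow the abstract describes: sentence-level subtrees are built from elementary discourse units first, then combined into a document-level tree via shift-reduce transitions. All names (Node, choose_action, two_step_parse) are illustrative assumptions, not the authors' code, and the stack-LSTM scorer is replaced by a trivial placeholder policy.

```python
# Hypothetical sketch of a two-step transition-based discourse parser.
# The learned stack-LSTM action scorer is stubbed out with a fixed rule
# so the example stays self-contained and runnable.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Node:
    """A discourse unit: an elementary unit (leaf) or a merged subtree."""
    text: str
    children: List["Node"] = field(default_factory=list)
    relation: Optional[str] = None


def choose_action(stack: List[Node], buffer: List[Node]) -> str:
    """Placeholder for the neural scorer: decide SHIFT or REDUCE.
    Here we SHIFT while the buffer is non-empty and REDUCE afterwards,
    standing in for the learned transition policy."""
    if buffer:
        return "SHIFT"
    return "REDUCE"


def shift_reduce_parse(units: List[Node]) -> Node:
    """Generic shift-reduce loop over a sequence of discourse units."""
    stack: List[Node] = []
    buffer: List[Node] = list(units)
    while buffer or len(stack) > 1:
        if choose_action(stack, buffer) == "SHIFT":
            stack.append(buffer.pop(0))
        else:  # REDUCE: merge the top two stack items into one subtree
            right, left = stack.pop(), stack.pop()
            stack.append(Node(text=left.text + " " + right.text,
                              children=[left, right],
                              relation="Elaboration"))  # placeholder label
    return stack[0]


def two_step_parse(document: List[List[str]]) -> Node:
    """Step 1: build a subtree for each sentence from its EDUs.
    Step 2: combine the sentence-level subtrees into a document tree."""
    sentence_trees = [shift_reduce_parse([Node(edu) for edu in sent])
                      for sent in document]
    return shift_reduce_parse(sentence_trees)


if __name__ == "__main__":
    doc = [["The company reported losses,", "which surprised analysts."],
           ["Its stock fell sharply."]]
    print(two_step_parse(doc).text)
```

In the paper's setting, intra-sentence and inter-sentence steps would use different feature representations and a trained stack-LSTM model to score transitions; this sketch only illustrates the two-stage parsing structure.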