|Table of Contents|

[1] CHEN Zhiqun*, JU Ting, WANG Bing. A step-by-step syntactic analysis method based on tree-like probability and bidirectional long short term memory[J]. Journal of Xiamen University (Natural Science), 2019, 58(02): 243-248. [doi:10.6043/j.issn.0438-0479.201803047]


Journal of Xiamen University (Natural Science) [ISSN:0438-0479/CN:35-1070/N]

Volume:
Vol. 58
Issue:
No. 02, 2019
Pages:
243-248
Section:
Computational methods for natural language processing
Publication date:
2019-03-27

Article Info

Title:
A step-by-step syntactic analysis method based on tree-like probability and bidirectional long short term memory
Article ID:
0438-0479(2019)02-0243-06
Author(s):
CHEN Zhiqun (谌志群)*, JU Ting (鞠婷), WANG Bing (王冰)
Institute of Cognitive and Intelligent Computing, Hangzhou Dianzi University, Hangzhou 310018, China
Keywords:
tree-like probability calculation method; bidirectional long short-term memory; step-by-step; dependency parsing; syntactic label classification
CLC number:
TP 391
DOI:
10.6043/j.issn.0438-0479.201803047
Document code:
A
Abstract:
To effectively address the problem of data sparseness while accounting for the inherent hierarchy of syntactic prediction, we propose a step-by-step syntactic analysis model based on a bidirectional long short-term memory (BLSTM) neural network. The model applies a tree-like probability calculation method to syntactic label classification: exploiting the hierarchical relationship between syntactic structure and labels, it analyzes a sentence step by step, moving from syntactic structure to syntactic labels. The parse tree is then used to generate feature representations of the syntactic labels, which are fed into the BLSTM network for label classification. Experiments on the Semantic Dependency Corpus of Tsinghua University show that, compared with the chained probability calculation method and with other dependency parsers, dependency accuracy improves by up to 1 percentage point, indicating that the proposed method is feasible and effective.
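The abstract contrasts a tree-like probability factorization with the usual chained (left-to-right) one. A minimal sketch of that contrast, in which each label conditions on its head's label in the dependency tree rather than on the preceding token's label; the labels, tree, and all probability values below are made-up toy numbers, not the paper's model or data:

```python
# Illustrative sketch only: chain-style vs. tree-structured probability
# factorization for assigning syntactic labels. Toy values throughout.
from math import prod

# Toy 4-token sentence; heads[i] is the index of token i's head in the
# dependency tree (-1 marks the root).
heads = [-1, 0, 0, 2]
labels = ["ROOT", "SBJ", "OBJ", "ATT"]

# Hypothetical P(label | previous token's label), chain factorization.
p_chain = {
    (None, "ROOT"): 0.9,
    ("ROOT", "SBJ"): 0.5,
    ("SBJ", "OBJ"): 0.4,
    ("OBJ", "ATT"): 0.3,
}

# Hypothetical P(label | head's label), tree factorization: each node
# conditions on its parent in the tree, not its linear predecessor.
p_tree = {
    (None, "ROOT"): 0.9,
    ("ROOT", "SBJ"): 0.5,
    ("ROOT", "OBJ"): 0.6,
    ("OBJ", "ATT"): 0.7,
}

def chain_probability(labels):
    """Joint probability under the left-to-right chain factorization."""
    prev = [None] + labels[:-1]
    return prod(p_chain[(p, l)] for p, l in zip(prev, labels))

def tree_probability(labels, heads):
    """Joint probability under the tree factorization over head links."""
    parents = [None if h < 0 else labels[h] for h in heads]
    return prod(p_tree[(p, l)] for p, l in zip(parents, labels))

print(chain_probability(labels))
print(tree_probability(labels, heads))
```

The point of the tree view is that "OBJ" and "ATT" condition on their actual governors ("ROOT" and "OBJ"), so sparse linear contexts no longer dilute the estimate; the paper's model additionally learns these distributions with a BLSTM rather than from a lookup table.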

References:

[1] LIU H T. Dependency grammar and machine translation[J]. Applied Linguistics, 1997, 23(3): 89-93. (in Chinese)
[2] HAYS D G. Dependency theory: a formalism and some observations[J]. Language, 1964, 40(4): 511-525.
[3] GAIFMAN H. Dependency systems and phrase-structure systems[J]. Information and Control, 1965, 8(3): 304-337.
[4] YAMADA H. Statistical dependency analysis with support vector machines[C]∥Proceedings of the 8th International Workshop on Parsing Technologies. Nancy: International Workshop on Parsing Technologies, 2003: 195-206.
[5] LAI T B Y, HUANG C, ZHOU M, et al. Span-based statistical dependency parsing of Chinese[C]∥Proceedings of the 6th Natural Language Processing Pacific Rim Symposium (NLPRS 2001). Tokyo: National Center of Sciences, 2001: 677-684.
[6] COLLOBERT R. Deep learning for efficient discriminative parsing[C]∥International Conference on Artificial Intelligence and Statistics. Fort Lauderdale: AISTATS, 2011: 224-232.
[7] CHEN D, MANNING C D. A fast and accurate dependency parser using neural networks[C]∥Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha: Association for Computational Linguistics, 2014: 740-750.
[8] DURRETT G, KLEIN D. Neural CRF parsing[EB/OL]. [2018-03-01]. http://arxiv.org/abs/1507.03641.
[9] MA X, HOVY E. Neural probabilistic model for non-projective MST parsing[C]∥Proceedings of the 8th International Joint Conference on Natural Language Processing. Taipei: IJCNLP, 2017: 59-69.
[10] WANG H J, SI N W, SONG Y L, SHAN Y D. A neural network dependency parsing model incorporating global vector features[J]. Journal on Communications, 2018, 39(2): 53-64. (in Chinese)
[11] WANG W, CHANG B. Improved graph-based dependency parsing via hierarchical LSTM networks[C]∥China National Conference on Chinese Computational Linguistics. Yantai: Springer International Publishing, 2016: 25-32.
[12] ZHANG X, CHENG J, LAPATA M. Dependency parsing as head selection[EB/OL]. [2018-03-01]. http://arxiv.org/pdf/1606.01280.pdf.
[13] ZHANG D, ZHOU Q L, ZHANG G P. Dependency parsing incorporating hierarchical constituent analysis[J]. Journal of Shenyang Aerospace University, 2017, 34(1): 76-82. (in Chinese)

Memo:
Received: 2018-03-22; Accepted: 2018-08-06
*Corresponding author: chenzq@hdu.edu.cn