
Bidirectional Recurrent Convolutional Neural Network For Relation Classification

Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Vol. 1 (2016): 756–765


Abstract

Relation classification is an important semantic processing task in the field of natural language processing (NLP). In this paper, we present a novel model, BRCNN, to classify the relation of two entities in a sentence. Some state-of-the-art systems concentrate on modeling the shortest dependency path (SDP) between two entities leveraging convolutional or recurrent neural networks.

Introduction
  • Relation classification aims to classify the semantic relations between two entities in a sentence.
  • In the sentence “The [burst]e1 has been caused by water hammer [pressure]e2”, the entities burst and pressure are in the relation Cause-Effect(e2, e1).
  • Deep learning techniques have brought significant improvements to relation classification. Recently, more attention has been paid to modeling the shortest dependency path (SDP) of sentences.
  • Liu et al (2015) developed a dependency-based neural network, in which a convolutional neural network has been used to capture features on the shortest path and a recursive neural network is designed to model subtrees.
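The SDP idea above can be made concrete with a toy example. The sketch below finds the shortest dependency path between the two entities of the example sentence via breadth-first search; the dependency edges (and the relation labels in the comments) are hand-written stand-ins for a real parser's output, not an actual parse.

```python
from collections import deque

# Hand-built, undirected dependency graph for the example sentence
# "The burst has been caused by water hammer pressure".
# These edges are illustrative assumptions, not real parser output.
DEP_EDGES = [
    ("caused", "burst"),     # nsubjpass
    ("caused", "been"),      # auxpass
    ("caused", "has"),       # aux
    ("caused", "pressure"),  # agent (via "by")
    ("pressure", "hammer"),  # compound
    ("hammer", "water"),     # compound
    ("burst", "The"),        # det
]

def shortest_dependency_path(edges, source, target):
    """Return the shortest dependency path between two words via BFS."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, []).append(a)
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for neighbor in graph.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # entities not connected in the parse

print(shortest_dependency_path(DEP_EDGES, "burst", "pressure"))
# ['burst', 'caused', 'pressure']
```

The SDP condenses the sentence to the words most relevant to the relation (here, the predicate "caused" between the two entities), which is why SDP-based models discard the rest of the sentence.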
Highlights
  • Relation classification aims to classify the semantic relations between two entities in a sentence
  • Our first contribution is that we propose a recurrent convolutional neural network (RCNN) that encodes the global pattern of the shortest dependency path (SDP) with a two-channel long short-term memory (LSTM) based recurrent neural network, and captures local features of every two neighboring words linked by a dependency relation with a convolution layer
  • Our second contribution is that we propose bidirectional recurrent convolutional neural networks (BRCNN), which learn representations along the SDP forwards and backwards at the same time, strengthening the ability to classify the direction of relations between entities
  • We make use of three types of information to improve the performance of BRCNN: POS tags, NER features and WordNet hypernyms
  • We proposed a novel bidirectional neural network, BRCNN, to improve the performance of relation classification
  • RCNN achieves better performance at learning features along the shortest dependency path than some common neural networks
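The convolution-over-dependency-units idea in the highlights can be sketched in plain Python: for each pair of adjacent SDP words linked by a dependency relation, their hidden vectors and the relation embedding are concatenated, passed through a tanh layer, and max-pooled into one fixed-size feature vector. All vectors, dimensions, and weights below are toy values for illustration, not the paper's learned parameters.

```python
import math

def conv_unit(h_a, r_ab, h_b, W, b):
    """One 'dependency unit': tanh(W . [h_a; r_ab; h_b] + b)."""
    x = h_a + r_ab + h_b  # list concatenation stands in for vector concat
    return [math.tanh(sum(w_i * x_i for w_i, x_i in zip(row, x)) + b_j)
            for row, b_j in zip(W, b)]

def max_pool(units):
    """Element-wise max over all dependency units along the SDP."""
    return [max(vals) for vals in zip(*units)]

# Toy SDP burst -nsubjpass-> caused <-agent- pressure, with 2-d hidden
# states, 1-d relation embeddings, and a (2 x 5) weight matrix; a real
# model learns these and uses far larger dimensions.
hidden = {"burst": [0.1, 0.3], "caused": [0.5, -0.2], "pressure": [0.4, 0.0]}
rel = {"nsubjpass": [0.2], "agent": [-0.1]}
W = [[0.3, -0.1, 0.5, 0.2, 0.4],
     [-0.2, 0.6, 0.1, -0.3, 0.0]]
b = [0.0, 0.1]

units = [
    conv_unit(hidden["burst"], rel["nsubjpass"], hidden["caused"], W, b),
    conv_unit(hidden["caused"], rel["agent"], hidden["pressure"], W, b),
]
pooled = max_pool(units)
print(pooled)  # one fixed-size feature vector, regardless of SDP length
```

The pooling step is what lets SDPs of different lengths feed the same softmax classifier: however many dependency units a path yields, the output vector has a fixed dimension.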
Methods
  • The authors evaluated the BRCNN model on the SemEval-2010 Task 8 dataset, which is an established benchmark for relation classification (Hendrickx et al., 2010).
  • The dataset contains 8000 sentences for training, and 2717 for testing.
  • The authors split 800 samples out of the training set for validation.
  • Among feature-based baselines, the SVM of Rink and Harabagiu (2010), which combines POS tags, WordNet, prefixes and other morphological features, dependency parses, Levin classes, PropBank, FrameNet, NomLex-Plus, Google n-grams, paraphrases, and TextRunner, achieves an F1-score of 82.2.
  • Neural baselines such as an RNN are also compared.
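The held-out split described above can be reproduced with a simple seeded sample. This is a sketch assuming the 8,000 training sentences sit in a list; the paper does not specify how its 800 validation samples were selected, so the seeded random choice here is an assumption.

```python
import random

def split_validation(train_set, n_valid=800, seed=0):
    """Hold out n_valid samples from the training set for validation."""
    rng = random.Random(seed)  # fixed seed for a reproducible split
    valid_idx = set(rng.sample(range(len(train_set)), n_valid))
    train = [s for i, s in enumerate(train_set) if i not in valid_idx]
    valid = [s for i, s in enumerate(train_set) if i in valid_idx]
    return train, valid

# SemEval-2010 Task 8: 8000 training sentences (2717 more are held for testing).
sentences = [f"sentence_{i}" for i in range(8000)]
train, valid = split_validation(sentences)
print(len(train), len(valid))  # 7200 800
```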
Related work
  • Relation classification is an important topic in NLP. Traditional methods for relation classification mainly fall into three classes: feature-based, kernel-based, and neural network-based.

    In feature-based approaches, different types of features are extracted and fed into a classifier. Generally, three types of features are used. Lexical features concentrate on the entities of interest, e.g., their POS tags. Syntactic features include chunking, parse trees, etc. Semantic features are exemplified by concept hierarchies and entity classes. Kambhatla (2004) used a maximum entropy model for feature combination. Rink and Harabagiu (2010) collected various features, including lexical, syntactic, as well as semantic features.
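The feature-based pipeline described above can be sketched as a function that turns an annotated sentence into a feature dictionary for a downstream classifier. The feature names and toy annotations here are illustrative assumptions, not the exact feature sets of Kambhatla (2004) or Rink and Harabagiu (2010).

```python
def extract_features(tokens, pos_tags, e1_idx, e2_idx, hypernyms):
    """Lexical, syntactic, and semantic features for one entity pair."""
    return {
        # Lexical: the entity words themselves and their POS tags.
        "e1_word": tokens[e1_idx],
        "e2_word": tokens[e2_idx],
        "e1_pos": pos_tags[e1_idx],
        "e2_pos": pos_tags[e2_idx],
        # Syntactic: surface distance as a cheap stand-in for parse features.
        "token_distance": abs(e2_idx - e1_idx),
        # Semantic: WordNet-style hypernyms as concept-hierarchy features.
        "e1_hypernym": hypernyms.get(tokens[e1_idx], "NONE"),
        "e2_hypernym": hypernyms.get(tokens[e2_idx], "NONE"),
    }

tokens = ["The", "burst", "has", "been", "caused", "by", "pressure"]
pos = ["DT", "NN", "VBZ", "VBN", "VBN", "IN", "NN"]
feats = extract_features(tokens, pos, 1, 6,
                         {"burst": "event", "pressure": "phenomenon"})
print(feats["e1_word"], feats["token_distance"])  # burst 5
```

Such dictionaries would then be one-hot encoded and passed to a classifier such as a maximum entropy model or an SVM; the hand-engineering of these feature sets is precisely what the neural approaches in this paper aim to replace.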
Funding
  • Our work is supported by the National Natural Science Foundation of China (No. 61370117 and No. 61433015) and the Major National Social Science Fund of China (No. 12&ZD227)
References
  • Razvan C. Bunescu and Raymond J. Mooney. 2005. A shortest path dependency kernel for relation extraction. In Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing, pages 724–731.
  • Cícero Nogueira dos Santos, Bing Xiang, and Bowen Zhou. 2015. Classifying relations by ranking with convolutional neural networks. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, pages 626–634.
  • Kazuma Hashimoto, Makoto Miwa, Yoshimasa Tsuruoka, and Takashi Chikayama. 2013. Simple customization of recursive neural networks for semantic relation classification. In Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, pages 1372–1376.
  • Iris Hendrickx, Su Nam Kim, Zornitsa Kozareva, Preslav Nakov, Diarmuid Ó Séaghdha, Sebastian Padó, Marco Pennacchiotti, Lorenza Romano, and Stan Szpakowicz. 2010. SemEval-2010 task 8: Multi-way classification of semantic relations between pairs of nominals. In Proceedings of the Workshop on Semantic Evaluations: Recent Achievements and Future Directions, pages 94–99. Association for Computational Linguistics.
  • Sepp Hochreiter and Jürgen Schmidhuber. 1997. Long short-term memory. Neural Computation, 9(8):1735–1780.
  • Nanda Kambhatla. 2004. Combining lexical, syntactic, and semantic features with maximum entropy models for extracting relations. In Proceedings of the ACL 2004 Interactive Poster and Demonstration Sessions, page 22. Association for Computational Linguistics.
  • Yang Liu, Furu Wei, Sujian Li, Heng Ji, Ming Zhou, and Houfeng Wang. 2015. A dependency-based neural network for relation classification. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, pages 285–290.
  • Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg S. Corrado, and Jeff Dean. 2013. Distributed representations of words and phrases and their compositionality. In Advances in Neural Information Processing Systems, pages 3111–3119.
  • Barbara Plank and Alessandro Moschitti. 2013. Embedding semantic similarity in tree kernels for domain adaptation of relation extraction. In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics, pages 1498–1507.
  • Bryan Rink and Sanda Harabagiu. 2010. UTD: Classifying semantic relations by combining lexical and semantic resources. In Proceedings of the 5th International Workshop on Semantic Evaluation, pages 256–259. Association for Computational Linguistics.
  • Richard Socher, Jeffrey Pennington, Eric H. Huang, Andrew Y. Ng, and Christopher D. Manning. 2011. Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151–161.
  • Richard Socher, Brody Huval, Christopher D. Manning, and Andrew Y. Ng. 2012. Semantic compositionality through recursive matrix-vector spaces. In Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, pages 1201–1211.
  • Mengqiu Wang. 2008. A re-examination of dependency path kernels for relation extraction. In Proceedings of the Third International Joint Conference on Natural Language Processing, pages 841–846.
  • Kun Xu, Yansong Feng, Songfang Huang, and Dongyan Zhao. 2015a. Semantic relation classification via convolutional neural networks with simple negative sampling. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pages 536–540.
  • Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng, and Zhi Jin. 2015b. Classifying relations via long short term memory networks along shortest dependency paths. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pages 1785–1794.
  • Mo Yu, Matthew Gormley, and Mark Dredze. 2014. Factor-based compositional embedding models. In NIPS Workshop on Learning Semantics, pages 95–101.
  • Wojciech Zaremba and Ilya Sutskever. 2014. Learning to execute. arXiv preprint arXiv:1410.4615.
  • Matthew D. Zeiler. 2012. ADADELTA: An adaptive learning rate method. arXiv preprint arXiv:1212.5701.
  • Dmitry Zelenko, Chinatsu Aone, and Anthony Richardella. 2003. Kernel methods for relation extraction. The Journal of Machine Learning Research, 3:1083–1106.
  • Daojian Zeng, Kang Liu, Siwei Lai, Guangyou Zhou, Jun Zhao, et al. 2014. Relation classification via convolutional deep neural network. In Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers, pages 2335–2344.