Syntax-aware entity representations for neural relation extraction
DOI: 10.1016/j.artint.2019.07.004
zbMATH Open: 1478.68288
OpenAlex: W2963866108
Wikidata: Q127489820 (Scholia: Q127489820)
MaRDI QID: Q2321344 (FDO: Q2321344)
Authors: Zhengqiu He, Zhenghua Li, Wei Zhang, Hao Shao, Min Zhang, Wen-Liang Chen
Publication date: 28 August 2019
Published in: Artificial Intelligence
Full work available at URL: https://doi.org/10.1016/j.artint.2019.07.004
Recommendations
- Relation extraction via distant supervision technology
- Bias modeling for distantly supervised relation extraction
- Survey of entity relationship extraction based on deep learning
- Extending training set in distant supervision by ontology for relation extraction
- Feature assembly method for extracting relations in Chinese
MSC classifications: Learning and adaptive systems in artificial intelligence (68T05); Natural language processing (68T50)
Cited In (9)
- Exploring pattern structures of syntactic trees for relation extraction
- Survey of entity relationship extraction based on deep learning
- Providing definitive learning direction for relation classification system
- Bias modeling for distantly supervised relation extraction
- Extending training set in distant supervision by ontology for relation extraction
- Relation extraction via distant supervision technology
- Corpus-level fine-grained entity typing
- A hybrid tree structured neural network for implicit discourse relation recognition
- Encoding implicit relation requirements for relation extraction: a joint inference approach