Pre-trained language model augmented adversarial training network for Chinese clinical event detection
DOI: 10.3934/MBE.2020157 · zbMATH Open: 1467.92103 · OpenAlex: W3014020538 · Wikidata: Q100312391 · Scholia: Q100312391 · MaRDI QID: Q2038656 · FDO: Q2038656
Authors: Zhichang Zhang, Minyu Zhang, Tong Zhou, Yanlong Qiu
Publication date: 7 July 2021
Published in: Mathematical Biosciences and Engineering
Full work available at URL: https://doi.org/10.3934/mbe.2020157
Recommendations
- Entity recognition of Chinese medical text based on multi-head self-attention combined with BILSTM-CRF
- A self-attention based neural architecture for Chinese medical named entity recognition
- Joint Chinese event extraction based multi-task learning
- Cross-domain Chinese hedge cue detection based on shared representations
- Fast and effective biomedical named entity recognition using temporal convolutional network with conditional random field
Keywords: transfer learning; class imbalance problem; semantic understanding; adversarial training network; Chinese clinical event detection; Chinese clinical narratives; medical artificial intelligence; pre-trained language model
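The keywords above refer to an adversarial training network built on a pre-trained language model for Chinese clinical event detection. For orientation only, the following is a minimal PyTorch sketch of one common way adversarial training is applied in NLP, namely FGM-style perturbation of the embedding layer during token-level tagging. This is not the authors' architecture; all names (TokenTagger, adversarial_step), sizes, and the epsilon value are assumptions chosen for illustration.

```python
# Minimal sketch: adversarial (FGM-style) training of a token tagger.
# Hypothetical example; not the model described in the cited publication.
import torch
import torch.nn as nn

class TokenTagger(nn.Module):
    def __init__(self, vocab_size=21128, hidden=256, num_labels=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)        # stands in for a pre-trained LM's embedding layer
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, num_labels)  # per-token event/trigger labels

    def forward(self, input_ids):
        h, _ = self.encoder(self.embed(input_ids))
        return self.classifier(h)

def adversarial_step(model, input_ids, labels, optimizer, epsilon=1e-2):
    """One training step: clean loss plus loss on embeddings perturbed along the gradient sign."""
    loss_fn = nn.CrossEntropyLoss()

    logits = model(input_ids)
    loss = loss_fn(logits.view(-1, logits.size(-1)), labels.view(-1))
    loss.backward()                                           # gradients w.r.t. embedding weights

    delta = epsilon * model.embed.weight.grad.sign()          # FGM-style perturbation
    model.embed.weight.data.add_(delta)                       # apply perturbation to embeddings
    adv_logits = model(input_ids)
    adv_loss = loss_fn(adv_logits.view(-1, adv_logits.size(-1)), labels.view(-1))
    adv_loss.backward()                                       # accumulate adversarial gradients
    model.embed.weight.data.sub_(delta)                       # restore original embeddings

    optimizer.step()
    optimizer.zero_grad()
    return loss.item(), adv_loss.item()

# Usage with random toy data (shapes are arbitrary):
model = TokenTagger()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
input_ids = torch.randint(0, 21128, (2, 16))
labels = torch.randint(0, 9, (2, 16))
clean_loss, adv_loss = adversarial_step(model, input_ids, labels, optimizer)
```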
Cited In (4)
- Cross-domain Chinese hedge cue detection based on shared representations
- Entity recognition of Chinese medical text based on multi-head self-attention combined with BILSTM-CRF
- A self-attention based neural architecture for Chinese medical named entity recognition
- Biomedical and health information processing and analysis