Title:
A multi-layer soft lattice based model for Chinese clinical named entity recognition.
Authors:
Guo S; State Key Laboratory of Intelligent Control and Decision of Complex Systems, School of Automation, Beijing Institute of Technology, Beijing, China.
Yang W; State Key Laboratory of Intelligent Control and Decision of Complex Systems, School of Automation, Beijing Institute of Technology, Beijing, China.
Han L; Department of Cardiology, The Second Medical Center, National Clinical Research Center for Geriatric Diseases, Chinese PLA General Hospital, Beijing, China.
Song X; State Key Laboratory of Intelligent Control and Decision of Complex Systems, School of Automation, Beijing Institute of Technology, Beijing, China.
Wang G; State Key Laboratory of Intelligent Control and Decision of Complex Systems, School of Automation, Beijing Institute of Technology, Beijing, China.
Source:
BMC medical informatics and decision making [BMC Med Inform Decis Mak] 2022 Jul 30; Vol. 22 (1), pp. 201. Date of Electronic Publication: 2022 Jul 30.
Publication Type:
Journal Article; Research Support, Non-U.S. Gov't
Language:
English
Imprint Name(s):
Original Publication: London : BioMed Central, [2001-
MeSH Terms:
Electronic Health Records*
Natural Language Processing*
China ; Humans
Contributed Indexing:
Keywords: Clinical named entity recognition; Clinical text mining; Fine-tuning BERT; Medical information processing; Transformer; Word-character lattice
Entry Date(s):
Date Created: 20220730 Date Completed: 20220802 Latest Revision: 20220805
Update Code:
20240104
PubMed Central ID:
PMC9338545
DOI:
10.1186/s12911-022-01924-4
PMID:
35908055
Academic Journal
Objective: Named entity recognition (NER) is a key and fundamental part of many medical and clinical tasks, including the construction of medical knowledge graphs, decision-making support, and question answering systems. When extracting entities from electronic health records (EHRs), NER models mostly apply long short-term memory (LSTM) networks and achieve impressive performance in clinical NER. However, these LSTM-based models often need to increase network depth to capture long-distance dependencies, so the models that achieve high accuracy generally require long training times and extensive training data, which has hindered the adoption of LSTM-based models in clinical scenarios with limited training time.
Method: Inspired by the Transformer, we combine the Transformer with a Soft Term Position Lattice to form a soft lattice structure Transformer, which models long-distance dependencies in a manner similar to LSTM. Our model consists of four components: the WordPiece module, the BERT module, the soft lattice structure Transformer module, and the CRF module.
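For illustration, the following is a minimal sketch of such a pipeline (WordPiece/BERT encoding, a Transformer encoder over the BERT outputs, and CRF decoding), assuming the Hugging Face transformers library and the pytorch-crf package. The class name LatticeStyleNER, the tag-set size, the encoder depth, and the attention-head count are placeholders, and the paper's soft term position lattice fusion itself is not reproduced here.

# Sketch of a BERT -> Transformer encoder -> CRF tagging pipeline.
# NOTE: the soft term position lattice fusion described in the paper is
# NOT implemented; a plain Transformer encoder stands in for that module.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast
from torchcrf import CRF  # pip install pytorch-crf

class LatticeStyleNER(nn.Module):  # hypothetical class name
    def __init__(self, num_tags, bert_name="bert-base-chinese"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # Stand-in for the soft lattice structure Transformer module.
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.emissions = nn.Linear(hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        x = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        x = self.encoder(x, src_key_padding_mask=(attention_mask == 0))
        emissions = self.emissions(x)
        mask = attention_mask.bool()
        if tags is not None:
            # Negative log-likelihood of the gold tag sequence (training loss).
            return -self.crf(emissions, tags, mask=mask)
        # Viterbi decoding at inference time: one tag-id list per sentence.
        return self.crf.decode(emissions, mask=mask)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = LatticeStyleNER(num_tags=5)  # e.g. BIO tags for two entity types plus O
batch = tokenizer(["患者因胸痛入院"], return_tensors="pt")
print(model(batch["input_ids"], batch["attention_mask"]))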
Result: Our experiments demonstrated that this approach increased F1 by 1-5% on the CCKS NER task compared with other LSTM-CRF-based models while consuming less training time. Additional evaluations showed that the lattice structure Transformer performs well at recognizing long medical terms, abbreviations, and numbers, with the proposed model achieving an F-measure of 91.6% on long medical terms and 90.36% on abbreviations and numbers.
Conclusions: By using the soft lattice structure Transformer, the proposed method captures Chinese word lattice information, making the model suitable for Chinese clinical medical records. Transformers with multi-layer soft lattice Chinese word construction can capture potential interactions between Chinese characters and words.
(© 2022. The Author(s).)