Relation Extraction in Biomedical Texts Based on Multi-Head Attention Model With Syntactic Dependency Feature: Modeling Study

Li, Yongbin and Chua, Stephanie (2022) Relation Extraction in Biomedical Texts Based on Multi-Head Attention Model With Syntactic Dependency Feature: Modeling Study. JMIR Medical Informatics, 10 (10). ISSN 2291-9694

Official URL: https://medinform.jmir.org/2022/10/e41136

Abstract

Background: With the rapid expansion of biomedical literature, biomedical information extraction has attracted increasing attention from researchers. In particular, relation extraction between 2 entities is a long-standing research topic.

Objective: This study aimed to perform 2 multiclass relation extraction tasks from the Biomedical Natural Language Processing Workshop 2019 Open Shared Tasks: the Bacteria-Biotope relation extraction (BB-rel) task and the binary relation extraction of plant seed development (SeeDev-binary) task. In essence, both tasks extract the relation between annotated entity pairs from biomedical texts, which is a challenging problem.

Methods: Traditional approaches adopted feature- or kernel-based methods and achieved good performance. For these tasks, we propose a deep learning model based on a combination of several distributed features, such as domain-specific word embedding, part-of-speech embedding, entity-type embedding, distance embedding, and position embedding. A multi-head attention mechanism is used to extract the global semantic features of an entire sentence. In addition, we introduce a dependency-type feature and the shortest dependency path connecting the 2 candidate entities in the syntactic dependency graph to enrich the feature representation.

Results: Experiments show that our proposed model performs well in biomedical relation extraction, achieving F1 scores of 65.56% and 38.04% on the test sets of the BB-rel and SeeDev-binary tasks, respectively. In the SeeDev-binary task in particular, the F1 score of our model surpasses that of existing models, achieving state-of-the-art performance.

Conclusions: We demonstrated that the multi-head attention mechanism can learn relevant syntactic and semantic features in different representation subspaces and at different positions to extract a comprehensive feature representation. Moreover, syntactic dependency features can improve model performance by learning the dependency relations between entities in biomedical texts.
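The feature-combination and multi-head attention idea described in the abstract can be illustrated with a rough PyTorch sketch. All vocabulary sizes, embedding dimensions, the pooling step, and the class count below are illustrative assumptions, not the configuration reported in the paper, which additionally uses position embeddings and shortest-dependency-path features.

```python
# Minimal sketch: concatenated distributed features fed through multi-head
# self-attention, then pooled for relation classification. Dimensions and
# layer names are hypothetical, not the authors' actual settings.
import torch
import torch.nn as nn

class RelationExtractorSketch(nn.Module):
    def __init__(self, vocab=30000, pos_tags=50, ent_types=10, max_dist=200,
                 d_word=200, d_feat=20, n_heads=8, n_classes=22):
        super().__init__()
        # Distributed features: word, part-of-speech, entity-type, and
        # distance-to-entity embeddings (one distance per candidate entity).
        self.word_emb = nn.Embedding(vocab, d_word)
        self.pos_emb = nn.Embedding(pos_tags, d_feat)
        self.ent_emb = nn.Embedding(ent_types, d_feat)
        self.dist_emb = nn.Embedding(2 * max_dist + 1, d_feat)
        d_model = d_word + 4 * d_feat  # concatenated feature dimension (280 here)
        # Multi-head self-attention over the whole sentence captures
        # global semantic features, as described in the abstract.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, words, pos, ents, dist1, dist2):
        # Concatenate all distributed features token by token: (batch, seq, d_model)
        x = torch.cat([self.word_emb(words), self.pos_emb(pos), self.ent_emb(ents),
                       self.dist_emb(dist1), self.dist_emb(dist2)], dim=-1)
        h, _ = self.attn(x, x, x)      # self-attention across the sentence
        sent = h.mean(dim=1)           # simple mean pooling (an assumption)
        return self.classifier(sent)   # relation-type logits
```

A usage call would pass integer index tensors of shape (batch, seq_len) for each feature stream; the syntactic dependency-type and shortest-dependency-path features from the paper could be encoded analogously and concatenated before the attention layer.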

Item Type: Article
Uncontrolled Keywords: biomedical relation extraction; deep learning; feature combination; multi-head attention; additive attention; syntactic dependency feature; syntactic dependency graph; shortest dependency path.
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Academic Faculties, Institutes and Centres > Faculty of Computer Science and Information Technology
Depositing User: Hui Li
Date Deposited: 27 Oct 2022 06:23
Last Modified: 27 Oct 2022 06:23
URI: http://ir.unimas.my/id/eprint/40275
