Source
LREC-COLING
DATE OF PUBLICATION
05/25/2024
Authors
Elena Tutubalina, Natalia Loukachevitch, Andrey Sakhovskiy

Biomedical Concept Normalization over Nested Entities with Partial UMLS Terminology in Russian

Abstract

We present a new manually annotated dataset of PubMed abstracts for concept normalization in Russian. It contains 23,641 entity mentions in 756 documents linked to 4,544 unique concepts from the UMLS ontology. Compared to existing corpora, we explore two novel annotation characteristics: the nestedness of named entities and the incompleteness of the Russian medical terminology in UMLS. 4,424 entity mentions are linked to 1,535 unique English concepts absent from the Russian part of the UMLS ontology. We present several baselines for normalization over nested named entities, obtained with state-of-the-art models such as SapBERT. Our experimental results show that models pre-trained on graph structural data from UMLS achieve superior performance in a zero-shot setting on bilingual terminology.
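To illustrate the kind of SapBERT-style baseline referred to above, here is a minimal sketch (not the authors' code) of dictionary-based concept normalization: entity mentions and UMLS concept names are embedded with the same encoder, and each mention is linked to its nearest concept by cosine similarity. The model identifier, the toy dictionary, and all variable names are illustrative assumptions; a cross-lingual SapBERT variant would be needed for Russian mentions.

```python
# Sketch of SapBERT-style concept normalization via nearest-neighbor retrieval.
# Assumptions: English SapBERT checkpoint, toy UMLS dictionary, cosine similarity linking.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"  # assumed checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL).eval()

def embed(texts):
    """Return unit-normalized [CLS] embeddings for a list of strings."""
    batch = tokenizer(texts, padding=True, truncation=True, max_length=32, return_tensors="pt")
    with torch.no_grad():
        cls = model(**batch).last_hidden_state[:, 0]   # [CLS] token representation
    return torch.nn.functional.normalize(cls, dim=-1)  # unit norm -> dot product = cosine

# Toy dictionary of (CUI, concept name) pairs standing in for the UMLS terminology.
dictionary = [("C0027051", "myocardial infarction"), ("C0020538", "hypertensive disease")]
concept_vecs = embed([name for _, name in dictionary])

mentions = ["heart attack", "high blood pressure"]
mention_vecs = embed(mentions)

scores = mention_vecs @ concept_vecs.T                  # cosine similarity matrix
for mention, idx in zip(mentions, scores.argmax(dim=1)):
    cui, name = dictionary[idx]
    print(f"{mention!r} -> {cui} ({name})")
```

In a nested-entity setting, each nested span would be normalized independently against the same concept index; for concepts missing from the Russian terminology, the dictionary would contain English synonyms only, which is what makes the bilingual zero-shot evaluation meaningful.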
