Source
AACL-IJCNLP
Publication year
2023
Authors
Alexander Panchenko, Chris Biemann, Irina Nikishina, Polina Chernomorchenko, Anastasiia Demidova

Predicting Terms in IS-A Relations with Pre-trained Transformers

Abstract

In this paper, we explore the ability of generative transformers to predict objects in IS-A (hypo-hypernym) relations. We solve the task in both directions of the relation: we learn to predict hypernyms given an input word, and hyponyms given an input concept and its neighbourhood in the taxonomy. To the best of our knowledge, this is the first paper to provide a comprehensive analysis of transformer-based models for the task of hypernymy extraction. Apart from standard fine-tuning of various generative models, we experiment with different input formats and prefixes, zero- and few-shot learning strategies, and generation parameters. Results show that higher performance on both subtasks can be achieved by generative transformers with no additional data (such as definitions or lemma names). Such models show remarkably strong performance on the task given a little training and proper prompts, in comparison to specialized rule-based and statistical methods as well as encoder-based transformer models.
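The few-shot prompting setup mentioned in the abstract can be sketched as a simple prompt builder. This is a minimal illustration, not the authors' exact format: the prefix wording, the example pairs, and the function name are all assumptions made for clarity.

```python
# Hypothetical sketch of few-shot prompting for hypernym prediction.
# The prefix string and demonstration pairs are illustrative assumptions;
# the paper experiments with several input formats and prefixes.

def build_hypernym_prompt(word, examples, prefix="hypernym:"):
    """Build a few-shot prompt asking a generative LM for a hypernym.

    word: the hyponym whose hypernym the model should complete.
    examples: list of (hyponym, hypernym) demonstration pairs.
    """
    lines = [f"{hypo} {prefix} {hyper}" for hypo, hyper in examples]
    # The final line is left incomplete so the model generates the answer.
    lines.append(f"{word} {prefix}")
    return "\n".join(lines)

few_shot = [("dog", "animal"), ("apple", "fruit")]
prompt = build_hypernym_prompt("oak", few_shot)
print(prompt)
```

The resulting string would be passed to a generative model (the paper studies several), which is expected to continue the last line with a hypernym such as "tree".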
