A Flat Tree-based Transformer for Nested Named Entity Recognition

Hongli Mao, Xian Ling Mao*, Hanlin Tang, Xiaoyan Gao, Chun Xu, Heyan Huang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Nested Named Entity Recognition (nested NER), which aims to extract entities from overlapping spans, is a challenging task in natural language processing. Recently, span-based methods have achieved the best performance on this task by modeling the one-dimensional text sequence as a two-dimensional matrix. However, existing span-based methods struggle to accurately identify entity boundaries due to interference from confusing spans, and they also fail to consider dependency relationships among nested entities. To address these limitations, we propose a novel Flat Tree-based Transformer that utilizes the constituency parsing tree to assist nested NER. Specifically, we first employ the constituency parsing tree to generate candidate spans, selecting only tree nodes as potential entities; this eliminates confusing spans and yields a clearer, more discrete distribution of entity boundaries. The Flat Tree-based Transformer then leverages two self-attention units to capture nested dependency relationships among candidate spans, incorporating the parsing tree structure without sacrificing the Transformer's parallelizability. Extensive experiments on five widely adopted datasets demonstrate that our proposed method outperforms previous state-of-the-art baseline methods.
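The candidate-span generation step described in the abstract can be illustrated with a minimal sketch: only constituency-tree nodes yield candidate spans, so spans that cross constituent boundaries are never considered. The tree encoding and the example sentence below are illustrative assumptions, not the paper's actual parser output or data format.

```python
# Sketch of tree-node candidate-span generation for nested NER.
# A node is (label, [children]); a leaf is a plain token string.
# These conventions are assumptions for illustration only.

def candidate_spans(node, start=0):
    """Return (end_index, spans), where spans lists (start, end, label)
    token spans for every tree node under `node`."""
    if isinstance(node, str):          # leaf token occupies one slot
        return start + 1, []
    label, children = node
    spans, end = [], start
    for child in children:
        end, child_spans = candidate_spans(child, end)
        spans.extend(child_spans)
    spans.append((start, end, label))  # this constituent's own span
    return end, spans

# "John visits New York" with a hand-written constituency parse
tree = ("S", [
    ("NP", ["John"]),
    ("VP", [
        ("VBZ", ["visits"]),
        ("NP", ["New", "York"]),
    ]),
])
_, spans = candidate_spans(tree)
for s in spans:
    print(s)
```

Note how the nested NP "New York" (tokens 2..4) is produced alongside the enclosing VP (1..4) and S (0..4) spans, while a boundary-crossing span such as tokens 1..3 is never generated; this is the sense in which restricting candidates to tree nodes filters out confusing spans.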

Original language: English
Article number: 113405
Journal: Knowledge-Based Systems
Volume: 318
DOIs: Yes
Publication status: Published - 7 Jun 2025
Externally published: Yes

Keywords

  • Constituency parsing tree
  • Flat Tree-Based Transformer
  • Nested Named Entity Recognition
  • Span-based methods
