A dual relation-encoder network for aspect sentiment triplet extraction

Tian Xia, Xia Sun, Yidong Yang, Yunfei Long, Richard Sutcliffe

Research output: Contribution to journal › Article › peer-review

Abstract

Aspect sentiment triplet extraction (ASTE) combines several subtasks of aspect-based sentiment analysis; it aims to extract aspect terms, opinion terms, and their corresponding sentiment polarities from a sentence. The interaction relations between words carry strong cueing information. However, previous ASTE approaches use these relations indiscriminately, ignoring the different emphasis that each subtask places on them. To fully exploit the interaction relations, we designed a multi-task learning method which uses two separate relation-encoder networks, each focusing on a different task. We call this proposed model the dual relation-encoder network (DRN). The two networks are the entity extraction relation-encoder (EER) and the entity matching relation-encoder (EMR). EER uses multi-channel graph convolutional networks to add semantic and syntactic information to the original embeddings. EMR first fuses different kinds of interaction relations, then employs criss-cross attention to obtain interaction information from other positions in the same row and column, which provides a global view. Finally, we extract entities by sequence labeling and derive triplets with the help of span-shrunken tags. To validate the effectiveness of DRN, we conducted extensive experiments on a benchmark dataset. The experimental results show that our method outperforms the strong baseline models.
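To illustrate the criss-cross attention idea mentioned in the abstract, here is a minimal numpy sketch. This is a hypothetical toy implementation, not the authors' code: for a word-pair feature grid of shape (H, W, d), each position attends only to the positions in its own row and column, which is what gives the encoder a (recurrent) global view along both axes. Function and variable names are illustrative assumptions.

```python
import numpy as np

def criss_cross_attention(X):
    """Toy criss-cross attention sketch (not the paper's implementation).

    X: (H, W, d) grid of word-pair features. Each position (i, j) attends
    with scaled dot-product weights to all positions in row i and column j.
    """
    H, W, d = X.shape
    out = np.zeros_like(X)
    for i in range(H):
        for j in range(W):
            # Gather the criss-cross neighborhood: row i plus column j.
            # (Position (i, j) appears twice; a full implementation would
            # de-duplicate it, omitted here for brevity.)
            ctx = np.concatenate([X[i, :, :], X[:, j, :]], axis=0)  # (W+H, d)
            scores = ctx @ X[i, j] / np.sqrt(d)   # scaled dot-product scores
            weights = np.exp(scores - scores.max())
            weights /= weights.sum()              # softmax over row+column
            out[i, j] = weights @ ctx             # convex combination
    return out
```

Because each output vector is a convex combination of row/column features, one pass mixes information along both axes; stacking two such passes (as in recurrent criss-cross designs) lets every position indirectly attend to the whole grid.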

Original language: English
Article number: 128064
Journal: Neurocomputing
Volume: 597
DOIs
Publication status: Published - 7 Sep 2024
Externally published: Yes

Keywords

  • Aspect sentiment triplet extraction
  • Criss-cross attention
  • Multi-channel GCN
  • Relationship emphasis
  • Tag span shrinking
