Integrating Hierarchical Attentions for Future Subevent Prediction

Linmei Hu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

An event, consisting of a sequence of subevents, describes a typical thing that happens at a specific time and place. Predicting the next probable subevents based on knowledge acquired from large-scale news documents is very important for many real-world applications, such as disaster warning. In this paper, we present a novel hierarchical attention based end-to-end model for future (unknown) subevent prediction using large-scale historical events. Our model automatically produces a short text describing a possible future subevent after consuming the texts describing previous subevents. To boost the model's understanding of the subevent sequence, we design a hierarchical LSTM model to compress the knowledge in both the word sequence within a subevent and the subevent sequence within an event. In addition, topic information is exploited to make context-aware predictions for future subevents. To further account for which subevents and words play a critical role in prediction, we propose a hierarchical attention mechanism that stresses the important previous subevents as well as the critical words within them. Experimental results on a real-world dataset demonstrate the superiority of our model for future subevent prediction over state-of-the-art methods.
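
The sketch below illustrates the general idea of such a hierarchical attention encoder: a word-level BiLSTM with attention pools each subevent's words into a subevent vector, and a subevent-level LSTM with attention pools these into an event context vector that a decoder could condition on. This is a minimal illustration, not the authors' released code; the layer sizes, the additive-attention form, and the class and variable names are assumptions.

    # Minimal sketch of a hierarchical attention encoder (assumed design,
    # not the paper's official implementation).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Attention(nn.Module):
        """Additive attention that pools a sequence of hidden states into one vector."""
        def __init__(self, hidden_dim):
            super().__init__()
            self.proj = nn.Linear(hidden_dim, hidden_dim)
            self.score = nn.Linear(hidden_dim, 1, bias=False)

        def forward(self, states):                              # (batch, seq, hidden)
            scores = self.score(torch.tanh(self.proj(states)))  # (batch, seq, 1)
            weights = F.softmax(scores, dim=1)                   # attention weights
            return (weights * states).sum(dim=1)                 # (batch, hidden)

    class HierarchicalEncoder(nn.Module):
        """Word-level then subevent-level encoding, with attention at both levels."""
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.word_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                                     bidirectional=True)
            self.word_attn = Attention(2 * hidden_dim)
            self.sub_lstm = nn.LSTM(2 * hidden_dim, hidden_dim, batch_first=True)
            self.sub_attn = Attention(hidden_dim)

        def forward(self, events):
            # events: (batch, n_subevents, n_words) token ids
            b, s, w = events.shape
            words = self.embed(events.view(b * s, w))            # (b*s, w, embed)
            word_states, _ = self.word_lstm(words)               # (b*s, w, 2*hidden)
            subevent_vecs = self.word_attn(word_states)          # (b*s, 2*hidden)
            sub_states, _ = self.sub_lstm(subevent_vecs.view(b, s, -1))
            return self.sub_attn(sub_states)                     # event context (b, hidden)

    # Usage: the event context vector would initialise a text decoder that
    # generates the description of the next subevent.
    encoder = HierarchicalEncoder(vocab_size=10000)
    context = encoder(torch.randint(0, 10000, (2, 5, 20)))  # 2 events, 5 subevents, 20 words
    print(context.shape)                                      # torch.Size([2, 256])

In the paper's full model, a topic representation would additionally be fed into the decoder to make the prediction context-aware; that component is omitted here for brevity.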

Original language: English
Article number: 8941128
Pages (from-to): 3106-3114
Number of pages: 9
Journal: IEEE Access
Volume: 8
DOIs
Publication status: Published - 2020
Externally published: Yes

Keywords

  • Future subevent prediction
  • hierarchical attentions
  • subevent sequence

