Abstractive Text Summarization Based on Neural Fusion
DOI: https://doi.org/10.32473/flairs.37.1.135561
Abstract
Abstractive text summarization, in contrast to extractive text summarization, offers the potential to generate more accurate summaries. In this work, we present a stage-wise abstractive text summarization model that incorporates Elementary Discourse Unit (EDU) segmentation, EDU selection, and EDU fusion. We first segment each article into fine-grained units, EDUs, and build a Rhetorical Structure Theory (RST) graph for the article to represent the dependencies among its EDUs. The EDUs are encoded with a Graph Attention Network (GAT), and those with higher importance are selected as candidates for fusion. The fusion stage is performed by BART, which merges the selected EDUs into summaries. Our model outperforms the BART (large) baseline on the CNN/Daily Mail dataset, showing its effectiveness in abstractive text summarization.
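The abstract does not specify implementation details, but the stage-wise pipeline it describes (GAT encoding of EDU nodes over an RST graph, importance-based EDU selection, BART fusion) could be sketched roughly as below. This is a minimal, hypothetical sketch, assuming PyTorch Geometric and Hugging Face Transformers; names such as EDUSelector, num_selected, and fuse_edus are illustrative and not taken from the paper.

# Hypothetical sketch of the described pipeline:
# (1) encode EDU nodes of an RST graph with a Graph Attention Network,
# (2) score and select the most important EDUs, (3) fuse them with BART.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv
from transformers import BartTokenizer, BartForConditionalGeneration


class EDUSelector(nn.Module):
    """Scores EDU nodes with GAT layers and keeps the top-k candidates (illustrative)."""

    def __init__(self, in_dim: int, hidden_dim: int, num_selected: int = 8):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden_dim, heads=4, concat=False)
        self.gat2 = GATConv(hidden_dim, hidden_dim, heads=4, concat=False)
        self.scorer = nn.Linear(hidden_dim, 1)
        self.num_selected = num_selected

    def forward(self, edu_embeddings: torch.Tensor, edge_index: torch.Tensor):
        # edu_embeddings: [num_edus, in_dim]; edge_index: [2, num_edges] from the RST graph
        h = torch.relu(self.gat1(edu_embeddings, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        scores = self.scorer(h).squeeze(-1)             # importance score per EDU
        k = min(self.num_selected, scores.size(0))
        top_idx = scores.topk(k).indices.sort().values  # keep document order
        return top_idx, scores


def fuse_edus(edus: list[str], selected_idx: torch.Tensor) -> str:
    """Concatenate the selected EDUs and let BART rewrite them into a summary."""
    tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
    bart = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")
    fusion_input = " ".join(edus[i] for i in selected_idx.tolist())
    inputs = tokenizer(fusion_input, return_tensors="pt", truncation=True)
    output_ids = bart.generate(**inputs, max_length=142, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

In use, EDU embeddings would come from a pretrained encoder and the edge_index from an RST parse of the article; the selector's top-k indices are then passed to fuse_edus to produce the final summary.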
License
Copyright (c) 2024 Yllias Chali, Wenzhao Zhu
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.