Abstractive Text Summarization Based on Neural Fusion





Abstractive text summarization, in comparison to extractive text summarization, offers the potential to generate more accurate summaries. In this work, we present a stage-wise abstractive text summarization model comprising Elementary Discourse Unit (EDU) segmentation, EDU selection, and EDU fusion. We first segment each article into fine-grained EDUs and build a Rhetorical Structure Theory (RST) graph to represent the dependencies among them. The EDUs are encoded with a Graph Attention Network (GAT), and those with higher importance scores are selected as candidates for fusion. The fusion stage uses BART to merge the selected EDUs into a summary. Our model outperforms the BART (large) baseline on the CNN/Daily Mail dataset, demonstrating its effectiveness in abstractive text summarization.
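To make the selection stage concrete, here is a minimal NumPy sketch of a single graph-attention layer aggregating EDU features over an RST adjacency matrix, followed by top-k EDU selection. All names (`gat_layer`, `select_edus`) and the norm-based importance score are hypothetical illustrations, not the paper's implementation; the paper uses a learned GAT and importance head.

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """One graph-attention layer over an RST adjacency matrix A.
    H: (n, f_in) EDU features; W: (f_in, f_out) projection;
    a: (2*f_out,) attention vector; alpha: LeakyReLU slope."""
    Z = H @ W                                    # project EDU features
    n = Z.shape[0]
    # Attention logits e_ij = LeakyReLU(a^T [z_i || z_j]) for each pair.
    e = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            s = a @ np.concatenate([Z[i], Z[j]])
            e[i, j] = s if s > 0 else alpha * s  # LeakyReLU
    # Mask non-edges, then softmax over each node's RST neighbourhood.
    e = np.where(A > 0, e, -1e9)
    e = e - e.max(axis=1, keepdims=True)
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Z                               # aggregated EDU representations

def select_edus(H_out, k):
    """Score each EDU (here: by representation norm, a hypothetical
    stand-in for a learned importance head) and keep the top-k,
    returned in document order as fusion candidates for BART."""
    scores = np.linalg.norm(H_out, axis=1)
    return sorted(np.argsort(scores)[-k:].tolist())
```

The selected EDU indices would then be mapped back to their text spans and concatenated as input to the BART fusion stage.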




How to Cite

Chali, Y., & Zhu, W. (2024). Abstractive Text Summarization Based on Neural Fusion. The International FLAIRS Conference Proceedings, 37(1). https://doi.org/10.32473/flairs.37.1.135561