Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization
Jun Suzuki, Masaaki Nagata

Abstract
This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target-vocabulary word in the encoder and control the output words based on that estimation in the decoder. Our method shows significant improvement over a strong RNN-based encoder-decoder baseline and achieves the best results on an abstractive summarization benchmark.
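As a rough illustration of the mechanism described above, here is a minimal PyTorch sketch. It is not the paper's exact formulation: the module names, the ReLU-based frequency estimator, and the hard masking rule are simplifying assumptions intended only to convey the "estimate a per-word budget in the encoder, then cut off exhausted words in the decoder" idea.

```python
import torch
import torch.nn as nn


class FrequencyEstimator(nn.Module):
    """Hypothetical sketch: predict a non-negative upper-bound count
    for every vocabulary word from a summary of the encoder states."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.proj = nn.Linear(hidden_size, vocab_size)

    def forward(self, encoder_summary: torch.Tensor) -> torch.Tensor:
        # ReLU keeps the estimated budgets non-negative.
        return torch.relu(self.proj(encoder_summary))  # (batch, vocab)


def mask_exhausted_words(logits, budget, generated_counts, neg_inf=-1e9):
    """Suppress words whose estimated budget is already used up.

    logits:           (batch, vocab) decoder scores at this step
    budget:           (batch, vocab) estimated upper-bound frequencies
    generated_counts: (batch, vocab) how often each word was emitted so far
    """
    remaining = budget - generated_counts
    return logits.masked_fill(remaining <= 0, neg_inf)


if __name__ == "__main__":
    torch.manual_seed(0)
    batch, hidden, vocab = 2, 16, 10

    estimator = FrequencyEstimator(hidden, vocab)
    budget = estimator(torch.randn(batch, hidden))  # per-word budgets

    counts = torch.zeros(batch, vocab)  # nothing emitted yet
    logits = torch.randn(batch, vocab)  # one decoder step's raw scores

    step_logits = mask_exhausted_words(logits, budget, counts)
    next_word = step_logits.argmax(dim=-1)
    counts[torch.arange(batch), next_word] += 1  # update emitted counts
    print(next_word)
```

In the paper's actual model, the frequency estimation and the output control are trained jointly with the encoder-decoder; the hard cut-off above is only meant to show the budget-and-subtract intuition for one greedy decoding step.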
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| text-summarization-on-duc-2004-task-1 | EncDec+WFE | ROUGE-1: 32.28 ROUGE-2: 10.54 ROUGE-L: 27.80 |
| text-summarization-on-gigaword | EncDec+WFE | ROUGE-1: 36.30 ROUGE-2: 17.31 ROUGE-L: 33.88 |