Yang Liu

Abstract
BERT, a pre-trained Transformer model, has achieved ground-breaking performance on multiple NLP tasks. In this paper, we describe BERTSUM, a simple variant of BERT, for extractive summarization. Our system is the state of the art on the CNN/Dailymail dataset, outperforming the previous best-performing system by 1.65 on ROUGE-L. The code to reproduce our results is available at https://github.com/nlpyang/BertSum
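As a rough illustration of the extractive setup, the sketch below is a minimal, hypothetical reimplementation rather than the authors' code (which lives in the linked repository): it inserts a [CLS] token before every sentence, encodes the document with BERT, and scores each sentence's [CLS] vector for inclusion in the summary. The class name SimpleBertSum and the single linear scorer are illustrative assumptions; the benchmarked BERTSUM+Transformer variant stacks inter-sentence Transformer layers on top of BERT, and the full model also uses interval segment embeddings, both omitted here.

```python
# Minimal sketch of the BERTSUM idea (assumption: not the authors' implementation).
# One [CLS] token is inserted before each sentence; its contextual vector is
# scored with a sigmoid classifier to decide whether the sentence is extracted.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class SimpleBertSum(nn.Module):
    def __init__(self, bert_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        # Simplest summarization layer: one linear scorer per sentence vector.
        # BERTSUM+Transformer would replace this with inter-sentence Transformer layers.
        self.scorer = nn.Linear(self.bert.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask, cls_positions):
        # cls_positions: index of the [CLS] token inserted before each sentence
        hidden = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        sent_vecs = hidden[0, cls_positions]  # one vector per sentence
        return torch.sigmoid(self.scorer(sent_vecs)).squeeze(-1)

if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    sentences = [
        "The cat sat on the mat.",
        "It was a sunny day.",
        "Extractive summarization selects salient sentences.",
    ]
    # Prepend [CLS] and append [SEP] to every sentence, then concatenate the document.
    tokens, cls_positions = [], []
    for sent in sentences:
        cls_positions.append(len(tokens))
        tokens += ["[CLS]"] + tokenizer.tokenize(sent) + ["[SEP]"]
    input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])
    attention_mask = torch.ones_like(input_ids)

    model = SimpleBertSum()
    scores = model(input_ids, attention_mask, torch.tensor(cls_positions))
    print(scores)  # one score per sentence; the highest-scoring sentences form the summary
```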
Benchmarks
| Benchmark | Methodology | ROUGE-1 | ROUGE-2 | ROUGE-L |
|---|---|---|---|---|
| Document Summarization on CNN / Daily Mail | BERTSUM+Transformer | 43.25 | 20.24 | 39.63 |