Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks

Nils Reimers; Iryna Gurevych

Abstract
BERT (Devlin et al., 2018) and RoBERTa (Liu et al., 2019) have set new state-of-the-art performance on sentence-pair regression tasks such as semantic textual similarity (STS). However, these models require both sentences to be fed into the network together, which causes a massive computational overhead: finding the most similar pair in a collection of 10,000 sentences requires about 50 million inference computations (~65 hours) with BERT. The construction of BERT makes it unsuitable for semantic similarity search as well as for unsupervised tasks like clustering. In this paper, we present Sentence-BERT (SBERT), a modification of the pretrained BERT network that uses siamese and triplet network structures to derive semantically meaningful sentence embeddings that can be compared using cosine similarity. This reduces the effort of finding the most similar pair from 65 hours with BERT/RoBERTa to about 5 seconds with SBERT, while maintaining the accuracy of BERT. We evaluate SBERT and SRoBERTa on common STS tasks and transfer learning tasks, where they outperform other state-of-the-art sentence embedding methods.
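The ~50 million figure follows from the number of distinct sentence pairs in a 10,000-sentence collection, n(n-1)/2, each of which a cross-encoder must process separately. With fixed-size embeddings, the same search needs only n forward passes followed by a vectorized cosine-similarity comparison. A minimal sketch of that contrast, with random vectors standing in for SBERT embeddings (the embedding matrix is a placeholder assumption, not the actual model output):

```python
import numpy as np

n = 10_000
# Cross-encoder approach: every distinct pair is a separate forward pass.
pair_count = n * (n - 1) // 2
print(pair_count)  # 49995000, i.e. ~50 million inference computations

# Bi-encoder approach: one embedding per sentence, then cosine similarity.
rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 768))                 # placeholder embeddings
emb /= np.linalg.norm(emb, axis=1, keepdims=True) # unit-normalize rows
sim = emb @ emb.T                                 # cosine-similarity matrix
np.fill_diagonal(sim, -1.0)                       # exclude self-similarity
i, j = np.unravel_index(np.argmax(sim), sim.shape)
```

The dot product of unit-normalized vectors equals their cosine similarity, so the whole pairwise comparison collapses into a single matrix multiplication.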
Code Repositories
Benchmarks
| Benchmark | Method | Metric |
|---|---|---|
| semantic-textual-similarity-on-sick | SRoBERTa-NLI-large | Spearman Correlation: 0.7429 |
| semantic-textual-similarity-on-sick | SRoBERTa-NLI-base | Spearman Correlation: 0.7446 |
| semantic-textual-similarity-on-sick | SBERT-NLI-base | Spearman Correlation: 0.7291 |
| semantic-textual-similarity-on-sick | SBERT-NLI-large | Spearman Correlation: 0.7375 |
| semantic-textual-similarity-on-sick | SentenceBERT | Spearman Correlation: 0.7462 |
| semantic-textual-similarity-on-sts-benchmark | SRoBERTa-NLI-STSb-large | Spearman Correlation: 0.8615 |
| semantic-textual-similarity-on-sts-benchmark | SBERT-NLI-base | Spearman Correlation: 0.7703 |
| semantic-textual-similarity-on-sts-benchmark | SRoBERTa-NLI-base | Spearman Correlation: 0.7777 |
| semantic-textual-similarity-on-sts-benchmark | SBERT-NLI-large | Spearman Correlation: 0.79 |
| semantic-textual-similarity-on-sts-benchmark | SBERT-STSb-base | Spearman Correlation: 0.8479 |
| semantic-textual-similarity-on-sts-benchmark | SBERT-STSb-large | Spearman Correlation: 0.8445 |
| semantic-textual-similarity-on-sts12 | SRoBERTa-NLI-large | Spearman Correlation: 0.7453 |
| semantic-textual-similarity-on-sts13 | SBERT-NLI-large | Spearman Correlation: 0.7846 |
| semantic-textual-similarity-on-sts14 | SBERT-NLI-large | Spearman Correlation: 0.7490 |
| semantic-textual-similarity-on-sts15 | SRoBERTa-NLI-large | Spearman Correlation: 0.8185 |
| semantic-textual-similarity-on-sts16 | SRoBERTa-NLI-large | Spearman Correlation: 0.7682 |
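The Spearman correlation reported in the table measures rank agreement between the model's predicted similarity scores and human similarity judgments, so it rewards correct ordering of pairs rather than exact score values. A small sketch of how that metric is computed (the gold scores and model similarities below are made up for illustration):

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical gold STS annotations (0-5 scale) and model cosine similarities.
gold = np.array([4.8, 3.2, 1.0, 2.5, 0.2])
pred = np.array([0.91, 0.65, 0.30, 0.55, 0.10])

# Spearman's rho: Pearson correlation of the two rank orderings.
rho, _ = spearmanr(gold, pred)
print(round(rho, 4))  # 1.0 here, since the two rankings agree exactly
```

Because only ranks matter, a model whose scores are a monotone transform of the gold annotations still achieves a perfect correlation of 1.0.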