Ask2Transformers: Zero-Shot Domain Labelling with Pre-trained Language Models
Oscar Sainz, German Rigau

Abstract
In this paper we present a system that exploits different pre-trained Language Models to assign domain labels to WordNet synsets without any supervision. Moreover, the system is not restricted to a particular set of domain labels. We exploit the knowledge encoded in several off-the-shelf pre-trained Language Models, together with different task formulations, to infer the domain label of a given WordNet definition. The proposed zero-shot system achieves a new state of the art on the English dataset used in the evaluation.
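The abstract describes casting domain labelling as a zero-shot inference task: each candidate domain label is turned into a hypothesis, and a pre-trained model scores how well the synset definition supports it. The following is a minimal sketch of that NLI-style formulation; the label set, the template string, and the `entailment_score` heuristic are all illustrative stand-ins (a real system such as A2T would use an actual NLI model, e.g. one fine-tuned on MNLI, to produce the entailment scores).

```python
# Sketch of NLI-style zero-shot domain labelling: every candidate domain
# label becomes a hypothesis, scores are softmax-normalised, and the
# highest-probability label wins. The scorer below is a toy word-overlap
# heuristic standing in for a real entailment model.
from math import exp

DOMAIN_LABELS = ["medicine", "sport", "music"]          # illustrative label set
TEMPLATE = "The domain of the sentence is about {}."    # illustrative template


def entailment_score(premise: str, hypothesis: str) -> float:
    """Stand-in for an NLI model's entailment score (assumption:
    a real system would query a pre-trained entailment model here)."""
    tokens = lambda s: {t.strip(".").lower() for t in s.split()}
    return float(len(tokens(premise) & tokens(hypothesis)))


def label_definition(definition: str, labels=DOMAIN_LABELS):
    # Score each hypothesis, normalise with a softmax, pick the argmax.
    logits = [entailment_score(definition, TEMPLATE.format(l)) for l in labels]
    zs = [exp(x) for x in logits]
    probs = [z / sum(zs) for z in zs]
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]


label, prob = label_definition("a game played with a ball in a sport arena")
print(label)  # → sport
```

In practice this formulation can be run end-to-end with an off-the-shelf NLI model (for instance via the Hugging Face `zero-shot-classification` pipeline), which replaces the toy scorer with genuine entailment probabilities.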
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| domain-labelling-on-babeldomains | A2T | F1-Score: 92.14 |