Named Entity Recognition on WNUT 2017

Metric: F1

Results: the table below reports the performance of various models on this benchmark.
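Scores on this leaderboard are F1 values; for WNUT 2017 this is typically entity-level F1, where a predicted entity counts as correct only if both its span and its type match the gold annotation. As a reference point rather than the benchmark's official scorer, the sketch below shows how such a score is commonly computed with the seqeval library (an assumed dependency); the tag sequences are illustrative, not taken from the dataset.

```python
# Minimal sketch of entity-level F1 scoring, assuming the `seqeval` package.
from seqeval.metrics import f1_score, classification_report

# Gold and predicted IOB2 tag sequences for two illustrative sentences.
y_true = [
    ["O", "B-person", "I-person", "O", "B-location"],
    ["B-product", "O", "O"],
]
y_pred = [
    ["O", "B-person", "I-person", "O", "O"],  # the location entity is missed
    ["B-product", "O", "O"],
]

# Entity-level micro-averaged F1: an entity is counted as correct only if
# both its span and its type match exactly.
print(f"F1: {f1_score(y_true, y_pred):.4f}")
print(classification_report(y_true, y_pred))
```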
| Model Name | F1 | Paper Title |
|---|---|---|
| CL-KL | 60.45 | Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning |
| RoBERTa + SubRegWeigh (K-means) | 60.29 | SubRegWeigh: Effective and Efficient Annotation Weighing with Subword Regularization |
| BERT-CRF (Replicated in AdaSeq) | 59.69 | Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning |
| RoBERTa-BiLSTM-context | 59.20 | Supplementary Features of BiLSTM for Enhanced Sequence Labeling |
| BERT + RegLER | 58.9 | Regularization for Long Named Entity Recognition |
| TNER-xlm-r-large | 58.5 | T-NER: An All-Round Python Library for Transformer-based Named Entity Recognition |
| HGN | 57.41 | Hero-Gang Neural Model For Named Entity Recognition |
| ASA + RoBERTa | 57.3 | Adversarial Self-Attention for Language Understanding |
| BERTweet | 56.5 | BERTweet: A pre-trained language model for English Tweets |
| MINER | 54.86 | MINER: Improving Out-of-Vocabulary Named Entity Recognition from an Information Theoretic Perspective |
| GoLLIE | 54.3 | GoLLIE: Annotation Guidelines improve Zero-Shot Information-Extraction |
| Truecase | 52.3 | Robust Named Entity Recognition with Truecasing Pretraining |
| AESINER | 50.68 | Improving Named Entity Recognition with Attentive Ensemble of Syntactic Information |
| InferNER | 50.52 | InferNER: an attentive model leveraging the sentence-level information for Named Entity Recognition in Microblogs |
| SA-NER | 50.36 | Named Entity Recognition for Social Media Texts with Semantic Augmentation |
| CrossWeigh + Pooled Flair | 50.03 | CrossWeigh: Training Named Entity Tagger from Imperfect Annotations |
| ASA + BERT-base | 49.8 | Adversarial Self-Attention for Language Understanding |
| Aguilar et al. | 45.55 | Modeling Noisiness to Recognize Named Entities using Multitask Neural Networks on Social Media |
| NeuralCRF+SAC | 44.77 | Similarity Based Auxiliary Classifier for Named Entity Recognition |
| Cross-BiLSTM-CNN | 42.85 | Why Attention? Analyze BiLSTM Deficiency and Its Remedies in the Case of NER |
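To reproduce or sanity-check results like these, the WNUT 2017 (emerging and rare entities) data is distributed in several packagings; the sketch below assumes the Hugging Face datasets hub id "wnut_17", which is one common form of the benchmark.

```python
# Minimal sketch: loading the WNUT 2017 benchmark, assuming the
# Hugging Face `datasets` package and the hub id "wnut_17".
from datasets import load_dataset

dataset = load_dataset("wnut_17")  # train / validation / test splits

# Each example holds whitespace-split tokens and integer NER tag ids.
example = dataset["test"][0]
print(example["tokens"])
print(example["ner_tags"])

# Recover the string labels (O, B-person, I-person, ...) for the
# integer ids via the dataset's ClassLabel feature.
label_names = dataset["test"].features["ner_tags"].feature.names
print([label_names[i] for i in example["ner_tags"]])
```

Predictions produced for the test split in this IOB2 tag format can then be scored with the seqeval sketch shown above the table.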