Network Pruning on ImageNet
Metrics: Accuracy

Performance results of various models on this benchmark:

| Model Name | Accuracy (%) | Paper Title |
| --- | --- | --- |
| ResNet50-2.3 GFLOPs | 78.79 | Pruning Filters for Efficient ConvNets |
| ResNet50-1.5 GFLOPs | 78.07 | Pruning Filters for Efficient ConvNets |
| ResNet50 2.5 GFLOPs | 78.0 | Knapsack Pruning with Inner Distillation |
| RegX-1.6G | 77.97 | Group Fisher Pruning for Practical Network Compression |
| ResNet50 2.0 GFLOPs | 77.70 | Knapsack Pruning with Inner Distillation |
| ResNet50-3G FLOPs | 77.1 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50-2G FLOPs | 76.4 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50-1G FLOPs | 76.376 | Pruning Filters for Efficient ConvNets |
| TAS-pruned ResNet-50 | 76.20 | Network Pruning via Transformable Architecture Search |
| ResNet50 | 75.59 | Network Pruning That Matters: A Case Study on Retraining Variants |
| ResNet50-1G FLOPs | 74.2 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50-1G FLOPs | 74.2 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| MobileNetV2 | 73.42 | Group Fisher Pruning for Practical Network Compression |
| ResNet50 | 73.14 | AC/DC: Alternating Compressed/DeCompressed Training of Deep Neural Networks |
| MobileNetV1-50% FLOPs | 70.7 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| SqueezeNet (6-bit Deep Compression) | 57.5 | SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size |
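For orientation, the top-ranked entries above come from filter pruning ("Pruning Filters for Efficient ConvNets"), which removes whole convolutional filters ranked by their L1 norm so the pruned ResNet-50 runs at a reduced FLOP budget. The snippet below is a minimal illustrative sketch of that criterion in PyTorch; it is not code from any of the listed papers, and the layer sizes and `keep_ratio` value are hypothetical.

```python
# Illustrative sketch of L1-norm filter pruning (single layer, hypothetical sizes).
import torch
import torch.nn as nn


def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    """One importance score per output filter: sum of absolute weights."""
    # conv.weight has shape (out_channels, in_channels, kH, kW)
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))


def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Build a smaller Conv2d that keeps only the highest-scoring filters."""
    scores = l1_filter_scores(conv)
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep_idx = torch.argsort(scores, descending=True)[:n_keep]
    keep_idx = keep_idx.sort().values  # preserve the original filter order

    pruned = nn.Conv2d(
        conv.in_channels,
        n_keep,
        kernel_size=conv.kernel_size,
        stride=conv.stride,
        padding=conv.padding,
        bias=conv.bias is not None,
    )
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep_idx])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep_idx])
    return pruned


if __name__ == "__main__":
    layer = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    smaller = prune_conv_filters(layer, keep_ratio=0.5)
    print(smaller)  # Conv2d(64, 64, ...): half the filters removed
```

In a full pipeline the matching input channels of the following layer and any BatchNorm parameters are pruned as well, and the network is then fine-tuned; the methods in the table differ mainly in how filters are scored (L1 norm, knapsack or Fisher-information criteria, architecture search, or sub-net evaluation as in EagleEye) and in how retraining is performed.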