Synergistic Diffusion-AutoRegression Paradigm (SDAR)

Date: 3 days ago

Organization: Shanghai Artificial Intelligence Laboratory

Paper URL: 2510.06303

Synergistic Diffusion-AutoRegression (SDAR) was proposed by the Shanghai Artificial Intelligence Laboratory in October 2025 and is described in the paper "SDAR: A Synergistic Diffusion-AutoRegression Paradigm for Scalable Sequence Generation".

SDAR is a synergistic diffusion-autoregression paradigm that establishes a new language modeling framework: it reconciles the training efficiency of autoregression with the parallel inference of diffusion. Its key idea is to decouple the two stages. Full-scale AR pre-training provides stability and efficiency, after which a lightweight adaptation stage teaches the model to perform block-wise diffusion decoding. This design retains the practical advantages of AR, such as key-value caching, variable-length generation, and robust optimization behavior, while unlocking the intra-block parallel generation that diffusion offers.
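As a rough illustration of the block-wise decoding loop described above, the sketch below generates a sequence block by block: each block starts fully masked, is filled in over a few parallel refinement steps, and is then appended to a key-value cache so later blocks condition on it as fixed context. This is a hedged illustration, not the released SDAR code: the class ToyBlockDiffusionLM, its methods, and all constants are hypothetical stand-ins, and a real model would rank masked positions by prediction confidence rather than fill them at random.

```python
# Minimal sketch of block-wise diffusion decoding with a reusable KV cache.
# All names and interfaces are hypothetical; this is not the authors' code.
import math
import random

MASK = -1          # placeholder id for a still-masked position
BLOCK_SIZE = 4     # tokens generated per block
DENOISE_STEPS = 2  # parallel refinement passes per block
EOS = 0            # end-of-sequence token id


class ToyBlockDiffusionLM:
    """Stand-in for an AR-pretrained model adapted for block diffusion."""

    def extend_kv_cache(self, cache, tokens):
        # A real model would run the transformer over `tokens` and append
        # their keys/values; here we simply record the tokens themselves.
        return cache + list(tokens)

    def predict_masked(self, cache, block, num_to_unmask):
        # Predict all masked positions in parallel, then commit a few of them
        # per step (toy: random ids; a real model would keep the most
        # confident predictions and re-mask the rest).
        masked = [i for i, t in enumerate(block) if t == MASK]
        for i in masked[:num_to_unmask]:
            block[i] = random.randint(1, 9)
        return block


def generate(model, prompt, max_blocks=8):
    cache = model.extend_kv_cache([], prompt)   # AR-style prefix encoding
    output = list(prompt)
    per_step = math.ceil(BLOCK_SIZE / DENOISE_STEPS)
    for _ in range(max_blocks):
        block = [MASK] * BLOCK_SIZE             # start from a fully masked block
        for _ in range(DENOISE_STEPS):          # intra-block parallel refinement
            block = model.predict_masked(cache, block, per_step)
        output.extend(block)
        cache = model.extend_kv_cache(cache, block)  # reuse block as fixed context
        if EOS in block:                        # variable-length stopping
            break
    return output


print(generate(ToyBlockDiffusionLM(), prompt=[5, 7, 3]))
```

The point of the sketch is the control flow: the prefix and all completed blocks are encoded once and cached as in standard AR decoding, while only the tokens inside the current block are generated in parallel by the diffusion-style refinement loop.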
