DiDi-Instruct Post-Training Method
DiDi-Instruct (Discrete Diffusion Divergence Instruct) was proposed in September 2025 by a research team from Purdue University, the University of Texas, and Xiaohongshu hi-lab, among other institutions. The work was published in the paper "Ultra-Fast Language Generation via Discrete Diffusion Divergence Instruct".
DiDi-Instruct is a novel distillation framework for fast language generation. It initializes a student from a pre-trained (masked) discrete diffusion language model (dLLM) and distills it into a few-step generator. The resulting DiDi-Instruct model achieves performance comparable to or better than its dLLM teacher and a GPT-2 baseline, while delivering up to a 64× speedup.
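To make the teacher-to-student setup concrete, below is a minimal PyTorch sketch of this kind of distillation loop. It is an illustration under stated assumptions, not the paper's actual implementation: the class TinyLM, the toy vocabulary sizes, and the plain token-level KL loss are all hypothetical stand-ins; DiDi-Instruct uses its own divergence-based objective, and the student is initialized from the pre-trained dLLM, which the weight copy below mimics.

```python
# Minimal sketch of dLLM teacher -> few-step student distillation in the
# spirit of DiDi-Instruct. All names here (TinyLM, the toy sizes, the KL
# stand-in loss) are hypothetical, not the paper's API or objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, SEQ_LEN, MASK_ID = 100, 16, 99  # toy sizes; last id reserved as [MASK]

class TinyLM(nn.Module):
    """Tiny token predictor standing in for both teacher and student."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, 64)
        self.proj = nn.Linear(64, VOCAB)

    def forward(self, tokens):              # (B, L) -> (B, L, VOCAB)
        return self.proj(self.embed(tokens))

teacher = TinyLM().eval()                   # frozen pre-trained dLLM stand-in
student = TinyLM()
student.load_state_dict(teacher.state_dict())  # init student from teacher
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

for step in range(100):
    # Sample a partially masked sequence, as a masked dLLM sees mid-denoising.
    clean = torch.randint(0, VOCAB - 1, (8, SEQ_LEN))
    mask = torch.rand(8, SEQ_LEN) < 0.5
    noisy = clean.masked_fill(mask, MASK_ID)

    with torch.no_grad():
        teacher_logits = teacher(noisy)     # teacher provides the target dist.
    student_logits = student(noisy)

    # Match the student to the teacher in distribution. Plain token-level KL
    # is used here as a stand-in for the paper's divergence-based objective.
    loss = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The key design point the sketch preserves is that the student starts as a copy of the teacher and is then trained against the teacher's output distribution, which is what allows it to generate in far fewer denoising steps than the teacher.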