MultiPL-MoE Architecture

Date

3 days ago

Paper URL

https://arxiv.org/abs/2508.19268

MultiPL-MoE was proposed by the China Mobile Research Institute in August 2025. The research results were published in the paper "MultiPL-MoE: Multi-Programming-Lingual Extension of Large Language Models through Hybrid Mixture-of-Experts", which has been accepted by EMNLP 2025.

MultiPL-MoE is a multi-programming-lingual (MultiPL) extension of LLMs implemented through a hybrid Mixture-of-Experts (MoE). Previous MoE methods for multi-natural-language extension focus on internal language consistency by predicting subsequent tokens, but neglect the internal syntactic structure of programming languages. In contrast, MultiPL-MoE employs a novel hybrid MoE architecture that performs fine-grained learning of token-level semantic features and segment-level syntactic features, enabling the model to infer code structure by recognizing and classifying different syntactic elements.

Figure: MultiPL-MoE overall framework
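The paper does not include reference code here, but the idea of pairing token-level routing with segment-level routing can be illustrated with a minimal PyTorch sketch. Everything below (class names such as HybridMoELayer, the mean-pooling of fixed-length windows, and all hyperparameter values) is an illustrative assumption rather than the authors' implementation: the token path routes each hidden state to its top-k experts to capture semantic features, while the segment path pools short windows of tokens and routes the pooled representations to a second expert group as a stand-in for segment-level syntactic features.

```python
# Minimal sketch of a hybrid MoE layer combining token-level and
# segment-level expert routing. Names and sizes (HybridMoELayer,
# segment_len, top_k, ...) are illustrative assumptions, not the
# authors' released implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExpertFFN(nn.Module):
    """A single feed-forward expert."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, x):
        return self.net(x)


class HybridMoELayer(nn.Module):
    """Token-level experts model per-token semantics; segment-level experts
    see mean-pooled windows of tokens as a proxy for local syntax."""
    def __init__(self, d_model=512, d_ff=2048, n_token_experts=4,
                 n_segment_experts=4, top_k=2, segment_len=16):
        super().__init__()
        self.top_k, self.segment_len = top_k, segment_len
        self.token_experts = nn.ModuleList(
            [ExpertFFN(d_model, d_ff) for _ in range(n_token_experts)])
        self.segment_experts = nn.ModuleList(
            [ExpertFFN(d_model, d_ff) for _ in range(n_segment_experts)])
        self.token_router = nn.Linear(d_model, n_token_experts)
        self.segment_router = nn.Linear(d_model, n_segment_experts)

    def _route(self, x, router, experts):
        # Simple top-k gating: every expert is evaluated, and each position
        # keeps only the outputs of its top-k experts, softmax-weighted.
        gate_w, gate_idx = router(x).topk(self.top_k, dim=-1)
        gate_w = F.softmax(gate_w, dim=-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(experts):
            e_out = expert(x)
            for k in range(self.top_k):
                mask = (gate_idx[..., k] == e).unsqueeze(-1).float()
                out = out + mask * gate_w[..., k:k + 1] * e_out
        return out

    def forward(self, x):                                  # x: (B, T, D)
        B, T, D = x.shape
        # Token-level path: route every hidden state independently.
        token_out = self._route(x, self.token_router, self.token_experts)

        # Segment-level path: mean-pool fixed-length windows, route the
        # pooled vectors, then broadcast the result back to their tokens.
        pad = (-T) % self.segment_len
        xp = F.pad(x, (0, 0, 0, pad))                      # (B, T+pad, D)
        segs = xp.view(B, -1, self.segment_len, D).mean(dim=2)
        seg_out = self._route(segs, self.segment_router, self.segment_experts)
        seg_out = seg_out.unsqueeze(2).expand(-1, -1, self.segment_len, -1)
        seg_out = seg_out.reshape(B, -1, D)[:, :T]

        # Fuse the two expert paths with a residual sum (one possible choice).
        return x + token_out + seg_out


if __name__ == "__main__":
    layer = HybridMoELayer()
    hidden = torch.randn(2, 50, 512)                       # (batch, tokens, dim)
    print(layer(hidden).shape)                             # torch.Size([2, 50, 512])
```

In a full model this layer would sit in place of the feed-forward block inside each Transformer layer; the residual-sum fusion and the exact routing strategy above are simplifications, and the paper's own segment partitioning and routing design should be taken from the original publication.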
