
Phi-3.5-MoE Instruct

by Microsoft

About

Phi-3.5-MoE-instruct is a sparse mixture-of-experts model from Microsoft's Phi research team, released in August 2024 with 42 billion total parameters across 16 experts and approximately 6.6 billion active parameters per forward pass. The model applies Microsoft's small-data, high-quality training philosophy — developed across earlier Phi generations — to a MoE architecture, targeting reasoning quality comparable to much larger dense models at a fraction of the inference compute. Released under the MIT license, it was notable in the research community for demonstrating that MoE efficiency gains could be realized at smaller total parameter counts than typical large-scale MoE deployments.
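The sparse routing described above (16 experts, with only a small subset active per token) can be sketched as follows. This is a minimal, illustrative top-2 router in numpy, not Microsoft's implementation; all names, dimensions, and the plain linear "experts" are assumptions for demonstration.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Sparse mixture-of-experts layer (illustrative sketch).

    A router scores all experts, but only the top_k experts actually
    run for this token; their outputs are combined with renormalized
    gate weights. This is why active parameters per forward pass stay
    far below the total parameter count."""
    logits = x @ gate_w                       # router scores, one per expert
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                      # softmax over all experts
    top = np.argsort(probs)[-top_k:]          # indices of the top-k experts
    weights = probs[top] / probs[top].sum()   # renormalize over the selected experts
    # Only the selected experts compute; the remaining experts are skipped.
    return sum(w * (x @ expert_ws[i]) for w, i in zip(weights, top))

# Toy usage: 16 experts (as in Phi-3.5-MoE), tiny hidden size for illustration.
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, 16))                     # router over 16 experts
expert_ws = [rng.standard_normal((d, d)) for _ in range(16)]
y = moe_forward(x, gate_w, expert_ws)                     # only 2 of 16 experts ran
```

With top-2 routing over 16 experts, each token touches roughly the shared layers plus two experts' weights, which is how a 42B-total model can run with only about 6.6B active parameters per forward pass.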

Timeline
ReleasedAug 22, 2024
License & Family
License
MIT
Performance Overview
Performance metrics and category breakdown

Overall Performance (1 benchmark)

Average Score: 80.8%
Best Score: 80.8%
High Performers (80%+): 1

Top Categories

Coding: 80.8%
All Benchmark Results for Phi-3.5-MoE Instruct
Complete list of benchmark scores with detailed information

MBPP (Coding): 80.8% (Unverified)