Comprehensive side-by-side LLM comparison
Mistral Large 2 offers a context window roughly 120K tokens larger than Grok-3 Mini's, while Grok-3 Mini is about $7.20 cheaper per million tokens and supports multimodal inputs. Both models have their strengths depending on your specific coding needs.
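To put the pricing claim in concrete terms, the sketch below estimates monthly spend for a hypothetical workload. The rates in ASSUMED_PRICING are illustrative placeholders, not either provider's published prices, which change over time.

```python
# Rough cost estimate for a hypothetical monthly workload.
# ASSUMED_PRICING holds placeholder rates (USD per million tokens);
# substitute each provider's current published pricing.

ASSUMED_PRICING = {
    "grok-3-mini": {"input": 0.30, "output": 0.50},
    "mistral-large-2": {"input": 3.00, "output": 9.00},
}

def workload_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of a workload under the assumed pricing table."""
    rates = ASSUMED_PRICING[model]
    return (
        input_tokens / 1_000_000 * rates["input"]
        + output_tokens / 1_000_000 * rates["output"]
    )

if __name__ == "__main__":
    # Example: 10M input tokens and 2M output tokens per month.
    for model in ASSUMED_PRICING:
        print(f"{model}: ${workload_cost(model, 10_000_000, 2_000_000):,.2f}/month")
```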
xAI
Grok 3 Mini was developed as an efficient version of Grok 3, designed to bring next-generation capabilities to resource-conscious deployments. Built to provide strong performance with practical efficiency, it extends Grok 3's innovations to broader application scenarios.
Mistral AI
Mistral Large 2 was introduced as the second generation of Mistral's flagship model, designed to provide frontier-level capabilities across diverse language tasks. Built with enhanced reasoning, coding, and multilingual abilities, it represents Mistral's most advanced offering for enterprise and demanding applications.
Release dates
Grok-3 Mini (xAI): 2025-02-17
Mistral Large 2 (Mistral AI): 2024-07-24
Grok-3 Mini is roughly 6 months newer than Mistral Large 2.
Cost per million tokens (USD): Grok-3 Mini vs. Mistral Large 2
Context window and performance specifications
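A practical question behind any context-window comparison is whether a given prompt will fit. The sketch below is a minimal fit check that assumes a crude 4-characters-per-token heuristic; for real capacity planning, use the provider's tokenizer and the context limit published for the specific model.

```python
# Crude context-window fit check. The 4-characters-per-token heuristic is an
# assumption for illustration; real counts depend on the model's tokenizer.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate for English text (~4 characters per token)."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, context_window: int,
                    reserved_for_output: int = 4_096) -> bool:
    """True if the prompt plus an output budget fits within the model's window."""
    return estimate_tokens(prompt) + reserved_for_output <= context_window

# Usage: pass the context limit from the provider's documentation, e.g.
# fits_in_context(long_prompt, context_window=128_000)
```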
Available providers and their performance metrics
Grok-3 Mini: available via xAI
Mistral Large 2: available via Mistral AI
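Since both models are served through chat-completions style HTTP APIs by their respective providers, a single helper can target either one. The endpoint URLs, model identifiers, and environment-variable names below are assumptions based on each provider's public API conventions; confirm them against the current xAI and Mistral AI documentation.

```python
# Minimal sketch of querying each provider's chat-completions endpoint.
# Endpoint URLs and model identifiers are assumptions; verify against the
# providers' current API references before use.
import os
import requests

PROVIDERS = {
    "grok-3-mini": {
        "url": "https://api.x.ai/v1/chat/completions",
        "key_env": "XAI_API_KEY",
    },
    "mistral-large-2": {
        "url": "https://api.mistral.ai/v1/chat/completions",
        "key_env": "MISTRAL_API_KEY",
        # "mistral-large-latest" is Mistral's alias for its current Large model.
        "model_id": "mistral-large-latest",
    },
}

def chat(model: str, prompt: str) -> str:
    """Send a single-turn prompt and return the model's reply text."""
    cfg = PROVIDERS[model]
    resp = requests.post(
        cfg["url"],
        headers={"Authorization": f"Bearer {os.environ[cfg['key_env']]}"},
        json={
            "model": cfg.get("model_id", model),
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("grok-3-mini", "Summarize the trade-offs between these two models."))
```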