LATEST MODEL
Llama 4 Maverick
Meta • Released April 2025
Meta's balanced multimodal Mixture-of-Experts (MoE) model with 128 experts for general use
What's New in This Version
Balanced performance across tasks with an efficient MoE architecture
Technical Specifications
Parameters: 400 billion (17B active)
Context Window: 1,000,000 tokens
Training Method: Mixture of Experts
Knowledge Cutoff: August 2024
Training Data: Up to August 2024
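
The specifications above describe a Mixture-of-Experts model: the full 400 billion parameters are spread across many expert sub-networks, but each token is routed to only a few of them, so roughly 17B parameters are exercised per token. The sketch below illustrates top-k expert routing in PyTorch; it is a toy layer with assumed sizes (d_model, d_ff, top_k) and is not Llama 4 Maverick's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    # Toy Mixture-of-Experts feed-forward layer with a learned top-k router.
    def __init__(self, d_model=64, d_ff=256, num_experts=128, top_k=2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is a small feed-forward block; only routed experts run per token.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):
        # x: (num_tokens, d_model)
        gate_logits = self.router(x)                          # (tokens, experts)
        weights, chosen = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                  # normalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = ToyMoELayer()
print(layer(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
# All experts contribute to the total parameter count, but each token activates only
# top_k of them — the sense in which "400 billion (17B active)" describes the real model.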
Key Features
Open Source • Multimodal • 128-Expert MoE
Capabilities
Reasoning: Excellent
Coding: Excellent
Multimodal: Outstanding
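
Because the weights are released openly, local inference is possible; below is a minimal sketch using the Hugging Face transformers pipeline. The repository ID is an assumption (check Meta's official release for the exact name), and a ~400B-parameter model needs a multi-GPU server, so treat this as illustrative rather than a tested recipe.

from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-4-Maverick-17B-128E-Instruct",  # assumed repository ID
    device_map="auto",  # shard the model across available GPUs
)
print(generator("Explain Mixture of Experts in one sentence.",
                max_new_tokens=64)[0]["generated_text"])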
Other Meta Models
Explore more models from Meta
Llama 4 Behemoth
Meta's flagship multimodal model with a massive MoE architecture (288B active parameters)
April 2025 • ~2 trillion parameters (288B active)
Llama 4 Scout
Meta's efficient multimodal model with an industry-leading 10M-token context window
April 2025 • 109 billion parameters (17B active)
Llama 3.1 405B
Meta's largest and most capable open-source model
July 2024 • 405 billion parameters