DeepSeek Models

Chinese innovation • MoE architecture

10 Models • Latest: DeepSeek-V3.2

DeepSeek-V3.2

Released December 2025

LATEST

DeepSeek's latest flagship model, matching GPT-5-level performance and integrating tool use into its thinking mode; a parameter-arithmetic sketch follows this card

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Tool-use Integration • Thinking Modes • Agent Training • +1 more
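
Every V3-family card below repeats the same split: 671 billion total parameters, 37 billion active per token. That is the mixture-of-experts design named in the header: a router picks a handful of experts for each token, so only a small fraction of the weights participates in any single forward pass. The sketch below is illustrative only; the expert count and top-k value are placeholder numbers, not DeepSeek's actual configuration.

```python
# Illustrative only: a toy top-k mixture-of-experts router, not DeepSeek's
# actual DeepSeekMoE implementation. It shows why a 671B-parameter model can
# run with only ~37B parameters "active" per token: the router selects a few
# experts out of many, so most expert weights are skipped for that token.
import numpy as np

def topk_route(scores: np.ndarray, k: int) -> np.ndarray:
    """Return the indices of the k highest-scoring experts for one token."""
    return np.argsort(scores)[-k:]

rng = np.random.default_rng(0)
n_experts, k = 64, 4                         # placeholder counts, not V3.2's real config
router_scores = rng.normal(size=n_experts)   # one token's affinity to each expert
active = topk_route(router_scores, k)
print(f"experts used for this token: {sorted(active.tolist())}")

# Rough arithmetic from the card's numbers: only ~5.5% of the weights are
# touched per token, which is why inference cost tracks 37B rather than 671B.
total_params, active_params = 671e9, 37e9
print(f"fraction of weights active per token ≈ {active_params / total_params:.1%}")
```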

DeepSeek-V3.2-Speciale

Released December 2025

DeepSeek's competition-focused variant, released as a temporary API-only model and retired on December 15, 2025

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Expired • IMO Gold • CMO Gold • +1 more

DeepSeek-V3.1

Released August 2025

DeepSeek's hybrid model combining V3's general-purpose strengths with R1's reasoning, switchable between thinking and non-thinking modes (an API sketch follows this card)

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Hybrid Architecture • Code Agent • Search Agent • +1 more
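
A minimal sketch of selecting the hybrid model's two modes through DeepSeek's OpenAI-compatible API, assuming the public convention that deepseek-chat serves the non-thinking mode and deepseek-reasoner the thinking mode; model names, endpoint URL, and billing should be checked against the current API documentation.

```python
# Sketch: toggling V3.1's thinking vs. non-thinking mode via DeepSeek's
# OpenAI-compatible endpoint. Model names and base_url are assumptions taken
# from DeepSeek's public API docs at the time of writing; verify before use.
from openai import OpenAI

client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY", base_url="https://api.deepseek.com")

def ask(model: str, prompt: str) -> str:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(ask("deepseek-chat", "Summarize merge sort in one sentence."))     # non-thinking mode
print(ask("deepseek-reasoner", "Is 2**31 - 1 prime? Justify briefly."))  # thinking mode
```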

DeepSeek-R1-0528

Released May 2025

DeepSeek's upgraded reasoning model, with 87.5% AIME accuracy and a significantly reduced hallucination rate (an API usage sketch follows this card)

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Enhanced Reasoning • System Prompts • JSON Output • +2 more
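
The feature tags above (system prompts, JSON output) map to ordinary chat-completion parameters on the OpenAI-compatible endpoint. A minimal sketch, assuming those features are enabled for deepseek-reasoner as the card states and that the chain of thought is exposed in a reasoning_content field; verify both against the current API reference.

```python
# Sketch: using R1-0528's system-prompt and JSON-output support. The
# reasoning_content field and the response_format option are assumptions
# based on DeepSeek's documented behavior at the time of writing.
import json
from openai import OpenAI

client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY", base_url="https://api.deepseek.com")

resp = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[
        {"role": "system",
         "content": "You are a careful math tutor. Reply in JSON with keys 'answer' and 'explanation'."},
        {"role": "user", "content": "What is the sum of the first 10 positive integers?"},
    ],
    response_format={"type": "json_object"},  # structured JSON output, per the card's feature list
)

message = resp.choices[0].message
thinking = getattr(message, "reasoning_content", None)  # chain of thought, if the API exposes it
result = json.loads(message.content)
print(result["answer"], "-", result["explanation"])
```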

DeepSeek-V3-0324

Released March 2025

DeepSeek's updated V3 model with improved post-training using R1 techniques

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Improved Post-training • R1-enhanced • Advanced Performance

DeepSeek-R1

Released January 2025

DeepSeek's reasoning model competing with OpenAI o1, released under the MIT license

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Open Source (MIT) • Advanced Reasoning • Cost Effective • +1 more

DeepSeek-V3

Released December 2024

DeepSeek's flagship model at the time of release, improving on V2 in both capability and efficiency

Parameters
671 billion (37B active)
Context
128,000 tokens
Key Features
Advanced Architecture • Improved Efficiency • Enhanced Performance

DeepSeek Coder V2

Released June 2024

DeepSeek's specialized coding model with advanced programming capabilities

Parameters
236 billion (21B active)
Context
128,000 tokens
Key Features
Code Specialization • Multi-language • Repository Understanding

DeepSeek-V2

Released May 2024

DeepSeek's flagship MoE model with exceptional efficiency

Parameters
236 billion (21B active)
Context
128,000 tokens
Key Features
Mixture of Experts • High Efficiency • Open Source

DeepSeek Math

Released February 2024

DeepSeek's mathematics-specialized model for step-by-step problem solving (a local-inference sketch follows this card)

Parameters
7 billion
Context
4,096 tokens
Key Features
Math Specialization • Problem Solving • Step-by-step
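
Unlike the 100B+ models above, a 7-billion-parameter checkpoint fits on a single GPU. A minimal local-inference sketch with Hugging Face transformers: the repository id deepseek-ai/deepseek-math-7b-instruct and the "reason step by step ... \boxed{}" prompt convention are assumptions taken from DeepSeek's public release material, so check the actual model card before relying on them.

```python
# Sketch: running the 7B math model locally. The repo id and prompt convention
# are assumed from DeepSeek's release material; mind the 4,096-token context.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-math-7b-instruct"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{
    "role": "user",
    "content": "What is 17 * 24? Please reason step by step, "
               "and put your final answer within \\boxed{}.",
}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```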