Llama 3.1 70B
Meta • Released July 2024
Meta's updated flagship open-source model
What's New in This Version
Extended 128,000-token context window and improved tool-use capabilities
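As a rough illustration of the tool-use capability, the sketch below sends a chat request with a tool definition to an OpenAI-compatible server hosting the model (for example vLLM or Ollama). The base URL, model id, and the get_weather tool are illustrative assumptions, not part of Meta's release.

```python
# Minimal tool-use sketch against an assumed OpenAI-compatible endpoint
# serving Llama 3.1 70B Instruct. The base_url, model id, and the
# get_weather tool are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool for demonstration
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-70B-Instruct",  # assumed model id
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chooses to call the tool, the call and its JSON arguments
# are returned here instead of a plain text answer.
print(response.choices[0].message.tool_calls)
```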
Technical Specifications
Parameters: 70 billion
Context Window: 128,000 tokens
Training Method: Supervised fine-tuning and Direct Preference Optimization (DPO)
Knowledge Cutoff: December 2023
Training Data: Up to December 2023
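To ground these specifications, here is a minimal sketch of loading the instruct variant with Hugging Face transformers. The repository id and the bfloat16/device-map settings are assumptions; a 70-billion-parameter model generally needs multiple high-memory GPUs or a quantized checkpoint to run.

```python
# Sketch: loading Llama 3.1 70B Instruct with Hugging Face transformers.
# Repo id and dtype/device settings are assumptions, not official guidance.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-70B-Instruct"  # assumed Hub repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # shard weights across available GPUs
)

messages = [{"role": "user",
             "content": "Summarize the Llama 3.1 release in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```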
Key Features
Open Source • Long Context • Tool Use
Capabilities
Reasoning: Very Good
Coding: Very Good
Tool Use: Good
Other Meta Models
Explore more models from Meta
Llama 4 Behemoth
Meta's flagship multimodal model with massive MoE architecture (288B active parameters)
April 2025 • ~2 trillion parameters (288B active)
Llama 4 Maverick
Meta's balanced multimodal MoE model with 128 experts for general use
April 2025 • 400 billion parameters (17B active)
Llama 4 Scout
Meta's efficient multimodal model with industry-leading 10M token context
April 2025 • 109 billion parameters (17B active)