Analyzed: Jan 3, 2026 05:52

Introducing Gemma 3 270M: The compact model for hyper-efficient AI

Published: Oct 23, 2025 18:50
1 min read
DeepMind

Analysis

The article announces the release of Gemma 3 270M, a compact 270-million-parameter language model. It emphasizes the efficiency that follows from the model's small size and positions it as a specialized tool for applications where compute and memory resources are constrained.
Reference

Today, we're adding a new, highly specialized tool to the Gemma 3 toolkit: Gemma 3 270M, a compact, 270-million parameter model.