Research #llm · 📝 Blog · Analyzed: Dec 29, 2025 02:06

Rakuten Announces Japanese LLM 'Rakuten AI 3.0' with 700 Billion Parameters, Plans Service Deployment

Published: Dec 26, 2025 23:00
1 min read
ITmedia AI+

Analysis

Rakuten has unveiled Rakuten AI 3.0, a Japanese-focused large language model with 700 billion parameters. The model uses a Mixture of Experts (MoE) architecture to balance performance against computational cost, and it achieved high scores on the Japanese version of MT-Bench. Rakuten plans to integrate the model into its services with support from the GENIAC program, and intends to release it as an open-weight model next spring, signaling a commitment to broader accessibility and potential community contributions within its ecosystem.
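The efficiency claim behind MoE is that only a few expert subnetworks run per token, so active compute scales with the number of experts selected rather than the total parameter count. A minimal sketch of top-k routing (all names, shapes, and values are illustrative, not Rakuten's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(token, experts, router_w, top_k=2):
    """Route a token vector to its top-k experts and mix their outputs.

    Only top_k experts are evaluated per token, so per-token compute
    grows with top_k, not with the total expert count.
    """
    scores = softmax(router_w @ token)               # gate score per expert
    chosen = np.argsort(scores)[-top_k:]             # indices of top-k experts
    weights = scores[chosen] / scores[chosen].sum()  # renormalized gate weights
    outputs = [experts[i] @ token for i in chosen]   # run only the chosen experts
    return sum(w * o for w, o in zip(weights, outputs))

d, n_experts = 8, 4
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
router_w = rng.standard_normal((n_experts, d))
token = rng.standard_normal(d)

out = moe_layer(token, experts, router_w, top_k=2)
print(out.shape)  # (8,)
```

With top_k=2 of 4 experts, only half the expert parameters touch any given token, which is how a very large total parameter count can coexist with moderate inference cost.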
Reference

Rakuten AI 3.0 is expected to be integrated into Rakuten's services.

Research #llm · 🏛️ Official · Analyzed: Jan 3, 2026 05:52

Introducing Gemma 3 270M: The compact model for hyper-efficient AI

Published: Oct 23, 2025 18:50
1 min read
DeepMind

Analysis

The article announces Gemma 3 270M, a compact 270-million-parameter language model. It highlights the efficiency that follows from the small size and positions the model as a specialized tool for deployments where resource constraints matter.
Reference

Today, we're adding a new, highly specialized tool to the Gemma 3 toolkit: Gemma 3 270M, a compact, 270-million parameter model.

AI #Video Generation · 👥 Community · Analyzed: Jan 3, 2026 17:07

LTXVideo 13B AI video generation

Published: May 10, 2025 11:59
1 min read
Hacker News

Analysis

The article announces LTXVideo, a 13-billion-parameter AI model for video generation. Only the title and source are available, so a deeper analysis is not possible; the salient points are the model's size (13B) and its function (video generation).
