Rakuten Announces Japanese LLM 'Rakuten AI 3.0' with 700 Billion Parameters, Plans Service Deployment

Research | #llm | 📝 Blog | Analyzed: Dec 29, 2025 02:06
Published: Dec 26, 2025 23:00
1 min read
ITmedia AI+

Analysis

Rakuten has unveiled Rakuten AI 3.0, a Japanese-focused large language model with 700 billion parameters. The model uses a Mixture of Experts (MoE) architecture to balance performance with computational efficiency, and it achieved high scores on the Japanese version of MT-Bench. With support from the GENIAC program, Rakuten plans to integrate the model into its services, and it intends to release the model with open weights next spring, signaling a commitment to broader accessibility and community contribution. The announcement underscores Rakuten's continued investment in AI across its service ecosystem.
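The article does not describe Rakuten's routing scheme, so the sketch below illustrates only the general idea behind MoE efficiency: a router sends each token to a small number (top-k) of expert feed-forward networks, so only a fraction of the 700 billion total parameters is active per token. All layer sizes, expert counts, and names here are illustrative assumptions, not details from the announcement.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Minimal top-k Mixture of Experts layer (illustrative sketch only)."""

    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):
        # x: (tokens, d_model)
        scores = self.router(x)                        # (tokens, n_experts)
        weights, idx = torch.topk(scores, self.top_k)  # keep only k experts per token
        weights = F.softmax(weights, dim=-1)           # normalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e               # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(16, 512)   # 16 tokens
layer = MoELayer()
print(layer(x).shape)      # torch.Size([16, 512])
```

With top_k=2 of 8 experts, each token activates roughly a quarter of the expert parameters, which is how MoE models keep per-token compute well below what their total parameter count would suggest.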
Reference / Citation
"Rakuten AI 3.0 is expected to be integrated into Rakuten's services."
ITmedia AI+, Dec 26, 2025 23:00
* Cited for critical analysis under Article 32 of the Japanese Copyright Act.