Rakuten Announces Japanese LLM 'Rakuten AI 3.0' with 700 Billion Parameters, Plans Service Deployment
Analysis
Rakuten has unveiled Rakuten AI 3.0, a Japanese-focused large language model with 700 billion parameters. The model uses a Mixture of Experts (MoE) architecture, which activates only a subset of its parameters for each input, balancing performance against computational cost. It achieved high scores on the Japanese version of MT-Bench. Rakuten plans to integrate the LLM into its services with support from the GENIAC program. The company also intends to release it as an open-weight model next spring, signaling a commitment to broader accessibility and potential community contribution. The move underscores Rakuten's continued investment in AI and its application across the company's ecosystem.
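Rakuten has not published implementation details beyond the architecture name, but the efficiency argument behind MoE can be illustrated with a minimal sketch: a router scores each token against a pool of expert feed-forward networks, and only the top-scoring few are evaluated, so compute per token scales with the active experts rather than the full parameter count. All names, dimensions, and the top-2 routing choice below are illustrative assumptions, not details of Rakuten AI 3.0.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def moe_layer(tokens, gate_w, experts, top_k=2):
    """Route each token to its top_k experts and combine their outputs.

    tokens:  (n_tokens, d_model) input activations
    gate_w:  (d_model, n_experts) router weights
    experts: list of (w_in, w_out) weight pairs, one FFN per expert
    """
    logits = tokens @ gate_w                       # (n_tokens, n_experts)
    probs = softmax(logits)
    top_idx = np.argsort(probs, axis=-1)[:, -top_k:]  # top_k experts per token
    out = np.zeros_like(tokens)
    for t, token in enumerate(tokens):
        chosen = top_idx[t]
        weights = probs[t, chosen]
        weights = weights / weights.sum()          # renormalize over chosen experts
        for e, w in zip(chosen, weights):
            w_in, w_out = experts[e]
            hidden = np.maximum(token @ w_in, 0)   # expert FFN with ReLU
            out[t] += w * (hidden @ w_out)
    return out

# Toy usage: 4 experts with only 2 active per token, so roughly half the
# expert parameters are exercised for any given input.
rng = np.random.default_rng(0)
d_model, d_ff, n_experts = 8, 16, 4
tokens = rng.normal(size=(3, d_model))
gate_w = rng.normal(size=(d_model, n_experts))
experts = [(rng.normal(size=(d_model, d_ff)),
            rng.normal(size=(d_ff, d_model))) for _ in range(n_experts)]
print(moe_layer(tokens, gate_w, experts).shape)    # (3, 8)
```

This is why an MoE model's headline parameter count (here, 700 billion) overstates its per-token inference cost: a dense model of the same size would run every parameter on every token, while the router restricts work to the selected experts.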
Key Takeaways
- Rakuten has developed a Japanese-focused LLM with 700 billion parameters.
- The model uses a Mixture of Experts (MoE) architecture for efficiency.
- Rakuten plans to deploy the LLM in its services and release it as an open-weight model.
“Rakuten AI 3.0 is expected to be integrated into Rakuten's services.”