Research #llm · 📝 Blog · Analyzed: Dec 29, 2025 02:06

Rakuten Announces Japanese LLM 'Rakuten AI 3.0' with 700 Billion Parameters, Plans Service Deployment

Published: Dec 26, 2025 23:00
1 min read
ITmedia AI+

Analysis

Rakuten has unveiled Rakuten AI 3.0, a Japanese-focused large language model with 700 billion parameters. The model uses a Mixture of Experts (MoE) architecture to balance performance against computational efficiency, and it achieved high scores on the Japanese version of MT-Bench. Rakuten plans to integrate the LLM into its services with support from GENIAC, and intends to release it as an open-weight model next spring, signaling both a commitment to broader accessibility and a sustained investment in AI across its own ecosystem.
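The article does not describe Rakuten AI 3.0's internals beyond naming MoE, but the efficiency trade-off it mentions comes from top-k gating: each token is routed to only a few of the experts, so per-token compute scales with the number of active experts rather than the full parameter count. A generic sketch in plain Python (all names, shapes, and the choice of k are illustrative, not Rakuten's design):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of gate logits
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, k=2):
    """Route input x to the top-k experts and mix their outputs.

    experts      : list of callables, each mapping a vector to a vector
    gate_weights : one gate vector per expert; its dot product with x
                   is that expert's routing logit
    """
    logits = [sum(xi * wi for xi, wi in zip(x, w)) for w in gate_weights]
    probs = softmax(logits)
    # only the k highest-scoring experts are evaluated, which is the
    # source of MoE's compute savings: cost ~ k/len(experts) of a dense layer
    topk = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in topk)
    out = [0.0] * len(x)
    for i in topk:
        y = experts[i](x)          # run the selected expert
        w = probs[i] / norm        # renormalize gate weights over top-k
        out = [o + w * yi for o, yi in zip(out, y)]
    return out, topk
```

Production MoE layers add load-balancing losses and capacity limits so tokens spread evenly across experts, but the routing idea is the same.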
Reference

Rakuten AI 3.0 is expected to be integrated into Rakuten's services.

Research #LLM · 👥 Community · Analyzed: Jan 10, 2026 15:45

LWM: Open-Source LLM Boasts 1 Million Token Context Window

Published: Feb 16, 2024 15:54
1 min read
Hacker News

Analysis

The announcement of LWM, an open-source LLM, signals a significant advancement in accessible AI. The substantial 1 million token context window could enable complex reasoning and generation tasks previously unavailable in open-source models.
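To see why a 1 million token window is demanding, a back-of-envelope KV-cache estimate helps. The dimensions below are hypothetical (a generic 7B-class layout, not LWM's published configuration):

```python
def kv_cache_bytes(tokens, layers, kv_heads, head_dim, bytes_per_val=2):
    """Memory for cached keys and values during generation:
    2 tensors (K and V) per layer, each of shape
    (tokens, kv_heads, head_dim), at bytes_per_val per element."""
    return 2 * tokens * layers * kv_heads * head_dim * bytes_per_val

# Assumed dimensions: 32 layers, 8 KV heads of dim 128, fp16 values,
# 1M cached tokens. These are illustrative, not LWM's actual config.
cache_gib = kv_cache_bytes(1_000_000, layers=32, kv_heads=8, head_dim=128) / 2**30
```

Under these assumptions the cache alone is roughly 122 GiB in fp16, which is one reason long-context work tends to pair large windows with techniques such as grouped-query attention, cache quantization, or blockwise/ring attention.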
Reference

LWM is an open LLM.