PLaMo 3 Support Merged into llama.cpp
Published: Dec 28, 2025 18:55 · 1 min read · r/LocalLLaMA
Analysis
Support for the PLaMo 3 model has been merged into the llama.cpp framework. PLaMo 3 is a 31B-parameter model developed by Preferred Networks, Inc. in collaboration with Japan's National Institute of Information and Communications Technology (NICT), pre-trained on English and Japanese datasets. The model uses a hybrid architecture that combines Sliding Window Attention (SWA) layers with traditional full-attention layers. With this merge, the model can be converted to llama.cpp's GGUF format, quantized, and run locally, which benefits researchers and developers interested in multilingual and efficient large language models. The source is a Reddit post on r/LocalLLaMA, reflecting community-driven development and dissemination of this news.
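For readers who want to try the model locally, a minimal sketch using the llama-cpp-python bindings (a separate project that wraps llama.cpp) might look like the following. The GGUF file name is a placeholder, and you would first need a converted or quantized build of PLaMo 3 on disk; the prompt is arbitrary.

```python
from llama_cpp import Llama

# Load a local GGUF build of the model. The file name below is hypothetical;
# point model_path at whatever PLaMo 3 conversion/quantization you have.
llm = Llama(
    model_path="./plamo-3-31b.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,  # context window for this session
)

# PLaMo 3 NICT 31B Base is a base (non-chat) model, so plain text
# continuation is the natural way to prompt it.
out = llm("The capital of Japan is", max_tokens=32)
print(out["choices"][0]["text"])
```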
Key Takeaways
- PLaMo 3 model support has been added to llama.cpp.
- PLaMo 3 is a 31B parameter model trained on English and Japanese.
- The model uses a hybrid architecture with SWA and traditional attention (illustrated in the sketch below).
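To make the hybrid attention design concrete, here is a small NumPy sketch contrasting a full causal attention mask with a sliding-window causal mask. The window size is arbitrary for illustration; the source does not state PLaMo 3's actual window size or which layers use SWA.

```python
import numpy as np

def causal_mask(n: int) -> np.ndarray:
    # Standard causal mask: token i may attend to every token j <= i.
    return np.tril(np.ones((n, n), dtype=bool))

def sliding_window_mask(n: int, window: int) -> np.ndarray:
    # SWA: token i attends only to tokens j with i - window < j <= i,
    # so per-layer attention cost is O(n * window) instead of O(n^2).
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    return (j <= i) & (j > i - window)

n, window = 8, 3  # illustrative sizes only
print(causal_mask(n).astype(int))
print(sliding_window_mask(n, window).astype(int))
```

Mixing SWA layers with occasional full-attention layers is a common way to keep long-range access while cutting attention compute and KV-cache memory for most layers.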
Reference
“PLaMo 3 NICT 31B Base is a 31B model pre-trained on English and Japanese datasets, developed by Preferred Networks, Inc. in collaboration with the National Institute of Information and Communications Technology (NICT).”