The global open-source AI community has reached a major milestone. The Shanghai Artificial Intelligence Laboratory has officially released and open-sourced Intern-S1-Pro, a scientific multimodal large model with 1 trillion parameters. The model is built on SAGE, an innovative "generalist-specialist fusion" architecture; it not only sets a new parameter-scale record for the open-source community but also achieves significant breakthroughs across several core scientific capabilities.

As the largest scientific multimodal model in the open-source field today, Intern-S1-Pro has consistently ranked at the internationally leading level in comprehensive subject evaluations in the AI4S (AI for Science) domain. Its complex mathematical and logical reasoning reaches the level of Olympiad gold medalists, and it performs strongly on agent tasks oriented toward real research workflows, placing it among the top tier of open-source models.
At the technical level, Intern-S1-Pro adopts a Mixture of Experts (MoE) architecture with a total of 1T (one trillion) parameters. Through an efficient routing mechanism, only 22B parameters are activated per call to generate high-quality output. To give the model a "physical intuition" that understands both microscopic signals and macroscopic physical laws, the development team introduced Fourier position encoding and restructured the temporal encoder, overcoming the stability and computational-efficiency challenges of training ultra-large-scale MoE models.
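To make the activation figure concrete, the sketch below shows generic top-k expert routing in plain NumPy. The expert shapes, gating function, and all names here are illustrative assumptions for the general MoE technique, not Intern-S1-Pro's actual router:

```python
import numpy as np

def topk_moe_forward(x, experts, gate_w, k=2):
    """Route one token through only its top-k experts (generic MoE sketch).

    x:        (d,) token representation
    experts:  list of (W, b) linear experts, shapes (d, d) and (d,)
    gate_w:   (num_experts, d) gating weights
    k:        number of experts activated per token
    """
    logits = gate_w @ x                        # score every expert
    topk = np.argsort(logits)[-k:]             # indices of the k best-scoring experts
    weights = np.exp(logits[topk] - logits[topk].max())
    weights /= weights.sum()                   # softmax over the selected experts only

    # Only the k selected expert networks run; every other expert's
    # parameters stay idle, which is how a model with 1T total parameters
    # can activate only a small fraction of them on each call.
    out = np.zeros_like(x)
    for w, idx in zip(weights, topk):
        W, b = experts[idx]
        out += w * (W @ x + b)
    return out

# Toy usage: 8 experts, 2 active per token.
rng = np.random.default_rng(0)
d, n_experts = 16, 8
experts = [(0.1 * rng.standard_normal((d, d)), np.zeros(d)) for _ in range(n_experts)]
gate_w = 0.1 * rng.standard_normal((n_experts, d))
y = topk_moe_forward(rng.standard_normal(d), experts, gate_w, k=2)
print(y.shape)  # (16,)
```

Likewise, "Fourier position encoding" generally means mapping a continuous position or timestamp to sinusoidal features. The sketch below illustrates that general idea under an assumed log-spaced frequency choice; it is not the team's exact formulation:

```python
import numpy as np

def fourier_position_encoding(t, dim=8, max_freq=10.0):
    """Map a continuous position/time t to sin/cos Fourier features.

    Frequencies are log-spaced between 1 and max_freq (an assumed choice);
    mixing slow and fast frequencies lets a network resolve variation in t
    at multiple scales, e.g. both fine-grained signals and long-range trends.
    """
    freqs = np.geomspace(1.0, max_freq, dim // 2)
    angles = 2.0 * np.pi * freqs * t
    return np.concatenate([np.sin(angles), np.cos(angles)])

print(fourier_position_encoding(0.37))  # 8-dimensional feature vector
```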
Notably, Intern-S1-Pro validates a complete technology chain, running from an original model architecture to independent domestic computing infrastructure. The model is now fully open-sourced on platforms such as GitHub and HuggingFace, with an online demo available, laying a solid foundation for building open, shared scientific research infrastructure.
Online Experience Link: https://chat.intern-ai.org.cn/
GitHub Link: https://github.com/InternLM/Intern-S1
HuggingFace Link: https://huggingface.co/internlm/Intern-S1-Pro
Key Points:
🏆 The Largest Open-Source Scale Globally: With 1T (one trillion) parameters, Intern-S1-Pro is currently the largest scientific multimodal model in the global open-source community.
🧬 Top Scientific Reasoning Ability: The model reaches the level of Olympiad gold medalists in mathematical and logical reasoning, and is internationally leading in high-difficulty comprehensive subject evaluations in the AI4S field.
💻 Breakthrough in an Independent Technology Chain: The model is built on the SAGE architecture and domestic computing infrastructure, achieving high computational efficiency and training stability through MoE technology.
