Shanghai AI Lab open-sources trillion-parameter science AI model
The Shanghai Artificial Intelligence Laboratory (Shanghai AI Lab) open-sourced Intern-S1-Pro, a one-trillion-parameter scientific AI model, on Feb 4. The research institution describes it as the largest model of its kind in the global open-source community.
Intern-S1-Pro adopts a Mixture-of-Experts architecture: the system maintains 512 specialized "experts" covering various fields, and for each scientific problem it activates only the eight most suitable ones, representing about 22 billion parameters, to jointly handle complex mathematical and logical reasoning. This is how a trillion-parameter model keeps its per-query compute cost far below its total size.
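The routing step described above can be sketched in a few lines. This is a generic top-k MoE gate (score every expert, keep the best k, softmax their scores into mixing weights); the actual router used in Intern-S1-Pro has not been published, so the function below is an illustration of the technique, not the lab's implementation.

```python
import math
import random

def top_k_routing(scores, k=8):
    """Pick the k highest-scoring experts for one token and softmax
    their scores into gate weights. Generic top-k MoE routing sketch;
    Intern-S1-Pro's real router details are an assumption here."""
    # Indices of the k experts with the largest router logits.
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    # Numerically stable softmax over just the selected experts.
    m = max(scores[i] for i in top)
    exps = [math.exp(scores[i] - m) for i in top]
    total = sum(exps)
    gates = [e / total for e in exps]
    return top, gates

# Illustrative numbers from the article: 512 experts, 8 active per token.
NUM_EXPERTS, TOP_K = 512, 8
random.seed(0)
scores = [random.gauss(0.0, 1.0) for _ in range(NUM_EXPERTS)]  # router logits for one token
experts, gates = top_k_routing(scores, TOP_K)
```

Each token's output would then be the gate-weighted sum of the eight selected experts' outputs, which is why only ~22 billion of the trillion parameters are exercised per query.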
According to the Shanghai AI Lab, Intern-S1-Pro also demonstrated problem-solving capabilities at the competition level of the International Mathematical Olympiad and the International Physics Olympiad.
The Shanghai AI Lab also introduced "Fourier Position Encoding (FoPE)" and restructured the model's temporal encoder, unifying how it interprets signals from microscopic to macroscopic scales.
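The article does not describe how FoPE works internally, but Fourier-style position encodings generally map a scalar position to sine/cosine features at geometrically spaced frequencies, so that positions across very different scales become distinguishable to the model. The sketch below shows that classic construction; it is a generic illustration, not Intern-S1-Pro's published method.

```python
import math

def fourier_features(position, num_freqs=4, base=10000.0):
    """Encode a scalar position as sin/cos features at geometrically
    spaced frequencies (classic Fourier-style position encoding).
    Generic illustration; Intern-S1-Pro's FoPE specifics are not
    described in the article."""
    feats = []
    for k in range(num_freqs):
        freq = 1.0 / (base ** (k / num_freqs))  # high to low frequency
        feats.append(math.sin(position * freq))
        feats.append(math.cos(position * freq))
    return feats

vec = fourier_features(3.0)  # 2 * num_freqs features for one position
```

Because the frequencies span several orders of magnitude, the same encoding can separate nearby positions (via the high-frequency terms) and far-apart positions (via the low-frequency terms), which is the usual motivation for applying it to signals at mixed scales.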
As its understanding and reasoning abilities continue to improve, Intern-S1-Pro is expected to see broader application in real-world scientific research.
Beyond algorithmic advances, the model is designed for end-to-end integration with domestically developed computing hardware.
Looking ahead, the Shanghai AI Lab plans to further advance full-stack open source development and free commercial use of AI models, collaborating with global academic and industrial communities to build a more open, efficient, and future-oriented scientific AI ecosystem.
Source: Shanghai Observer