The 2025 Guangming Science City Forum - Greater Bay Area Intelligent Computing and Large Model Agent Forum was held in Guangming District, Shenzhen. Institutions such as the Pengcheng Laboratory, Panyu Laboratory, and Industrial and Commercial Bank of China jointly announced four major achievements:
- Pengcheng Brain 2.1, an open-source 488B multi-modal model, was released along with a 2TB cleaned dataset and a full-process toolchain
- The domestic FenixCOS multi-card inference engine made its debut, supporting 4096-card parallelism with a card-switching latency under 3 seconds and a 42% improvement in memory-bandwidth utilization
- The meteorological intelligent agent "Afu" was integrated into Pengcheng CloudBrain III, providing 1km × 1km grid forecasting for the 15th National Games
- Industrial and Commercial Bank of China launched the first domestic full-life-cycle financial large-model toolkit, covering five stages (requirements, development, testing, operations, and retirement) and already deployed in 170 business systems
Pengcheng CloudBrain III Update: 1000 PFLOPS of Total Computing Power by 2026, Connecting to the "China Computing Network"
Professor Gao Wen, Director of Pengcheng Laboratory, revealed via video that CloudBrain III has completed its second-phase expansion of 400 PFLOPS, and that a third-phase expansion of 600 PFLOPS will launch in Q2 2026, placing it among the top three scientific computing facilities worldwide. The facility has also established 100G dedicated-line interconnections with 12 institutions, including the Wuxi Supercomputing Center and the National Meteorological Information Center, pushing the backbone bandwidth of the "China Computing Network" past 3.2 Tbps.
Government-Industry-Academia Collaboration: AI Industry Scale in Guangming Aims for 100 Billion Yuan
During the forum, Pengcheng Laboratory signed a four-party cooperation agreement with the Shenzhen Meteorological Bureau, Panyu Laboratory (Huangpu), and the National Supercomputing Wuxi Center, under which the parties will comprehensively share data, computing power, models, and talent. Wang Fangcheng, a member of the Guangming District Committee, revealed that nearly 100 AI companies have gathered in the district: the industry's scale exceeded 30 billion yuan in 2025, with a goal of surpassing 100 billion yuan by 2027 and forming a complete "computing power-model-application" cluster.
Academic Frontiers: Tsinghua University and the University of Hong Kong Share Latest Advances in Multi-Modal Intelligent Agents
Professor Tang Jie from Tsinghua University released "A Training and Evaluation Framework for Multi-Modal Intelligent Agents," proposing a "task-tool-memory" triad architecture that improved the average success rate by 18% on a 100-task test pool. Associate Professor Luo Ping from the University of Hong Kong presented "VideoAgent," which enables long-video cross-modal retrieval and question answering and achieves SOTA on Recall-2000, a 2000-hour open-source video dataset.
AIbase Observation
From open-source models to multi-card inference engines to meteorological and financial vertical agents, this forum delivered a concentrated set of commercialization signals for "domestic computing power + large models." With Pengcheng CloudBrain III entering its final expansion phase and Guangming District's 100-billion-yuan industry plan taking shape, the Greater Bay Area is poised to become the second growth pole of domestic AI infrastructure after the Yangtze River Delta. Developers should watch for the upcoming open-sourcing of FenixCOS and the Pengcheng Brain 2.1 weights to get early hands-on experience with thousand-card inference and 488B multi-modal capabilities.
