On March 12, 2026, global chip giant NVIDIA sent shockwaves through the industry. According to Wired, NVIDIA officially announced that it will invest up to 26 billion US dollars (about 178.79 billion Chinese yuan) over the next five years to develop open-source large AI models. The move marks the most profound strategic transformation in NVIDIA's history: an evolution from a pure hardware supplier into a frontier artificial intelligence laboratory capable of competing directly with OpenAI and DeepSeek.

Going on the Offensive: Building the Strongest Model with 8 Times GPT-4's Budget

The 26-billion-dollar investment will cover model development, computing infrastructure, and top-tier talent. By comparison, OpenAI reportedly spent about 3 billion dollars training GPT-4, making NVIDIA's commitment more than 8 times that amount.

With absolute control over core computing resources, NVIDIA is quietly assembling a dream team of researchers. Reports indicate that the company has already completed pre-training of an ultra-large model with 550 billion parameters. NVIDIA emphasized that building these models is not only a test of raw computing power but also an "extreme stress test" of storage, networking, and the rest of its supercomputing-scale infrastructure.

The Middle Road: Promoting "Open Weights" to Address Enterprise Customization Pain Points

In terms of technical approach, NVIDIA has chosen a highly strategic "middle road." Unlike OpenAI's fully closed-source or Meta's fully open-source approaches, NVIDIA focuses on "open weights":

  • High transparency: The model weights are published, so enterprises can download them and run the models on their own infrastructure.

  • Deep optimization: The models are tuned at the lowest levels for NVIDIA's own hardware, delivering significantly higher performance than generic models.

At a time when large enterprises urgently need high transparency and customizable models, this strategy precisely addresses industry pain points and is expected to build a formidable technological moat.

Business Ambition: Targeting 50 billion dollars in additional revenue within three years

Financial analysts predict that NVIDIA's move is not just about burning money. If NVIDIA can secure a 10% share in the foundational model market while maintaining its dominance in hardware, it could generate an additional 50 billion dollars in revenue annually within three years.

The current industry landscape is in a delicate balance: leading U.S. closed-source models mostly offer cloud access only, while Chinese companies like DeepSeek and Alibaba are sweeping the global development community with open-source strategies. NVIDIA's entry into the open-source ecosystem is both a defense of core interests and an effort to define future standards.

Release Timeline: First models to be launched by the end of 2026

Reportedly, the funding will be deployed in stages over the next 18 to 24 months. NVIDIA expects its first industry-leading open-source large models to be officially released in late 2026 or early 2027.

When the king of hardware starts building models itself, the second half of the global AI race is no longer just about computing power: it becomes an all-out battle of ecosystems, data, and model depth. NVIDIA's 26 billion dollars may completely rewrite the rules of the large-model market.