An advantage in computing power no longer automatically translates into an advantage in efficiency. 算力优势,不再自动等同于效率优势。

Hu Jiaqi (胡嘉琦), “Nvidia’s Next Battle: Not Chips, but the Computing Power Ecosystem” (英伟达的下一战:不是芯片,是算力体系), Business Management Review (商学院), April 19, 2026

The AI industry is beginning to talk about computing power as a “system.” From chip makers to analysts, the discussion is moving beyond GPUs, model training, and individual performance metrics toward infrastructure, energy, networking, scheduling, and full-stack coordination. This is a model of organization that China has spent the past decade constructing and has elevated to the level of national strategy. The AI industry, in other words, is discovering what China has already systematized.

This analysis is based on journalist Hu Jiaqi’s recent article in Business Management Review, “Nvidia’s Next Battle: Not Chips, but the Computing Power Ecosystem.” Business Management Review is supervised by the Chinese Academy of Social Sciences, hosted by the CASS Institute of Industrial Economics, and published by China Business News. Hu’s article examines NVIDIA’s evolving approach to AI chips and computing power, arguing that competition is shifting from single products and single technology stacks toward system-level infrastructure capability. Although the article never mentions Digital China, the implications and parallels are unmistakable.

AI is no longer a market for point solutions, but a systemic competition centered on infrastructure. (Hu Jiaqi, 2026)

According to Hu, NVIDIA CEO Jensen Huang still views GPUs as central, but his focus is shifting toward a unified ecosystem of high-bandwidth memory, hyperscale data centers, optical networks, energy systems, and software platforms. Together, these form what Huang calls “AI factories,” large-scale systems built not merely to store information, but to produce intelligence at industrial scale. In this framing, compute is no longer a standalone asset. It is part of an integrated system that converts energy, hardware, data movement, software, and applications into usable intelligence.

This is the deeper significance of Hu’s article. AI is no longer a market for point solutions. Chips, networks, storage, cooling, data centers, and software are being reorganized into holistic industrial capabilities. The key question is no longer only who has more computing power, but who can define the infrastructure system through which computing power becomes efficient, scalable, schedulable, and economically sustainable.

This is where China has already been operating. Under the framework of Digital China, computing power is treated as a strategic resource and a subcategory of New Type Infrastructure. The National Unified Computing Power Network is designed to coordinate compute across regions, sectors, and use cases. The emphasis is not on maximizing one component in isolation, but on achieving efficiency through integration, scheduling, and system-level coordination.

The key to this competition is no longer “who is stronger,” but “who can more efficiently define infrastructure.” (Hu Jiaqi, 2026)

From this perspective, NVIDIA’s evolution aligns with China’s approach, even though it emerges from a different institutional logic. Huang’s “five-layer” architecture — energy, chips, infrastructure, models, and applications — closely mirrors the system-of-systems logic that underpins the Digital China stack. Both frameworks treat compute not as an isolated input, but as part of a layered infrastructure system that links energy, hardware, networks, software, data, models, and applications.

The important point is not that NVIDIA copied China. There is no evidence for that claim. Rather, both sides are responding to the same underlying constraints of large-scale AI systems. Huang is approaching the problem through engineering, product architecture, and industrial optimization. Beijing is approaching it through state planning, infrastructure policy, and system governance. But both are converging on the same conclusion: compute must be organized as infrastructure.

That convergence matters. The emerging competition over computing power is not only technological or commercial. It is structural. How infrastructure is defined affects cost, standards, energy use, ecosystem control, and the baseline on which other actors must operate. In the “AI factory” era, advantage will not belong simply to whoever has the strongest chips or the largest compute clusters. It will belong to those who can most effectively integrate energy, compute, networks, software, and applications into a scalable system for producing intelligence.

I use AI tools to support my editing, research, and translation process. Learn more on my AI Transparency Page.