NVIDIA’s near-monopoly in AI GPUs has prompted a range of chipmakers to accelerate the development and production of Application-Specific Integrated Circuits (ASICs), setting up a contest between NVIDIA’s GPUs and the “ASIC army.” Notably, TSMC, which serves both camps, has raised prices for its 2nm and advanced packaging technologies, further lifting revenue and keeping its growth on track despite tariffs and exchange-rate fluctuations. The market expects TSMC to keep posting record results in 2025 and 2026.

NVIDIA CEO Jensen Huang said a year ago that the company would enter the ASIC battlefield, but he has since reversed course, advocating “cooperation over competition” and calling on the industry to grow the AI market together.
Supply chain professionals say long-term AI demand remains strong, with shipments of both AI GPUs and ASICs expected to grow over the next three years. Overall ASIC shipments are conservatively estimated to grow at least 50% annually, with the two industry leaders, NVIDIA and Broadcom, expected to post significant earnings growth, while TSMC, which holds large orders from both companies, stands to reap substantial profits.
The emergence of generative AI has accelerated the arrival of the AI era, driving a surge in demand for AI servers. NVIDIA, pairing its powerful CUDA software stack with its GPUs, has seized the leading position in the AI server supply chain, with companies across the industry vying to work with it.
According to supply chain sources, NVIDIA’s AI GPUs are widely used for training and inference across AI models, yet supply has lagged demand ever since the 2023 demand explosion. High prices and the closed CUDA software ecosystem have pushed many cloud service providers (CSPs) and chipmakers, disadvantaged on both supply and pricing power, to accelerate development of their own ASICs to mitigate risk, with the ultimate goal of “de-NVIDIA-ization.”
Supply chain insiders further point out that the ASIC camp’s common goal is to reduce dependence on NVIDIA’s expensive GPUs and on CUDA. NVIDIA’s H100 and the newer B200 chips are priced at tens of thousands of dollars each, and complete server systems run into the millions. Although the upfront development cost of an in-house ASIC is high, costs fall significantly once deployment reaches scale.
Major NVIDIA customers such as Google, AWS, Meta, and Microsoft are therefore not only buying NVIDIA’s GPUs but also building up their own chip capabilities, and they are placing orders directly with foundries to avoid capacity bottlenecks.
Meanwhile, Broadcom, regarded as NVIDIA’s top competitor in the ASIC space, has also surprised the market with its earnings growth.
Supply chain sources note that a year ago Huang signaled plans to set up an ASIC unit to broaden the customer base, but the strategy has since shifted dramatically with the announcement of NVLink Fusion, which lets companies across industries tap the large partner ecosystem built around NVLink to create semi-custom AI infrastructure.
The ecosystem includes MediaTek, Marvell, Alchip, Astera Labs, Synopsys, Cadence, Fujitsu, and Qualcomm, but Broadcom is notably absent from the list, fueling interest in how the competitive and cooperative relationship between NVIDIA and Broadcom will evolve.
As for the competition between ASICs and AI GPUs, TSMC Chairman C.C. Wei has weighed in as well.

Supply chain professionals say Samsung Electronics and Intel have fallen far behind TSMC at nodes of 7nm and below. From 7nm through the 2nm node entering mass production in the second half of this year, TSMC has secured nearly all major orders, with AI customers uniformly choosing TSMC for the complete manufacturing advantage it offers with CoWoS and other advanced packaging technologies.
TSMC is confident enough to forecast that its AI accelerator revenue will grow at a compound annual growth rate (CAGR) of nearly 45% over the five years starting from 2024. Supply chain estimates put TSMC’s total CoWoS capacity for 2025 at a conservative 650,000 wafers or so, with NVIDIA taking more than half at 370,000 wafers. ASIC customers are also expected to grow significantly: CoWoS capacity is projected to reach 730,000 wafers in 2026, and ASIC chip shipments are expected to grow more than 50% annually from 2025 to 2027.
Among these, ASIC chip capacity for Meta, xAI, and others will gradually come online, and the market anticipates a significant increase in these companies’ ASIC server volumes in 2026. Together with AWS, Google, and Microsoft, competition among AI server platforms is set to heat up, creating more collaboration opportunities across the upstream and downstream supply chain.
Source: DIGITIMES