Nvidia Corp’s GB300 platform to lead AI servers this year
2026/01/20 03:00
Visitors listen to an introduction of the Nvidia GB300 NVL72 AI server during Hon Hai Tech Day at the Nangang Exhibition Center in Taipei on Nov. 21, 2025. Photo: AP
By Meryl Kao / Staff reporter
Nvidia Corp’s GB300 platform is expected to account for 70 to 80 percent of global artificial intelligence (AI) server rack shipments this year, while adoption of its next-generation Vera Rubin 200 platform is expected to gradually gain momentum after the third quarter, TrendForce Corp (集邦科技) said.
Servers based on Nvidia’s GB300 chips entered mass production last quarter and are expected to become the mainstay models for Taiwanese server manufacturers this year, TrendForce analyst Frank Kung (龔明德) said in an interview.
This year is expected to be a breakout year for AI servers based on a variety of chips, as shipments of graphics processing unit (GPU)-based rack systems — including Nvidia’s GB300 and Vera Rubin 200 platforms, as well as Advanced Micro Devices Inc’s MI400 — are set to accelerate, while cloud service providers (CSPs), such as Google Inc, Amazon Web Services and Meta Platforms Inc, are expected to step up use of application-specific integrated circuit (ASIC)-based AI infrastructure, he said.
As development of GPU and ASIC-based servers accelerates this year, the threshold for system integration among local server manufacturers is rising, Kung said.
Hon Hai Precision Industry Co (鴻海精密), Quanta Computer Inc (廣達) and Wistron Corp (緯創) are among the Taiwanese server system integrators supplying overseas tech giants.
The challenge is becoming more pronounced as major CSPs continue to develop ASICs with increasingly advanced and customized specifications, a trend that further raises technical barriers to entry for manufacturers, he said.
The GB300 is part of Nvidia’s Blackwell GPU series, offering modest upgrades over the GB200 in connectors, substrates and thermal components, while the Vera Rubin 200 platform features a significant increase in GPU power consumption, analysts said.
The high power consumption of AI servers based on Nvidia chips, coupled with CSPs’ continued scaling up of AI data centers, is driving a growing need for liquid cooling solutions this year, TrendForce analyst Fiona Chiu (邱珮雯) said.
Heat dissipation solutions in the market remain concentrated on liquid-to-air designs, which serve as a transitional stage between traditional air cooling and full liquid cooling, and such systems are still expected to dominate AI infrastructure deployments this year, she said.
Full liquid-to-liquid cooling solutions are expected to become more prominent next year as data center power density continues to rise, she added.
The developments would benefit Taiwanese liquid-cooling solution providers, with heat dissipation specialists such as Auras Technology Co (雙鴻科技) and Asia Vital Components Co (奇鋐科技) poised to strengthen their positions in the market, Chiu said.
Power supply solution providers such as Delta Electronics Inc (台達電) and Lite-On Technology Corp (光寶科技) are also expected to gain market share with their power supply and infrastructure-related products, she added.
Source: Taipei Times
