The impact on HPC chips, by contrast, will be far more pronounced. The expanded ban now covers the A800, H800, and L40S series, which is expected to curb demand for NVIDIA's high-end AI servers from Chinese tech giants such as ByteDance, Baidu, Alibaba, and Tencent. Their share of the global AI server market, previously around 5-6%, is projected to shrink to 3-4%.
In the short term, Chinese CSPs are expected to rush to stockpile AI servers while supply remains available, competing for NVIDIA's scarce H800 resources to serve their customers in China.
Over the long term, the succession of US AI chip bans imposed since 2022 has pushed Chinese companies to accelerate the development of their own independent chips. Alibaba's Pingtouge has entered the ASIC arena, while Huawei is investing in its Ascend series to build a local ecosystem in China. In addition, Chinese chipmakers such as Pingtouge and Hanergy, which focus on smaller models and inference chips for edge AI, are expected to step up their development efforts.
AI chip suppliers such as NVIDIA and AMD, for their part, will need to adapt to the new US regulations by developing compliant products. This means offering more diverse solutions to navigate global geopolitics, such as expanding their product lines with more modest TFLOPS performance or larger die sizes, in order to meet the requirements without sacrificing market reach.
The fresh sanctions may also reshape the strategies of China's main buyers, such as BBAT and academic institutions, pushing them to look outside China for AI training resources, for example by renting AI training capacity in other regions. This could accelerate work on large language models and support training, fine-tuning, and AI inference for smaller models back in China.
Such a shift could also prompt NVIDIA to promote its DGX Cloud subscription and leasing model more aggressively. Its A100 and H100 AI servers could attract Chinese customers, and beyond China, the model could serve as a more versatile solution for customers in other regions facing their own geopolitical constraints.
For more information on reports and market data from TrendForce's Department of Semiconductor Research, please click here or contact Ms. Latte Chung by email. For the latest tech industry news, trends, and forecasts from TrendForce analysts, please visit the TrendForce website.