Photo illustration: the Alibaba Group logo displayed on a smartphone screen. (Thomas Fuller/SOPA Images/LightRocket via Getty Images)
Alibaba stock (NYSE:BABA) rose nearly 13% in Friday trading to roughly $135 per share and is now up close to 60% year-to-date. The rally follows a favorable Q1 earnings report showing rising sales in Alibaba’s cloud business, as well as reports that Alibaba has developed a new AI chip for its cloud computing division. The chip is not intended to compete with Nvidia’s (NASDAQ:NVDA) Hopper series, much less the newer Blackwell lineup, but that isn’t the point. The goal is to ensure Alibaba can secure a supply of AI semiconductors despite U.S. export restrictions, while also strengthening the competitiveness of its cloud business as AI adoption continues to rise.
What We Know About The Chip
Alibaba’s T-Head unit has been developing AI chips for several years. In 2019, it launched the Hanguang 800, which targeted traditional machine learning models. With the new chip, however, Alibaba is moving deeper into the AI hardware race, addressing the large language and diffusion models that underpin today’s generative AI. The chip is designed specifically for inference workloads, the phase in which trained models produce answers and recommendations in real time.
It is expected to be manufactured on a 7-nanometer process, making it significantly more capable and versatile than the earlier Hanguang chip. Early reports indicate it will focus on workloads such as recommendation systems and natural language processing. Notably, the chip is rumored to be compatible with Nvidia’s software ecosystem, which could let developers adapt and reuse existing code rather than rewriting it from scratch.
This development also takes place against a backdrop of geopolitical tension. The U.S. has prohibited exports of leading-edge chips to China, with Nvidia’s H20 restricted earlier this year. Though shipments have since been conditionally permitted, Chinese companies still face uncertainty, and reports suggest Beijing has advised firms against relying on the H20. By building its own chip, Alibaba can reduce its dependence on U.S. suppliers while meeting China’s rapidly growing demand for AI computing capacity.
Why Inference Matters Now
The initial growth phase of AI was driven by the training of large models, a process that made Nvidia the clear winner as its GPUs became the industry standard. The company’s revenues have surged from $27 billion in FY’23 to an anticipated $206 billion in FY’26. Training, however, is a front-loaded, compute-intensive process that may begin to show diminishing returns, with larger models not consistently delivering proportional gains in accuracy. The availability of high-quality training data is also becoming a constraint, as much of what is readily available online has already been consumed by current models. Inference, by contrast, happens continuously in production: it is less demanding per task but scales across millions of users and applications. That is the segment Alibaba is targeting with its new chip. While inference chips do not require the raw power of Nvidia’s top training GPUs, they are still expected to become a crucial component of AI systems.
Alibaba’s Strategy
Unlike Nvidia, Alibaba is unlikely to sell the chip directly to customers. Instead, it will use the hardware to power Alibaba Cloud, letting customers rent computing capacity. This approach should deepen customer dependence on Alibaba’s ecosystem while generating recurring revenue. Owning both the hardware and the cloud platform also enables tighter integration, which could yield efficiency gains. Alibaba is backing the strategy with capital, committing 380 billion yuan (roughly $53 billion) to AI infrastructure over the next three years. The company has strong motivation to do so: its cloud division grew 26% year-over-year last quarter, and AI-related revenue has posted triple-digit growth for eight consecutive quarters.
In short, the new chips are likely to complement Nvidia’s GPUs in Alibaba’s broader AI strategy. The company will probably continue to rely on Nvidia hardware for training AI models in the near term, while its own chips power cloud-based inference at scale. Other Chinese companies are also stepping up their AI chip efforts: Baidu, Huawei, and startups such as Cambricon are all developing AI semiconductors. However, Alibaba’s established position in cloud computing gives it a distribution advantage; it can quickly deploy new chips across its extensive data centers and monetize them through its existing customer base.
The Trefis High Quality (HQ) Portfolio, a collection of 30 stocks, has a track record of reliably outperforming a benchmark that combines the S&P 500, Russell, and S&P MidCap indices. What accounts for this? As a group, HQ Portfolio stocks have delivered better returns with less risk than the benchmark, producing a smoother ride, as illustrated in HQ Portfolio performance metrics.