The Beijing-based AI and internet search giant’s upgraded platform delivers a speedier network connection, increased computing power, and enhanced model training and inferencing capabilities, according to executive vice-president Shen Dou, who also serves as president of Baidu AI Cloud Group, speaking at a corporate event held in China’s capital on Thursday.

Baige’s inferencing system – backed by adaptive and smart resource allocation technologies, which speed up data throughput and reduce communications latency – improved the inferencing efficiency of DeepSeek’s R1 reasoning model by around 50 per cent, according to Shen. Inferencing is the process by which a trained AI model draws conclusions in response to human queries.
“That means, with the same time and cost … we could have the model ‘think’ 50 per cent more [or] work 50 per cent more,” Shen said.
The launch of Baige 5.0 reflects growing efforts across the mainland’s AI and semiconductor sectors to develop a domestic AI technology stack and reduce the impact of US trade restrictions on China.

Baidu executive vice-president Shen Dou serves as the president of the company’s AI Cloud Group. Photo: Handout
Shen said the Kunlunxin Super Node, which supports hundreds of interconnected AI chips, had gone live on the Baige 5.0 platform, making it capable of deploying and running a trillion-parameter AI system within minutes. The number of parameters in an AI model is a measure of its size and complexity.