September 10, 2025
In a recent announcement, Arm released its new Lumex advanced compute subsystem (CSS) platform, which the company says is designed for AI experiences on flagship smartphones and next-gen PCs.
In the blog post announcing the new platform, Chris Bergey, SVP and GM of the Client Line of Business at Arm, wrote that Lumex unites the company's highest-performing CPUs with Scalable Matrix Extension version 2 (SME2), GPUs, and system IP to enable the ecosystem to bring AI devices to market faster and to deliver experiences ranging from desktop-class mobile gaming to real-time translation, smarter assistants, and personalized applications.
He also predicted that, by 2030, SME and SME2 will add over 10 billion TOPS of compute across more than 3 billion devices, delivering an exponential leap in on-device AI capability.
According to the release, Arm partners can choose exactly how they build Lumex into their SoCs – they can take the platform as delivered and leverage cutting-edge physical implementations tailored to their needs, reaping time-to-market and time-to-performance benefits. Alternatively, partners can configure the platform RTL for their targeted tiers and harden the cores themselves.
The platform combines:
Next-generation SME2-enabled Armv9.3 CPU cluster including C1-Ultra and C1-Pro, powering flagship devices
New C1-Premium, purpose-built for the sub-flagship market, providing best-in-class area efficiency
New Mali G1-Ultra GPU with next-generation ray tracing enabling advanced graphics and gaming, plus a boost to AI performance
The most flexible and power-aware DynamIQ Shared Unit (DSU) Arm has delivered to date: C1-DSU
Optimized physical implementations for 3nm nodes
Deep integration across the software stack delivering seamless AI acceleration for developers using KleidiAI libraries
Accelerated AI Everywhere with SME2-Enabled CPUs
The SME2-enabled Arm C1 CPU cluster reportedly provides dramatic AI performance gains for real-world, AI-driven tasks:
Up to 5x uplift in AI performance
4.7x lower latency for speech-based workloads
2.8x faster audio generation
This leap in CPU AI compute enables real-time, on-device AI inference capabilities, providing users with smoother, faster experiences across interactions like audio generation, computer vision, and contextual assistants, Arm said.
For developers, AI experiences will just work on the Lumex platform. Through KleidiAI integration across major frameworks including PyTorch ExecuTorch, Google LiteRT, Alibaba MNN, and Microsoft ONNX Runtime, apps automatically benefit from SME2 acceleration with no code changes required.
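To make the "no code changes" point concrete, here is a minimal sketch of ordinary ONNX Runtime inference in Python. Nothing in it is Lumex-specific: on an SME2-capable device, any acceleration would come from the framework's Arm-optimized kernels (via KleidiAI, per Arm's announcement) rather than from the application code. The model path, input name, and input shape are placeholders.

import numpy as np
import onnxruntime as ort

# Standard inference code; the app does not reference SME2 or KleidiAI.
# On Lumex-class hardware, the framework's CPU kernels are expected to
# pick up the acceleration transparently, per Arm's announcement.
session = ort.InferenceSession("model.onnx")  # placeholder model path
input_name = session.get_inputs()[0].name

# Dummy input; replace with real preprocessed data for your model.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: x})
print(outputs[0].shape)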
For developers building cross-platform apps, Lumex is designed to bring new portability, too:
Google apps like Gmail, YouTube and Google Photos are already SME2-ready, ensuring seamless integration as Lumex-based devices hit the market
Cross-platform portability means optimizations built for Android can seamlessly extend to Windows on Arm and other platforms
Partners like Alipay are already showcasing on-device LLMs running efficiently with SME2
Technology leaders – including Apple, Samsung, and MediaTek – are integrating AI acceleration capabilities for faster, more efficient on-device AI. Apple is powering Apple Intelligence; Samsung and MediaTek are improving responsiveness and efficiency of real-time AI applications such as translation, summarization, and personal assistants using Google Gemini.
“SME2-enhanced hardware enables more advanced AI models, like Gemma 3, to run directly on a wide range of devices. As SME2 continues to scale, it will enable mobile developers to seamlessly deploy the next generation of AI features across ecosystems. This will ultimately benefit end-users with low-latency experiences that are widely available on their smartphones,” said Iliyan Malchev, Distinguished Software Engineer, Android at Google.
Check out the full post for all the details, but this is good news for consumer and mobile computing and the engineers looking to get a competitive edge.
Ken Briodagh is a writer and editor with two decades of experience under his belt. He is in love with technology and if he had his druthers, he would beta test everything from shoe phones to flying cars. In previous lives, he’s been a short order cook, telemarketer, medical supply technician, mover of the bodies at a funeral home, pirate, poet, partial alliterist, parent, partner and pretender to various thrones. Most of his exploits are either exaggerated or blatantly false.