What you need to know

- The industry shift prioritizes the AI experience over hardware, changing how consumers engage with their tech.
- GenAI enables real-time capabilities like image editing and personalized content creation on premium smartphones.
- Future integration across devices promises seamless AI utility for consumers.
AI is everywhere, and GenAI has been reshaping how tech companies think about their products. Samsung, for example, just announced a partnership with Perplexity on its TVs, which lets users look up shows and movies, and even plan trips with the AI, right from their couch.
Companies like Google and Motorola have been marketing the AI experience on their new phones more than the devices themselves, and Nothing is putting its $200M in funding toward creating its own AI-native devices. Slowly but steadily, we're seeing a shift in the industry: it's no longer about the hardware; we've entered an era where the experience with the device matters more than its looks.
Imagine a device that can communicate with a bunch of apps on your phone through a single voice command, helping you reserve a table at your favorite restaurant, send invites to the guests, and block off your calendar, all in one go. That would essentially eliminate the need to open multiple apps, significantly reducing the time spent on these tasks.
(Image credit: Nicholas Sutrich / Android Central)
In the near future, the phone could serve as a central hub, functioning as a remote that communicates with other AI-integrated smart devices. According to a recent report by IDC, the shift began only two years ago, when GenAI started appearing in premium smartphones with capabilities like real-time image editing, personalized content creation, and advanced voice interactions, often with on-device, low-latency processing. GenAI brought AI to the forefront, but its reach was limited.
IDC further notes that within a few years, users will be able to interact with the same personalized AI (be it Gemini, Perplexity, or another) across smart glasses, wearables, and ambient devices. That would essentially let GenAI pick the device best suited to the task at hand, so you can move seamlessly from your phone to your headphones or your car's dashboard.
(Image credit: Brady Snyder / Android Central)
Devices like AI-powered smart glasses are also contributing to this shift, offering hands-free, real-time assistance, since they can essentially see and hear everything the user does. The new Meta Ray-Ban Display glasses, for example, let you do pretty much anything while keeping your phone tucked away; from checking messages to navigating through your day, you can do it all with a glance at the in-lens display.
This is also what other companies like Google and Samsung are looking to bring to the table. IDC further notes that “consumers want features that solve problems, save time, and delight. For device vendors, the ones who can sell the AI experience, translating AI into everyday user value, will be the ones that stand out.”