Of course, there was the usual promotional video featuring people whose lives have been saved as a result of the watch, which walks a fine line between feeling genuinely heartfelt and feeling like it’s preying on people’s insecurities around sudden health emergencies to sell watches.

But either way, the Apple Watch really does notice when people have a hard fall or are in a car accident, and it can call emergency services. It can also help prevent emergencies by picking up on heart health concerns or guiding outdoor explorers back to a point they had been to previously.

At the event, Apple said its Watch Series 11 can detect signs of high blood pressure, which could alert people to an increased risk of stroke or heart attack, though the feature will require regulatory approval in each region before it works.

Fitness tracking

Aside from watches, Apple unveiled a new version of the AirPods Pro (which it claims are the world’s most popular headphones) with integrated heart rate sensors. This is something the company introduced earlier this year in a set of fitness-focused Beats headphones, but having them in such a mainstream-friendly product could be a big deal.

The buds have a photoplethysmography sensor similar to the one you’d find in a smartwatch for keeping an eye on a user’s blood flow, plus accelerometers, a gyroscope and GPS, so people without an Apple Watch will get the same kind of workout tracking through the buds. We won’t know whether this comes with any particular limitations until we’ve tried it ourselves, though we do know the heart tracking is active only during workouts, and that the buds have to be connected to an iPhone to do it.

The AirPods Pro 3 have upgraded waterproofing to protect them from sweat or rain (IP57 vs IPX4 on the Pro 2); they introduce a new live translation feature; they have improved noise-cancelling; and they also inherit the health-focused capabilities of the previous Pro buds.

Namely, they can function as clinical-grade hearing aids; they can administer hearing tests; and they can protect ears by lowering loud ambient sounds.

And incidentally, all of this health and fitness stuff is absolutely powered by AI. It’s just not the chatty kind; its use is isolated to specific functions, and it’s backed by a lot of research and development.

Software ecosystem

It’s all well and good for me to show health features that might really help somebody’s quality of life, and contrast them with generative AI chatbots that often do anything but. Yet you may rightly wonder why Apple shouldn’t have both. Can’t it match all the AI tools found on Samsung and Google phones, while also keeping up its health and wearables innovations?


Of course it totally can, but it doesn’t necessarily have to build those tools itself. The iPhone is the de facto general computing platform of our era, and while that could be better reflected in some of Apple’s App Store policies, the big sensations from the likes of ChatGPT, Perplexity and Gemini will all come to the iPhone.

Saying Apple needs to develop its own AI is a little like saying Apple needs to develop its own video games. Why should it need to? It owns the platform that the games are played on. It can capitalise on their popularity by running the platform’s best store, offering subscriptions and services and designing its hardware and operating system in a way that keeps developers and players coming back.

Apple already makes plenty of incredible apps and features for its own devices, and personally I see no reason for it to start integrating generative AI into all of them. By offering developers APIs that let them dig into the machine learning tech on Apple’s chips, and by giving users the means to install apps and customise their devices to use whatever services they want, Apple can make the iPhone a natural home for any AI innovations. And when those innovations turn out to be inaccurate or dangerous, Apple can more easily wash its hands of them.

Industrial design


Some might not like to admit it, but what a device looks and feels like is a major factor in how enjoyable it is to use, a point Apple attempted to reassert this week by invoking Steve Jobs’ arguments about form equalling function, then unveiling an extremely thin phone it said was also its most durable yet, and a bold but divisive redesign for its iPhone Pro.

We’re not in 2012 any more, when most Android phones were dinky and weird compared to the iPhone, and there are currently many beautifully made phones from companies all over the world. But this was the first Apple event in a long time where the company seemed to lean hard on its bona fides as a design company, and I think that’s one of its major strengths. If I had to choose between phones purely by watching the iPhone Air introduction video and the Pixel 10 rundown starring Jimmy Fallon, it wouldn’t be particularly close.

It’s not all whimsical advertising and orange anodised aerospace aluminium alloy, though. Apple pushes durability and device longevity further every year, so people get a well-made product they keep for longer, or one that’s worth more when they decide to resell it. And that kind of philosophy is almost diametrically opposed to the logical conclusion of a smartphone run by AI: that physical devices will eventually disappear in favour of cloud-based voice interfaces and content services.

Apple can keep all of its strengths while adding more AI, and I’m sure it will. But training, testing and implementing generative AI in a responsible way is a massive undertaking, and a very different game to making good, reliable devices and software services. Maybe it’s the one company that doesn’t need to do both, and in fact there’s something to be said for being the platform that has more important things going on.
