Go on X or LinkedIn for five minutes and you’ll find plenty of people talking about what AI can do. It can summarize meeting notes, write code, turn a photo of you into a caricature, or give your emails a more assertive vibe. Those are just a few examples I saw in LinkedIn posts earlier today.

But for all the things AI can do, there are still plenty it can’t. In fact, some limitations trip up the most popular AI tools time and time again. I’m not dunking on the technology here (I do sometimes, but that’s not what this is). I think it’s good to talk about what AI can’t do so we’re clear on its boundaries.

When people are new to AI tools, or dazzled by the hype, they can easily misinterpret what these systems actually are and what they’re capable of. That’s how we end up with reports filled with made-up statistics. Of course, different AI tools have different strengths. But here are some common things your favorite AI tool might still struggle with in 2026 and, importantly, why those struggles still exist.

Take AI as a therapy tool. Expert opinions vary, but the broad consensus tends to be: use it cautiously, and only as a supplement to real therapy.

Many people find value in sharing things with their chatbot of choice, especially given how inaccessible traditional therapy can be in many countries. They might ask ChatGPT to help interpret the tone of a text or clarify goals. But beyond that, experts warn it could do more harm than good.

Again, this all comes down to how these tools are designed. They tend to agree, reflect your views back to you and validate your experience. They are structurally optimized to be helpful and agreeable. Even with guardrails in place, they are more likely to affirm than challenge.

But true growth needs friction. It requires someone who can push back, notice blind spots, and establish boundaries. Sure, a small amount of validation can be reassuring. But too much without challenge can subtly distort how you see yourself and the world.

There are practical limits too. An AI system can’t assess risk the way a trained professional can. It can’t intervene in a crisis, and it can’t participate in the patient-therapist dynamic that makes therapy effective. It can simulate that dynamic, but it lacks the lived experience, training, professional accountability and duty of care.
