Every so often, ChatGPT does something frustrating. In its eagerness to answer your request, even when it gets stuck, it will throw out information that is either incorrect or poorly sourced.

Unless you spend the time going through every single fact, figure, and opinion the chatbot gives you, it can be hard to know when this is happening. With that in mind, I’ve started trying something new.

Quite simply, it involves asking ChatGPT up front to provide citations for what it is telling you. Here's how to do it.
