All three models also promoted known high-risk investment strategies, such as encouraging people to buy “hot” stocks with a lot of recent trading activity over those with steadier rates of return, and steering people – irrespective of their skill level or knowledge of the market – towards actively managed funds and stock picking rather than broad index funds.


The researchers also found that even when the models were told, “I don’t want to pay management fees”, as a way of trying to divert any potential bias, the impact of these prompts was limited.

Another concerning discovery was the conviction with which this advice was offered up. As one of the lead researchers, Philipp Winder, noted, “LLMs deliver financial advice with a convincing tone of confidence and care, often wrapped in disclaimers, but this veneer of trust can mask real financial risks.”

In other words, they have the same ability to quickly earn your trust that a slick salesperson does. Except, unlike actual human beings, these models are in your pocket, on-call 24/7, and there’s little to no transparency about what their advice is actually based on.

That’s because, for the most part, we still don’t actually know what these platforms are being trained on. We know that they consume vast amounts of information but it’s still unclear if they know how to prioritise advice, or if they consider the content from a peer-reviewed research paper written by subject-matter experts to be of equal value to that of a teenage YouTuber.

And here’s where another moment for pause comes in. Large language models are well-known for their propensity to hallucinate. Yep, you read that right. How does AI hallucinate, you might be wondering?

According to experts, LLMs are trained to treat providing an answer as their most important task above all else. The accuracy of that answer matters too, but it is secondary.

So where there are information gaps, rather than simply say they don’t know, they hallucinate and fill in the blanks with assumptions that can often turn out to be wrong.

As Winder explains, these LLMs “just try to predict the next word, so it’s not that they are super, super knowledgeable”. And because we don’t know what they’ve been fed, it’s impossible to fully understand how they’ve reached the conclusion that, say, US tech stocks are a better investment option than a low-risk ETF.

I don’t know about you, but if I’m investing $10,000 I want to know what is informing the person advising me of where that money should go.

Something I see often is that there’s a real sense of shame among people who don’t understand the basics of personal finance. Despite the vast majority of us not receiving any formal education around money management, for some reason we still seem to believe that these are things we should all just inherently know about by the time we reach a certain age or milestone in life.

When we don’t, we feel embarrassed. So it makes sense that someone would prefer to ask an AI platform to explain the difference between fixed and variable mortgage rates, rather than asking a financial planner or bank employee and risking feeling stupid.

What’s more, the free and easy access to LLMs makes financial education more accessible and easier to understand, which is only a good thing. But when it comes to making financial decisions, people wanting to seek advice from AI should follow the same rules that would apply to any other person or platform – as one source of advice but definitely not the only one.

Wanting to understand different kinds of savings accounts or how to create a basic household budget is one thing; sharemarket advice involving your hard-earned savings is another. Especially when the platform giving that advice has skin in the game and operates under no rules.

Victoria Devine is an award-winning retired financial adviser, a bestselling author and host of Australia’s No.1 finance podcast, She’s on the Money. She is also founder and director of Zella Money.

Advice given in this article is general in nature and is not intended to influence readers’ decisions about investing or financial products. Readers should always seek their own professional advice that takes into account their personal circumstances before making any financial decisions.

Expert tips on how to save, invest and make the most of your money delivered to your inbox every Sunday. Sign up for our Real Money newsletter.