{"id":257791,"date":"2025-11-02T15:08:11","date_gmt":"2025-11-02T15:08:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/257791\/"},"modified":"2025-11-02T15:08:11","modified_gmt":"2025-11-02T15:08:11","slug":"ai-is-slick-and-convenient-but-dont-trust-it-with-your-money","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/257791\/","title":{"rendered":"AI is slick and convenient. But don\u2019t trust it with your money"},"content":{"rendered":"<p>All three models also promoted known high-risk investment strategies such as encouraging people to buy \u201chot\u201d stocks that had seen a lot of recent trading over those with steadier rates of return, and encouraged people \u2013 irrespective of their skill level or knowledge of the market \u2013 to favour actively managed funds and stock picking over broad index funds.<\/p>\n<p>The researchers also found that even when the models were told, \u201cI don\u2019t want to pay management fees\u201d, as a way of trying to divert any potential bias, the impact of these prompts was limited.<\/p>\n<p>Another concerning discovery was the conviction with which this advice was offered up. As one of the lead researchers, Philipp Winder, noted, \u201cLLMs deliver financial advice with a convincing tone of confidence and care, often wrapped in disclaimers, but this veneer of trust can mask real financial risks.\u201d<\/p>\n<p>In other words, they have the same ability to quickly earn your trust that a slick salesperson does. Except, unlike actual human beings, these models are in your pocket, on-call 24\/7, and there\u2019s little to no transparency about what their advice is actually based on.<\/p>\n<p>That\u2019s because, for the most part, we still don\u2019t actually know what these platforms are being trained on. 
We know that they consume vast amounts of information but it\u2019s still unclear if they know how to prioritise advice, or if they consider the content from a peer-reviewed research paper written by subject-matter experts to be of equal value to that of a teenage YouTuber.<\/p>\n<p>And here\u2019s where another moment for pause comes in. Large language models are well known for their propensity to hallucinate. Yep, you read that right. How does AI hallucinate, you might be wondering?<\/p>\n<p><img decoding=\"async\" alt=\"Large language models are well known for their propensity to hallucinate.\" loading=\"lazy\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/10\/1aaae958f8425e1c1fd5a028ebeaf3f10d7742ff.jpeg\" height=\"390\" width=\"584\" \/><\/p>\n<p>Large language models are well known for their propensity to hallucinate. Credit: iStock<\/p>\n<p>According to experts, LLMs are trained to understand that their single most important task is to provide information. The accuracy of that information, it seems, is also important, but not the most important thing.<\/p>\n<p>So where there are information gaps, rather than simply say they don\u2019t know, they hallucinate and fill in the blanks with assumptions that can often turn out to be wrong.<\/p>\n<p>As Winder explains, these LLMs \u201cjust try to predict the next word, so it\u2019s not that they are super, super knowledgeable\u201d. And because we don\u2019t know what they\u2019ve been fed, it\u2019s impossible to fully understand how they\u2019ve reached the conclusion that, say, US tech stocks are a better investment option than a low-risk ETF.<\/p>\n<p>I don\u2019t know about you, but if I\u2019m investing $10,000 I want to know what is informing the person advising me on where that money should go.<\/p>\n<p>Something I see often is that there\u2019s a real sense of shame among people who don\u2019t understand the basics of personal finance. 
Despite the vast majority of us not receiving any formal education around money management, for some reason we still seem to believe that these are things we should all just inherently know by the time we reach a certain age or milestone in life.<\/p>\n<p>When we don\u2019t, we feel embarrassed. So it makes sense that someone would prefer to ask an AI platform to explain the difference between fixed and variable mortgage rates, instead of asking a financial planner or bank employee and risking feeling stupid.<\/p>\n<p>What\u2019s more, free and easy access to LLMs makes financial education more accessible and easier to understand, which is only a good thing. But when it comes to making financial decisions, people wanting to seek advice from AI should follow the same rules that would apply to any other person or platform \u2013 as one source of advice, but definitely not the only one.<\/p>\n<p>Wanting to understand different kinds of savings accounts or how to create a basic household budget is one thing, but sharemarket advice and hard-earned savings are another. Especially when the platform giving that advice has a stake in the game, and no rules apply.<\/p>\n<p>Victoria Devine is an award-winning retired financial adviser, a bestselling author and host of Australia\u2019s No.1 finance podcast, <a href=\"https:\/\/www.shesonthemoney.com.au\/\" rel=\"noopener nofollow\" target=\"_blank\">She\u2019s on the Money<\/a>. She is also founder and director of Zella Money.<\/p>\n<p>Advice given in this article is general in nature and is not intended to influence readers\u2019 decisions about investing or financial products. Readers should always seek their own professional advice that takes into account their personal circumstances before making any financial decisions.<\/p>\n<p>Expert tips on how to save, invest and make the most of your money delivered to your inbox every Sunday. 
<a href=\"https:\/\/www.theage.com.au\/link\/follow-20170101-p5d9o2\" rel=\"nofollow noopener\" target=\"_blank\">Sign up for our Real Money newsletter<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"All three models also promoted known high-risk investment strategies such as encouraging people to buy \u201chot\u201d stocks that&hellip;\n","protected":false},"author":2,"featured_media":257792,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-257791","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/257791","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=257791"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/257791\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/257792"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=257791"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=257791"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=257791"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}