Canadian businesses and individual taxpayers are increasingly turning to general-purpose artificial intelligence (AI) tools such as ChatGPT for bookkeeping and tax questions, a trend financial professionals say could lead to costly mistakes, false confidence and potential run-ins with the Canada Revenue Agency (CRA).
In 2025, 76 per cent of accountants and bookkeepers said they had seen an uptick in business clients using large language models for tax or bookkeeping advice and that they were spotting mistakes on a regular basis, according to a survey of 500 accountants and bookkeepers across Canada commissioned by Dext, a global AI-powered bookkeeping firm.
The most common mistakes include misinterpretation of business expenses (44 per cent), incorrect tax claims or charges (43 per cent), faulty personal tax planning (36 per cent), payroll errors (35 per cent), and incorrect business tax planning advice (35 per cent).
Those errors can create productivity losses for accountants as they fix avoidable mistakes, while business owners — and individual taxpayers — can wind up paying for extra hours.
Dext’s survey also suggests the consequences go beyond wasted money and time. More than a quarter of respondents (27 per cent) warn of a higher risk of insolvency or business failure, while others expect increased misuse of AI outputs to justify inappropriate or fraudulent claims (42 per cent), rising fines and penalties (40 per cent), and greater CRA scrutiny due to incorrect or late filings (38 per cent).
“There’s always risk of errors posed when you’re using generative AI tools for things like calculation and automation,” said Melissa Robertson, principal, research and thought leadership at CPA Canada.
If AI tools are being used, the outputs need to be sufficiently reviewed, she adds, though that becomes more difficult as transaction volume increases. There need to be checkpoints to validate what the tool is doing, followed by a review of what has actually been completed.
For individuals, the same risk of error applies regardless of the accounting area involved, Robertson says.
“I think there’s a lot of overpromise in what some AI tools can do right now, and there is a lot of risk when organizations just take those tools at face value.”
Ryan Minor, CPA Canada’s director of tax, says that while AI can be useful for locating documents, it may not always surface current material that reflects the CRA’s latest position. Users should always consult the source document before following any advice or suggestions AI offers, he adds.
“If you’re not in the industry and you don’t have a hunch what the answer would be, you may be misled,” Minor said.
Jason Heath, managing director of Objective Financial Partners, says clients sometimes send him copied-and-pasted output from AI tools to validate a tax or financial concept. While the information is often mostly correct, it may contain gaps, fail to apply to the client’s situation, or even mix up U.S. and Canadian tax rules.
Regardless of quality, instant access to free advice is appealing, particularly as legal, accounting and financial advice typically comes at a cost. But if users don’t ask the right questions or don’t prompt the tools thoroughly, the results can be misleading.
One of the more common tax mistakes Heath sees relates to sales taxes on short-term rentals listed on Airbnb. In real estate, errors can cost hundreds of thousands of dollars depending on the property type, and those tax mistakes might not even be discovered until years later.
“The CRA isn’t going to provide leniency because you relied on AI advice that you thought was correct, nor would it if it were professional tax advice that was incorrect,” he said.
Heath says individuals should treat AI as a research tool rather than a substitute for professional advice.
“I’d be hesitant to rely on AI,” he said, “but I would be supportive of using it to further your knowledge and help you ask the right questions.”