
The AI Tax Trap: What Happens When ChatGPT Gets Your Client's Return Wrong

A CNBC reporter just learned the hard way why it's a problem that 1 in 5 taxpayers are using AI for tax prep.

He sold company stock through an employee stock purchase plan (ESPP). ESPPs let employees buy stock at a discount, which means tricky tax rules around capital gains. So he did what millions of Americans are doing this season: he asked ChatGPT.
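Those "tricky tax rules" boil down to splitting an ESPP sale into ordinary income and capital gain. Here's a rough sketch of the math for a disqualifying disposition (shares sold too soon to get favorable treatment) with hypothetical numbers; the real rules under IRC Section 423 distinguish qualifying from disqualifying dispositions and have more edge cases:

```python
def espp_disqualifying_split(fmv_at_purchase, purchase_price, sale_price):
    """Per-share split of an ESPP disqualifying disposition
    into ordinary income and capital gain (illustrative only)."""
    # The purchase discount is compensation, taxed as ordinary income
    # and reported on the W-2.
    ordinary_income = fmv_at_purchase - purchase_price
    # Cost basis must be adjusted up by that income; brokers' 1099-Bs
    # often report only the unadjusted purchase price, so skipping this
    # step taxes the discount twice.
    adjusted_basis = purchase_price + ordinary_income
    capital_gain = sale_price - adjusted_basis
    return ordinary_income, capital_gain

# Typical 15% discount: FMV $100 at purchase, paid $85, sold at $120.
oi, cg = espp_disqualifying_split(100.0, 85.0, 120.0)
print(oi, cg)  # 15.0 ordinary income, 20.0 capital gain per share
```

The basis adjustment is exactly the sort of W-2/1099 interaction a chatbot answering one document at a time can miss.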

The bot was confident. It broke down the rules into bullet points. It analyzed his 1099. It told him, "This is great - [your brokerage] actually gave us everything we need." It walked him through the numbers and assured him, "You do NOT need a CPA."

He was ready to file. Then he ran it by Miklos Ringbauer, a CPA.

Ringbauer's verdict: "possibly correct but also incomplete." The W-2 check that ChatGPT said wasn't a big deal? Actually quite important. A few numbers in the 1099 suggested taxable moves the reporter may not have made. ChatGPT never mentioned them - because he didn't think to ask.

"The question becomes does the taxpayer have necessary understanding of the documents they look at to understand and correct any items that needs to be addressed?" Ringbauer says. "In my understanding, many of our clients do not."

That's the trap. AI doesn't lie. It just sounds so convincing that you stop questioning it.

J.T. Eagan, a clinical assistant professor of accounting at Purdue, puts it bluntly: "AI will convince you that the sky is green."

He tested a chatbot on a tax question he gives his students. The presentation was polished; the answer was wrong. "It gave me this response that the mechanics were perfect, but I had to take a step back and say, 'Well, you're wrong.'"

For CPAs, this is the new risk profile: clients showing up in April with AI-generated returns that look right but are subtly wrong. The kind of errors that trigger audits or leave deductions on the table. The kind that take 10 minutes to spot and an hour to fix.

Jordan Wilson, founder of AI strategy company Everyday AI, says the problem is baked into how these models work. "By default, large language models are trained to be helpful assistants. That means they're often going to sound very confident, and oftentimes you're going to run into hallucinations."

ChatGPT did warn the reporter at the top of the conversation: "If anything gets very specific or high-stakes, I'll flag where you might want a CPA's input." But then it told him his return was "very simple" and he didn't need a CPA. Mixed signals.

And if you make a mistake? "A valid excuse isn't, 'The AI made me do it,'" Wilson says.

So what should CPAs tell clients who ask about using AI for taxes?

First, check which model they're using. Free versions of ChatGPT or Claude default to "fast" modes that aren't calibrated for complex topics. Paid versions offer "thinking" modes that break down problems step-by-step. If your client is using the free version, they're probably not getting the best output.

Second, watch out for old data. AI models are trained on data that can be months or years old. A tax law change for 2026? The model might not know it. Even when it searches the internet, it might pull a 2024 article instead of current guidance - and the difference can swing results wildly.

Third, remember that "correct" doesn't mean "right for you." A chatbot might perfectly explain that you can't deduct a pet - except you can deduct a seeing-eye dog as a medical expense. If your client asks, "Can I deduct my dog?" and AI says yes, they might stop reading after the first sentence. Context matters.

Fourth, AI only knows what you tell it. If your client uploads a 1099 but forgets to mention their W-2 has a critical line item, the AI won't catch it. It can't ask follow-up questions it doesn't know to ask. Absent a holistic view, it's set up to fail.

Fifth, tell clients to turn off model training. Paid versions let you disable data sharing; free versions may use your tax info to train future models unless that setting is switched off. It's not a data breach, but it's still not something you want happening with sensitive financial documents.

The pattern is clear: AI is good at answering the question you ask. It's terrible at asking the questions you forgot to ask.

For CPAs, this creates two opportunities:

1. Position yourself as the "AI validator." Clients will use AI whether you tell them to or not. Offer a service where they bring their AI-generated return to you for review before filing. Charge for the review. Flag the errors. Save them the audit.

2. Educate clients now, before April 15. Send an email. Post on LinkedIn. Explain the ESPP trap. Share the "AI made me do it" line. Make it clear that AI is a tool, not a replacement for professional judgment.

The IRS isn't going to accept "ChatGPT told me I could" as an excuse. Neither will an auditor. And your client who saved $500 by skipping your services is going to pay $5,000 when the IRS comes calling.

The reporter's takeaway? He's hiring a CPA.

Your clients should too.