The Risks of Using AI to Answer Tax Questions
- Tax Nightmares Simplified

- Mar 18
Why Convenience Does Not Equal Compliance
Artificial intelligence is rapidly changing how people search for answers — including answers to complex tax questions. AI tools and large language models (LLMs) can instantly generate explanations about deductions, credits, and filing strategies. While this speed feels empowering, it creates a growing risk: AI-generated tax answers do not guarantee legal accuracy or substantial authority under U.S. tax law.
Tax law is not simply information retrieval. It is legal interpretation supported by authoritative sources, facts, and professional judgment.
The Problem With AI Tax Answers
AI systems rely on multiple publicly available sources when generating responses. These sources may include articles, summaries, blogs, or generalized tax explanations. Because of this, several critical risks arise:
- The taxpayer may not ask the question correctly.
- The AI may miss or misread facts unique to the taxpayer's situation.
- The answer may not rely on legally recognized authority.
- Limitations or exceptions may not be disclosed.
A quick AI response can sound confident while lacking the legal foundation required to support a tax position on a filed return.
What Is “Substantial Authority”?
In U.S. federal tax law, substantial authority is the legal standard used to determine whether a taxpayer or tax preparer has sufficient support for a tax position, primarily to avoid accuracy-related penalties.
A position meets this standard when the weight of supporting authorities is substantial compared to contrary authorities.
Not all information qualifies as authority.
Sources Considered Substantial Authority
Under Treasury Regulation § 1.6662-4(d)(3)(iii), recognized authority includes:
Primary Authority
- Internal Revenue Code (IRC)
- Treasury Regulations
- Other statutory provisions
Administrative Guidance
- Revenue Rulings
- Revenue Procedures
- Private Letter Rulings (PLRs)
- Technical Advice Memoranda (TAMs)
- Official IRS releases
Judicial Authority
- Court decisions (unless overturned or modified)
Legislative History
- Congressional committee reports reflecting legislative intent
Importantly, the Internal Revenue Service communicates its interpretations through many publications, but only guidance published in the Internal Revenue Bulletin (IRB) is considered authoritative. Materials published outside the IRB may help taxpayers understand tax concepts, but they cannot be relied upon as legal authority for a position taken on a return.

Why AI May Fall Short
Most AI tools are not programmed to verify whether their answers rely on substantial authority. As a result, taxpayers may unknowingly take unsupported positions on their tax returns.
Currently, only a limited number of specialized AI tax research platforms claim to reference substantial authority, often at significant cost and without real-time legal updates. This leaves many widely used AI tools operating without clear verification standards.
The risk? The taxpayer—not the software—is responsible.
The Liability Gap
A major issue rarely disclosed is liability.
Self-preparation software and AI platforms:
- Cannot sign a tax return.
- Are not tax professionals recognized by the IRS.
- Do not assume responsibility for positions taken on a return.
If an AI-generated answer is incorrect, penalties and interest apply to the taxpayer — not the AI developer or software company.
Tax Law Requires Investigation, Not Just Answers
The Internal Revenue Code contains layers of rules, exceptions, elections, and interpretations that apply differently to every taxpayer.
In a professional tax discovery interview, one client question typically leads to multiple follow-up questions. Tax professionals gather facts, verify documentation, and weigh authorities before determining whether substantial authority exists.
AI, by contrast, answers the question asked — not the questions that should have been asked.
Even Self-Preparation Tax Software Requires Professional Oversight
The IRS expects tax professionals to verify software performance and accuracy. Self-preparation software is not perfect, and determining whether an AI tool properly references authoritative guidance requires technical and legal expertise.
An untrained taxpayer cannot realistically evaluate:
- How an AI model was trained
- Whether it references authoritative sources
- Whether its conclusions rest on secondary commentary rather than tax law
The Bottom Line
AI is a powerful educational tool, but it is not a substitute for professional tax judgment.
Tax compliance is not about finding an answer quickly — it is about supporting a position defensibly under the law.
When taxpayers rely solely on AI for tax decisions, they risk:
- Unsupported filing positions
- Missed legal nuances
- Accuracy-related penalties
- Increased audit exposure
Convenience should never replace compliance.
Moving Forward Responsibly
AI will continue to play a role in tax education and research. However, responsible use means understanding its limitations. The safest approach combines technology with professional expertise — ensuring tax decisions are backed not just by information, but by substantial authority.
Because when it comes to taxes, the real question is not “What answer did I get?”
It is “Can this position be defended under tax law?”