People turn to AI for just about everything these days, from workout routines to business strategy. So it's no surprise they ask about taxes too. Some questions are straightforward, like "How does VAT apply to low-value imports?" or "What is ViDA?" Others are more complex, involving detailed financial scenarios and asking how to handle them from a tax perspective.
When you ask a large language model (LLM) for tax advice, it usually gives an answer. It may include a disclaimer encouraging you to consult a real expert, but it rarely refuses the request. That's where things start to get complicated. In many countries, applying tax law to someone's specific situation is legally considered tax advice, and that's often a regulated activity.
The rules vary depending on where you are. In the Netherlands, anyone can offer tax advice. In Germany, only licensed tax advisors can. Legal advice is also restricted to licensed lawyers. So what happens when the advice comes from a machine and not from a person?
What the Courts Say About Software
A German court ruling has already addressed this issue, offering a useful starting point for understanding where automation ends and regulated advice begins. While the case didn't involve LLMs, it still provides valuable insights.
Germany's Federal Court of Justice (Bundesgerichtshof) reviewed a contract-generating platform that allowed users to create legal documents by answering a guided questionnaire. The service was advertised as producing "legal documents with lawyer-level quality," faster and cheaper than a real lawyer.
The question was whether this kind of software crossed into regulated territory by offering legal advice without a license. The court ruled that it didn't. It held that the platform was lawful because it didn't analyze legal issues on a case-by-case basis. Instead, it used fixed logic, relied on factual input from users, and followed a set decision tree. There was no legal interpretation, no discretion, and no human oversight. The court compared it to a sophisticated form book: prewritten templates, not personalized legal counsel.
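The court's distinction between a form book and case-by-case analysis can be made concrete with a minimal sketch. The template texts, clause keys, and function below are invented for illustration and have nothing to do with the actual platform; the point is that every answer maps to prewritten text, so the same inputs always yield the same output with no interpretation in between.

```python
# Hypothetical form-book generator: fixed logic, factual inputs,
# a set decision tree. No legal reasoning happens anywhere here.
TEMPLATES = {
    # Prewritten clauses keyed by the user's factual answers.
    ("employment", "indefinite"): "This employment contract runs for an indefinite term.",
    ("employment", "fixed"): "This employment contract ends on the agreed date.",
    ("rental", "indefinite"): "This rental agreement continues until terminated.",
    ("rental", "fixed"): "This rental agreement ends on the agreed date.",
}

def generate_clause(contract_type: str, term: str) -> str:
    """Select a prewritten clause from user-supplied facts.

    The function never weighs the user's circumstances; it only
    looks up a fixed template, like turning to a page in a form book.
    """
    try:
        return TEMPLATES[(contract_type, term)]
    except KeyError:
        raise ValueError("No template for this combination of answers.")

print(generate_clause("employment", "fixed"))
```

A system that instead weighed the user's circumstances and produced a tailored legal conclusion, as a conversational LLM can, would fall on the other side of the line the court drew.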
The court also emphasized the user's role. It found that users were not misled into expecting full legal services. They understood that the output was a standardized document generated without professional legal review. Users knew they were responsible for the accuracy of the information they provided. Because of this, the court concluded that the service didn't qualify as unauthorized legal practice.
However, the court did draw a firm line on how the tool was marketed. While the platform itself was allowed, promotional language that claimed to deliver "lawyer-quality" results or positioned the service as equivalent to legal representation was ruled misleading. The takeaway: the automation may be legal, but how it's presented to users must be honest.
So What About AI?
The German court drew a clear distinction: automated tools are permitted as long as they offer general guidance, not case-specific legal advice. If a tool behaves like a structured manual with templates and logic paths, it's usually safe. But if it interprets tax rules based on someone's personal data, it may cross into regulated territory.
Most tax software keeps it simple. It follows a fixed path and provides logic-based results. But LLMs can go further. They respond to user input in a conversational, personalized way. If the output applies tax law to individual facts, even unintentionally, it could qualify as tax advice under strict regulatory standards.
Still, a strong case can be made that LLMs aren't giving tax advice, at least not in the legal sense.
For one, LLMs are not legal entities. They can't be licensed, held accountable, or sanctioned. They don't act with legal intent. Like calculators or tax calculation engines, they're tools, not advisors.
Second, the user is in control. They ask the questions, guide the interaction, and decide how to use the output. LLMs don't request documentation, question the facts, or assess risks the way a licensed advisor would.
Third, the answers are probabilistic. LLMs don't reason through the law; they predict what might be a helpful reply based on patterns in their training data. They don't understand legal rules, evaluate ethics, or grasp the nuance of financial and personal context.
From the user's point of view, expectations are low. Most people know LLMs hallucinate, occasionally producing false or misleading information. As a result, many use them as low-cost assistants but not as replacements for professional help. And most LLMs aren't marketed as legal advisors, which helps keep them out of regulatory trouble. It's a different story for tools that claim to offer legal certainty or "lawyer-quality" advice; that kind of positioning can trigger legal obligations.
The Bottom Line
LLMs generate text based on patterns in the data they were trained on. They don't apply laws; they predict what sounds like a useful response. That's not tax advice. That's automated text. And it's up to humans to treat it that way.
As both knowledge and reasoning become automated, tax advisors must redefine their role: not as knowledge holders, but as interpreters, strategists, and ethical decision-makers. Their value no longer lies in simply knowing the rules, but in interpreting them, applying judgment, and asking the hard questions that AI can't. The goal isn't to compete with AI but to work with it.
The opinions expressed in this article are those of the author and do not necessarily reflect the views of any organizations with which the author is affiliated.
