Large language models (LLMs) sometimes exhibit a behavior called "hallucination": presenting fabricated information as fact. When this happens, the user or a third party may be harmed.
Typical Hallucination Scenarios
- Fabricated academic citations.
- Fabricated information about a real person (e.g. a court decision or injunction that never existed).
- Medical advice (outside a doctor-patient context).
- Legal advice (without an attorney-client relationship).
- Fabricated quotes or sources.
Potentially Liable Parties
- User: the person who uses AI output without verifying it.
- AI Provider: the company that developed and supplied the model.
- Platform: the application that integrates the AI.
Provider Liability — Limited
- Terms of use generally include a "no warranty" clause.
- Warnings to "verify the output" shift responsibility toward the user.
- However, under consumer law, such liability waivers are not unconditionally valid.
- TKHK (Consumer Protection Law) Article 5: prohibition of unfair terms.
User Liability
- Using AI output without verifying its sources constitutes negligence.
- Negligence is aggravated in professional use (lawyer, doctor, journalist).
- TBK (Turkish Code of Obligations) Article 49: liability if a third party is harmed.
Harm to Third Person
If the AI output contains fabricated information about a person (e.g. a false claim that "person X was wrongfully convicted") and this output is shared:
- The aggrieved person may invoke TMK (Turkish Civil Code) Article 25 (protection against attacks on personal rights).
- Claim for pecuniary damages (TBK Article 49).
- Request for removal of the content.
- Whether recourse against the provider is available remains contested.
Supreme Court: Expected Approach
The Supreme Court chambers can be expected to hold the person who disseminates or uses AI output "as is" primarily liable, while the AI provider may bear secondary liability under product safety and disclosure obligations. In professional use, the user's fault-based liability is heavier.
Levels of "Verification Obligation"
- Individual user: reasonable care.
- Professional user (lawyer, doctor): heightened care to the professional standard.
- Journalist: journalism ethics; source verification.
- Public official: strictest level of control.
Practical Tips
AI hallucination cases are new territory. They are best evaluated jointly by lawyers specializing in IT law and compensation law.