Last updated: 2026-04-18

What is Hallucination?

Hallucination is when an LLM generates confident but false content: made-up API signatures, fake citations, invented file paths. Mitigations such as RAG, tool use (verify by running), and "read the source" patterns all reduce but don't eliminate hallucinations.
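The "verify by running" pattern can be sketched as a minimal check, assuming the model proposed a module attribute by name (`json.from_string` below is a hypothetical hallucinated API, not a real one):

```python
import importlib

def attr_exists(module_name: str, attr_name: str) -> bool:
    """Check whether a model-suggested attribute actually exists in a module."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # The module itself may be hallucinated.
        return False
    return hasattr(module, attr_name)

# `json.from_string` is a plausible-sounding but fake API; the real call is `json.loads`.
print(attr_exists("json", "from_string"))  # → False
print(attr_exists("json", "loads"))        # → True
```

Running the check against the live environment catches made-up names before they reach production code; it does not, of course, verify that a real API does what the model claims.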
