Hallucination
An AI model confidently presenting made-up or incorrect information as fact.
Definition
An AI hallucination occurs when a large language model (LLM) generates confident-sounding but incorrect or fabricated information. Common mitigations include retrieval-augmented generation (RAG), requiring the model to cite sources, and careful prompting.
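As a rough illustration of how RAG-style grounding reduces hallucination, the sketch below retrieves the most relevant snippets from a tiny corpus and builds a prompt that instructs the model to answer only from cited sources or admit it doesn't know. The corpus, scoring function, and prompt template are illustrative assumptions, not any particular library's API.

```python
# Minimal sketch of retrieval-augmented prompting (assumed corpus and
# prompt wording; a real system would use embeddings and an LLM call).

def retrieve(query, corpus, k=2):
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_terms & set(doc["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_grounded_prompt(query, docs):
    """Ask the model to answer ONLY from cited context, or say it doesn't know."""
    context = "\n".join(f'[{d["id"]}] {d["text"]}' for d in docs)
    return (
        "Answer using ONLY the sources below. Cite source ids like [1].\n"
        'If the sources do not contain the answer, say "I don\'t know."\n\n'
        f"Sources:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

corpus = [
    {"id": 1, "text": "The Eiffel Tower is 330 metres tall."},
    {"id": 2, "text": "The Louvre is a museum in Paris."},
]
query = "How tall is the Eiffel Tower?"
prompt = build_grounded_prompt(query, retrieve(query, corpus))
```

Constraining the model to a retrieved context with an explicit "I don't know" escape hatch is what makes fabrication less likely: the model is graded against supplied facts rather than its own recall.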