Technique · Updated 2026-04
Grounding
Definition
Grounding connects LLM responses to factual data sources to reduce hallucinations and increase reliability.
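To make the idea concrete, here is a minimal sketch of grounding: retrieved facts from a trusted source are injected into the prompt so the model answers from real data rather than from its parametric memory alone. All names (`knowledge_base`, `build_grounded_prompt`) and the tiny keyword retriever are illustrative assumptions, not a real system.

```python
# Illustrative only: a toy knowledge base standing in for a real data source.
knowledge_base = {
    "grounding": "Grounding anchors LLM responses in factual data sources.",
    "rag": "RAG combines retrieval with generation for up-to-date answers.",
}

def retrieve(question: str) -> list[str]:
    """Naive keyword retrieval over the local knowledge base."""
    q = question.lower()
    return [fact for key, fact in knowledge_base.items() if key in q]

def build_grounded_prompt(question: str) -> str:
    """Prepend retrieved facts so the LLM must answer from them."""
    facts = retrieve(question)
    context = "\n".join(f"- {f}" for f in facts) or "- (no facts found)"
    return (
        "Answer using ONLY the facts below.\n"
        f"Facts:\n{context}\n"
        f"Question: {question}"
    )

print(build_grounded_prompt("What is grounding?"))
```

The grounded prompt would then be sent to an LLM; because the facts are supplied explicitly, the model has less room to hallucinate.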
See also in the glossary
RAG (Retrieval-Augmented Generation)
RAG is a technique that connects an LLM to external data sources to generate more accurate and up-to-date answers.
AI Hallucination
An AI hallucination is a response generated by an AI model that appears plausible but is factually incorrect or fabricated.
LLM (Large Language Model)
An LLM is an AI model trained on massive text corpora, capable of understanding and generating human language.
AI API
An AI API allows developers to integrate artificial intelligence capabilities into their applications.
Frequently Asked Questions
Are grounding and RAG the same?
RAG is one form of grounding. Grounding is the general concept (anchoring responses in verifiable data); RAG is a specific technique that implements it by combining retrieval with generation.
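The two steps of RAG mentioned above can be sketched as follows: step 1 retrieves the most relevant document, step 2 generates an answer from it. This is a hedged toy example, not a real pipeline: the word-overlap retriever and the `generate()` stub (where a real system would call an LLM) are illustrative assumptions.

```python
# Illustrative only: two documents standing in for an indexed corpus.
documents = [
    "Perplexity grounds its answers in live web search results.",
    "NotebookLM grounds its answers in documents you upload.",
]

def retrieve(query: str, docs: list[str]) -> str:
    """Step 1 (retrieval): pick the doc sharing the most words with the query."""
    q_words = set(query.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

def generate(query: str, context: str) -> str:
    """Step 2 (generation): a real system would call an LLM with this context."""
    return f"Based on: '{context}' -> answer to '{query}'"

best = retrieve("Which tool uses web search?", documents)
print(generate("Which tool uses web search?", best))
```

A production RAG system replaces the keyword overlap with embedding similarity over a vector index, but the retrieve-then-generate shape stays the same.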
Which tools use grounding?
Perplexity (web search), Gemini (Google Search grounding), NotebookLM (grounding on your documents), Consensus (grounding on scientific papers).