Technique · Updated 2026-04
Embedding
Definition
An embedding is a numerical representation (vector) of text or data, capturing its semantic meaning.
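As a minimal sketch of what this means, the toy table below maps a few words to invented 3-dimensional vectors. The words and numbers are made up for illustration; real embeddings are produced by a trained model and typically have hundreds or thousands of dimensions.

```python
# Toy illustration of embeddings: each piece of text maps to a
# fixed-length vector of numbers. Values here are invented; a real
# embedding model would produce them.
embeddings = {
    "cat": [0.90, 0.10, 0.00],
    "kitten": [0.85, 0.15, 0.05],
    "car": [0.00, 0.20, 0.95],
}

# Every entry has the same dimensionality, which is what makes the
# vectors comparable and storable in a vector database.
print(len(embeddings["cat"]))  # 3
```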
See also in the glossary
Vector Database
A vector database stores embeddings for semantic search and RAG at scale.
RAG (Retrieval-Augmented Generation)
RAG is a technique that connects an LLM to external data sources to generate more accurate and up-to-date answers.
LLM (Large Language Model)
An LLM is an AI model trained on vast amounts of text (billions to trillions of tokens), capable of understanding and generating human language.
NLP (Natural Language Processing)
NLP is the field of AI that enables machines to understand, interpret and generate human language.
Frequently asked questions
What are embeddings used for?
They measure semantic similarity between texts: two sentences with the same meaning map to vectors that lie close together in the embedding space. This is the foundation of semantic search and RAG.
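Closeness between vectors is usually measured with cosine similarity. The sketch below uses invented 3-dimensional vectors standing in for real embeddings; the similarity function itself is the standard formula.

```python
import math

# Toy vectors standing in for real sentence embeddings (invented values).
sentence_a = [0.90, 0.10, 0.00]   # "The cat sleeps."
sentence_b = [0.85, 0.15, 0.05]   # "A kitten is napping."
sentence_c = [0.00, 0.20, 0.95]   # "The car won't start."

def cosine_similarity(u, v):
    """Cosine of the angle between u and v: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(x * x for x in v))
    return dot / (norm_u * norm_v)

# Sentences with similar meaning score higher than unrelated ones.
print(cosine_similarity(sentence_a, sentence_b))  # close to 1.0
print(cosine_similarity(sentence_a, sentence_c))  # close to 0.0
```

A vector database performs this comparison at scale, returning the stored embeddings closest to a query vector.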
Are embeddings and tokens the same?
No. A token is a small unit of raw text (a word or word fragment produced by a tokenizer). An embedding is the translation of that text into a numerical vector in a mathematical space.
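The distinction can be sketched as a two-step pipeline. The vocabulary and embedding values below are hypothetical; in a real model the tokenizer is learned from data and the embedding table is trained.

```python
# Step 1: the tokenizer turns raw text into token ids (toy vocabulary).
vocab = {"the": 0, "cat": 1, "sat": 2}
tokens = [vocab[w] for w in "the cat sat".split()]  # [0, 1, 2]

# Step 2: an embedding table maps each token id to a vector.
# Invented values; real models learn these weights during training.
embedding_table = [
    [0.1, 0.3],  # "the"
    [0.9, 0.1],  # "cat"
    [0.2, 0.8],  # "sat"
]
vectors = [embedding_table[t] for t in tokens]
print(vectors)  # one vector per token
```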