Technique (Updated 2026-04)

Context Window

Definition

The context window is the maximum amount of text, measured in tokens, that an LLM can process in a single request, covering both the prompt and the generated output.
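
A minimal sketch of checking input against a context window limit. The whitespace tokenizer and the limit of 8 tokens are illustrative assumptions; real LLMs use subword tokenizers (e.g. BPE), so actual counts differ.

```python
# Illustrative limit; real windows range from 128K to 1M+ tokens.
CONTEXT_WINDOW = 8

def count_tokens(text: str) -> int:
    """Approximate token count by splitting on whitespace.
    Assumption: stands in for a real subword tokenizer."""
    return len(text.split())

def fits_in_window(text: str, limit: int = CONTEXT_WINDOW) -> bool:
    """Check whether the text fits within the context window."""
    return count_tokens(text) <= limit

print(fits_in_window("a short prompt"))  # → True
print(fits_in_window("one two three four five six seven eight nine"))  # → False
```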

Frequently Asked Questions

Which LLM has the largest context window?
Gemini 2.0: 1M tokens. Claude Opus: 200K tokens. GPT-4o: 128K tokens.
What happens if you exceed it?
Depending on the provider, the request is either rejected or the input is truncated, so the model effectively loses the earliest part of the text. Retrieval-augmented generation (RAG) works around this limitation by retrieving only the most relevant passages into the prompt instead of sending the entire corpus.
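
A sketch of what truncation looks like in practice: when the input exceeds the window, only the most recent tokens are kept and the beginning is dropped. The whitespace tokenization is an assumption standing in for a real subword tokenizer.

```python
def truncate_to_window(text: str, limit: int) -> str:
    """Keep only the most recent `limit` tokens when the input
    exceeds the context window; the oldest tokens are dropped.
    Assumption: whitespace tokens approximate real tokenizer tokens."""
    tokens = text.split()
    if len(tokens) <= limit:
        return text
    return " ".join(tokens[-limit:])  # drop the oldest tokens

print(truncate_to_window("the quick brown fox jumps", 3))
# → "brown fox jumps"
```

RAG takes the opposite approach: rather than keeping the most recent text, it selects the most *relevant* passages for the prompt, which is why it avoids the forgetting problem entirely.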