Core Concepts

Context Window

The maximum amount of conversation an AI model can 'remember' at once.

The context window is the total amount of text — measured in tokens — that an LLM can process in a single interaction. It includes the system prompt, the entire conversation history, and any retrieved knowledge. A larger context window lets the model keep more of the conversation in view and stay coherent over long sessions. For website chat agents, context window size determines how many back-and-forth exchanges the agent can handle before older messages must be dropped and earlier details are lost.
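The truncation behavior described above can be sketched in a few lines. This is a minimal illustration, not any particular vendor's implementation: the ~4-characters-per-token estimate is a rough heuristic (real systems use the model's own tokenizer), and the function names and window size are hypothetical.

```python
# Minimal sketch of context-window budgeting for a chat agent.
# Assumption: ~4 characters per token, a rough heuristic only;
# production systems count tokens with the model's actual tokenizer.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)

def fit_to_window(system_prompt: str, history: list[str],
                  window_limit: int) -> list[str]:
    """Keep the system prompt plus the most recent messages that fit
    in the token budget, dropping the oldest messages first."""
    budget = window_limit - estimate_tokens(system_prompt)
    kept = []
    for message in reversed(history):          # walk newest to oldest
        cost = estimate_tokens(message)
        if cost > budget:
            break                              # oldest messages fall out
        kept.append(message)
        budget -= cost
    return list(reversed(kept))                # restore chronological order

history = [
    "Hi, I need help with my order.",
    "Sure! What's your order number?",
    "It's 12345. The package never arrived.",
]
# With a tiny 20-token window, only the newest message survives.
print(fit_to_window("You are a support agent.", history, window_limit=20))
```

Dropping oldest-first like this is exactly why an agent with a small window "forgets" details from the start of a long session.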

Related Terms