Definition & Explanation
The context window is the total amount of information—including the system prompt, conversation history, code files, and model outputs—that an LLM can hold in its "memory" during a single session, measured in tokens. Larger context windows allow AI coding tools to understand more of your codebase at once. For example, Claude offers a 200K-token context window, enough to hold a substantial portion of many codebases in a single session, though very large repositories can still exceed it. Context window size is therefore a key factor when choosing an AI coding tool.
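To build intuition for what fits in a window, here is a minimal sketch that estimates token counts with the common rule of thumb of roughly 4 characters per token. The heuristic, the `CHARS_PER_TOKEN` constant, and the helper names are illustrative assumptions; real tokenizers vary by model, so an official token-counting API should be used for accurate numbers.

```python
# Rough heuristic: ~4 characters per token for English text and code.
# Actual tokenizers differ per model, so treat this as an estimate only.
CHARS_PER_TOKEN = 4


def estimate_tokens(text: str) -> int:
    """Estimate the token count of a piece of text."""
    return len(text) // CHARS_PER_TOKEN


def fits_in_context(texts: list[str], window: int = 200_000) -> bool:
    """Check whether the combined texts fit within a context window."""
    total = sum(estimate_tokens(t) for t in texts)
    return total <= window


# A small source snippet fits easily; 200K tokens is roughly
# 800K characters under this heuristic.
snippet = "def add(a, b):\n    return a + b\n" * 100
print(fits_in_context([snippet]))
```

A check like this can help decide which files to include in a prompt before the model truncates or rejects the input.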