Definition & Explanation
Tokens are the fundamental units that LLMs use to process text. Rather than operating on characters or whole words directly, models split text into tokens, which can be whole words, subword fragments, or punctuation marks. As a rough rule of thumb for English text, one token is about four characters, or roughly three quarters of a word. Token counts matter in two ways: they determine the cost of a model call and how much text fits within the model's context window. For AI coding tools, tokens measure how much code the model can read and generate in a single session.
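To make the idea concrete, here is a minimal sketch of subword tokenization using greedy longest-match lookup against a hand-picked toy vocabulary. The vocabulary and the `tokenize` helper are illustrative assumptions, not a real model's tokenizer; production LLMs use learned BPE or unigram vocabularies with tens of thousands of entries.

```python
# Toy greedy longest-match tokenizer. VOCAB is a hypothetical, hand-picked
# set of subwords chosen for illustration; real tokenizers learn their
# vocabulary from data (e.g. via byte-pair encoding).
VOCAB = {"token", "iz", "ation", "un", "believ", "able"}

def tokenize(text: str, vocab: set[str]) -> list[str]:
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest substring first, shrinking until a vocab hit.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # No vocab entry matched: fall back to a single character.
            tokens.append(text[i])
            i += 1
    return tokens

# A single word can become several tokens:
print(tokenize("tokenization", VOCAB))   # ["token", "iz", "ation"]
print(len(tokenize("tokenization", VOCAB)))  # 3 tokens for 1 word
```

Counting the resulting tokens rather than characters or words is what a model's pricing and context-window limits are based on.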