Glossary term
Context window
The maximum number of tokens a model can process in a single pass.
Definition
The context window is the maximum span of tokens an LLM can process in a single request; the prompt and any generated output must fit inside it together. Long documents should surface headings, summaries, and definition blocks early so less signal is lost when material near the end of that limit is truncated or deprioritized.
Working notes
- Structure matters more than raw length if the goal is fast retrieval.
- Topic hubs and glossary pages act as compressed entry points before the long-form post.
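The budget reasoning in the notes above can be sketched with a rough fit check. This is a minimal sketch, not a real tokenizer: it assumes a common rule-of-thumb average of about 4 characters per token for English prose, and the function names (`estimate_tokens`, `fits_in_window`) are illustrative, not from any library. Accurate counts require the model's own tokenizer.

```python
# Heuristic check of whether a document fits a model's context window.
# Assumes ~4 characters per token (a rough average for English text).

CHARS_PER_TOKEN = 4  # rule-of-thumb average, not an exact tokenizer

def estimate_tokens(text: str) -> int:
    """Estimate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_window(text: str, window_tokens: int, reserve_for_output: int = 0) -> bool:
    """Return True if the text, plus a reserved output budget, likely fits."""
    return estimate_tokens(text) + reserve_for_output <= window_tokens

doc = "word " * 1000  # 5,000 characters of sample text
print(estimate_tokens(doc))                               # 1250 by this heuristic
print(fits_in_window(doc, 4096, reserve_for_output=512))  # True
```

Reserving output tokens up front matters because the window covers prompt and completion combined: a prompt that barely fits leaves no room for the answer.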