Definition

The maximum amount of text, measured in tokens, that a large language model can process in a single input or interaction; the size of this window determines how much context the model can draw on at once.

Why it matters (in Poovi’s context)

A critical technical specification for memex, as Gemini 2.5 Pro’s 1M+ token context window fundamentally changes the system’s architecture by enabling one-shot analysis of entire long documents and eliminating the need for chunking.

Key properties or components

  • measured in tokens
  • determines input capacity
  • affects the model’s ability to retain long-range dependencies
  • influences system architecture
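The properties above can be illustrated with a minimal sketch of checking whether a document fits a model's context window. The 4-characters-per-token heuristic and the helper names here are illustrative assumptions; real tokenizers use subword schemes such as BPE, and actual limits vary by model.

```python
# Sketch: estimating token count and checking it against a context window.
# The ~4-chars-per-token heuristic is a rough assumption for English prose;
# use the model's real tokenizer for accurate counts.

CONTEXT_WINDOW_TOKENS = 1_000_000  # e.g. Gemini 2.5 Pro's 1M+ token window

def estimate_tokens(text: str) -> int:
    """Crude estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def fits_in_context(text: str, limit: int = CONTEXT_WINDOW_TOKENS) -> bool:
    """True if the whole document can be analyzed in one shot,
    i.e. without chunking."""
    return estimate_tokens(text) <= limit

document = "word " * 200_000  # ~1,000,000 characters
print(estimate_tokens(document), fits_in_context(document))
```

A large enough window makes `fits_in_context` true for entire long documents, which is what removes the chunking step from the architecture.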

Contradictions or debates

None.

Sources