Definition
Token usage refers to the amount of data, measured in tokens (sub-word units: pieces of words, or sometimes single characters), that an AI model processes during an interaction. Most large language models limit the number of tokens they can process at once (the context window) and charge per token processed.
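As a minimal sketch of how token counts are often approximated without a real tokenizer, the heuristic below assumes roughly four characters per token; actual tokenizers (e.g. BPE-based ones) will give different counts.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the common ~4 characters-per-token
    heuristic; a real tokenizer will differ, especially for code
    or non-English text."""
    return max(1, len(text) // 4)

# 41 characters // 4 -> 10 estimated tokens
print(estimate_tokens("Token usage refers to the amount of data."))  # → 10
```

This is only a budgeting aid; for exact counts, use the tokenizer that matches the model in question.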
Why it matters (in Poovi’s context)
Optimising token usage is essential for reducing computational costs, improving response times, and staying within the context-window limits of AI models. It directly affects how efficient and affordable AI use is.
Key properties or components
- Measured in tokens
- Context window limitations
- Cost implications
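The three properties above can be sketched together in a small budgeting helper. The per-1k-token prices and the context-window size below are hypothetical placeholders, not real pricing for any model.

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Estimate a request's cost from (hypothetical) per-1k-token prices,
    with separate input and output rates as many providers use."""
    return (prompt_tokens / 1000) * price_in_per_1k \
         + (completion_tokens / 1000) * price_out_per_1k

def fits_context(prompt_tokens: int, reserved_completion: int,
                 context_window: int) -> bool:
    """Check that the prompt plus the tokens reserved for the reply
    fit within the model's context window."""
    return prompt_tokens + reserved_completion <= context_window

# Hypothetical rates: 0.5 per 1k input tokens, 1.5 per 1k output tokens
print(estimate_cost(1200, 300, 0.5, 1.5))   # → 1.05
print(fits_context(1200, 300, 2048))        # → True
```

Checks like `fits_context` are typically run before sending a request, so oversized prompts can be truncated or summarised rather than rejected by the model.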
Contradictions or debates
None.