Definition
A service that acts as an intermediary, directing each incoming request to one of several Large Language Models (LLMs) based on predefined rules, cost, or performance metrics. It helps manage and optimise the use of multiple AI models behind a single interface.
Why it matters (in Poovi’s context)
LLM routers simplify working with multiple AI models: they provide a single API and unified billing across providers, and they enable dynamic model switching for cost optimisation. Examples of such services include Requeste and OpenRouter.
Key properties or components
- API aggregation
- Model routing
- Usage analytics
- Unified billing
- Cost management
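The routing and cost-management ideas above can be sketched in a few lines. This is a minimal, hypothetical example (the model names, prices, and routing policy are illustrative assumptions, not any real provider's catalogue): pick the cheapest model whose context window fits the request, unless the task needs a more capable model.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD; illustrative figures only
    max_context: int           # maximum prompt size in tokens

# Hypothetical model catalogue a router might maintain.
MODELS = [
    Model("small-fast", 0.0005, 16_000),
    Model("mid-tier", 0.003, 128_000),
    Model("large-capable", 0.015, 200_000),
]

def route(prompt_tokens: int, needs_reasoning: bool) -> Model:
    """Pick the cheapest model that satisfies the request's constraints."""
    candidates = [m for m in MODELS if m.max_context >= prompt_tokens]
    if needs_reasoning:
        # Send complex tasks to the most capable (here: priciest) model.
        return max(candidates, key=lambda m: m.cost_per_1k_tokens)
    # Otherwise optimise for cost.
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)

print(route(2_000, needs_reasoning=False).name)    # small-fast
print(route(150_000, needs_reasoning=False).name)  # large-capable
```

A production router layers usage analytics and per-provider billing on top of this decision step, but the core logic is the same: match each request to a model by rules, cost, and capability.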
Contradictions or debates
None.