
LiteLLM is an open-source tool that provides both a Python SDK and a proxy server (AI Gateway) for calling over 100 LLM APIs, from providers such as OpenAI, Anthropic, Azure, AWS Bedrock, and Google Vertex AI, through a unified OpenAI-compatible interface. It handles authentication, load balancing, automatic fallbacks between providers, cost tracking, and spend management. By standardizing inputs and outputs across providers, it eliminates the need to write custom integration code for each API. It also includes guardrails, logging integrations with MLflow and Langfuse, and the ability to set usage budgets per team or user.
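As a minimal sketch of that unified interface, the snippet below calls two different providers through the same `completion()` function, with the model string selecting the backend. It assumes the `litellm` package is installed and that provider API keys are set as environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`); the model names shown are illustrative.

```python
# Same OpenAI-style message format for every provider.
messages = [{"role": "user", "content": "Say hello in one word."}]

if __name__ == "__main__":
    from litellm import completion

    # Identical call shape for OpenAI and Anthropic; only the model
    # string changes. Responses follow the OpenAI schema in both cases.
    openai_resp = completion(model="gpt-4o-mini", messages=messages)
    anthropic_resp = completion(model="claude-3-5-sonnet-20240620", messages=messages)

    print(openai_resp.choices[0].message.content)
    print(anthropic_resp.choices[0].message.content)
```

Because both responses use the OpenAI response schema (`choices[0].message.content`), downstream code does not need provider-specific parsing.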

