Trending repo

Claude Code & Cursor rules for litellm

by @BerriAI · 46,706 stars

About litellm

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
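The unified-interface idea above can be sketched in a few lines: litellm's `completion()` accepts OpenAI-style chat messages and routes them to whichever provider the model string names. This is a minimal sketch, not an official example; the `user_message` and `ask` helpers are hypothetical names, and a real call requires `pip install litellm` plus a provider API key in the environment.

```python
def user_message(prompt: str) -> list[dict]:
    """Build a single-turn chat payload in the OpenAI message format."""
    return [{"role": "user", "content": prompt}]

def ask(model: str, prompt: str) -> str:
    """Send one prompt to any litellm-supported provider and return the reply text."""
    # Imported lazily so the sketch can be read without litellm installed;
    # the call itself needs a provider key (e.g. OPENAI_API_KEY) at runtime.
    from litellm import completion
    response = completion(model=model, messages=user_message(prompt))
    return response.choices[0].message.content

# The same call shape works across providers, only the model string changes:
#   ask("gpt-4o-mini", "Hello")
#   ask("anthropic/claude-3-haiku-20240307", "Hello")
```

The provider prefix in the model string (e.g. `anthropic/`) is how litellm decides which backend to route to, so application code stays identical across vendors.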

Topics

ai-gateway, anthropic, azure-openai, bedrock, gateway, langchain, litellm, llm, llm-gateway, llmops, mcp-gateway, openai

No rules target litellm yet

No published rules, MCP servers, or skills target litellm yet. If you maintain a tool that works well with this project, you can publish it for free during the beta.

Why this page exists

RuleSell tracks the AI-coding ecosystem so you don't have to. When a repo like litellm picks up momentum, we surface the Claude Code skills, Cursor rules, MCP servers, and agent configs that target it — with real author attribution, SPDX license badges, and quality scores. Every listing ships with copy-paste install instructions for each environment.