Trending repo
Claude Code & Cursor rules for Awesome-LLM-Inference
by @xlite-dev · 5,214 stars
View on GitHub →

About Awesome-LLM-Inference
📚A curated list of Awesome LLM/VLM Inference Papers with Codes: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc.🎉
Topics
awesome-llm · deepseek · deepseek-r1 · deepseek-v3 · flash-attention · flash-attention-3 · flash-mla · llm-inference · minimax-01 · mla · paged-attention · qwen3
No rules target Awesome-LLM-Inference yet
No published rules, MCP servers, or skills target Awesome-LLM-Inference yet. If you maintain a tool that works well with this project, you can publish it for free during the beta.
Why this page exists
RuleSell tracks the AI-coding ecosystem so you don't have to. When a repo like Awesome-LLM-Inference picks up momentum, we surface the Claude Code skills, Cursor rules, MCP servers, and agent configs that target it, with real author attribution, SPDX license badges, and quality scores. Every listing ships with copy-paste install instructions for each environment.