claude-code-router

Use Claude Code as the foundation for coding infrastructure, allowing you to decide how to interact with the model while enjoying updates from Anthropic.

open-source · agent-frameworks
30.8k Stars · +2,010 Stars/month · 0 Releases (6m)

Star Growth: +428 (1.4%), Mar 27 – Apr 1

Overview

Claude Code Router is a routing tool that lets developers keep Claude Code as their base infrastructure while intelligently routing requests to different AI models and providers. It supports six major AI providers (OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, SiliconFlow) and can dynamically select the best-suited model by task type (background tasks, deep thinking, long-context processing). With the `/model` command, users can switch models in real time inside Claude Code without restarting the session. The tool also provides request/response transformers, supports a custom plugin system, and integrates with GitHub Actions to bring AI capabilities to CI/CD pipelines. With over 30k GitHub stars, it is a mature and active open-source project.

Deep Analysis

Key Differentiator

vs using Claude Code directly: routes different request types (background, thinking, long context) to the optimal model from any supported provider, cutting costs while preserving Claude Code's UX — with dynamic model switching and a plugin system.

Capabilities

  • Route Claude Code requests to different LLM models
  • Multi-provider support (OpenRouter, DeepSeek, Ollama, Gemini, etc.)
  • Request/response transformation with plugin system
  • Dynamic model switching via /model command
  • CLI model management
  • GitHub Actions integration
  • Preset configuration management
  • Web UI for configuration

🔗 Integrations

Claude Code · OpenRouter · DeepSeek · Ollama · Google Gemini · Volcengine · SiliconFlow · ModelScope · DashScope · GitHub Actions

Best For

  • Claude Code users wanting to use cheaper/alternative models for different task types
  • Teams optimizing LLM costs by routing background tasks to cheaper models
  • Developers needing multi-provider flexibility within Claude Code

Not Ideal For

  • Users not using Claude Code
  • Teams that only use Anthropic models

Languages

TypeScript/JavaScript

Deployment

npm install -g · Local proxy server

Pricing Detail

Free: Fully free and open-source
Paid: N/A - bring your own model API keys

Known Limitations

  • Only works with Claude Code CLI
  • Requires separate LLM API keys for each provider
  • Proxy approach may introduce latency
  • Configuration requires understanding of model provider APIs

Pros

  • + Seamless switching across six major AI providers, selecting the best-suited model for each task
  • + Dynamic model switching and CLI management — simple to operate, with real-time adjustment
  • + Extensible plugin system and request transformers allow deep customization and integration with existing workflows

Cons

  • - Depends on Claude Code as its base framework, adding environment-setup complexity
  • - Requires manually configuring API keys and parameters for each provider
  • - As an intermediate layer, it may introduce extra latency and additional points of failure

Use Cases

  • AI development teams that need different models for different task types (coding, analysis, creative work)
  • CI/CD automation pipelines that want to integrate multiple AI providers' capabilities into GitHub Actions
  • Enterprise AI application development that needs flexible model switching to optimize cost and performance

Getting Started

1. Install dependencies: install Claude Code first (`npm install -g @anthropic-ai/claude-code`), then the router (`npm install -g @musistudio/claude-code-router`).
2. Configure: create a `~/.claude-code-router/config.json` file and add each AI provider's API key along with your routing rules.
3. Start using it: switch models inside Claude Code with the `/model` command, or manage model configuration with the `ccr model` CLI command.
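
The configuration file created in the setup steps above might look like the following sketch. Field names follow the `Providers`/`Router` schema from the project's README; the provider entries, API key, and model identifiers shown here are illustrative placeholders, and the exact schema may vary between versions:

```json
{
  "Providers": [
    {
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-your-key-here",
      "models": ["deepseek-chat", "deepseek-reasoner"]
    },
    {
      "name": "ollama",
      "api_base_url": "http://localhost:11434/v1/chat/completions",
      "api_key": "ollama",
      "models": ["qwen2.5-coder:latest"]
    }
  ],
  "Router": {
    "default": "deepseek,deepseek-chat",
    "background": "ollama,qwen2.5-coder:latest",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "deepseek,deepseek-chat"
  }
}
```

Each `Router` entry maps a task type (default, background, deep thinking, long context) to a `provider,model` pair, which is what enables routing cheap background work to a local Ollama model while sending reasoning-heavy requests elsewhere.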
