Universal Claude API Proxy for Multiple Providers
High-performance reasoning models with excellent cost efficiency. Perfect for complex coding tasks.
Industry-leading GPT models including GPT-4o and GPT-4o-mini for diverse AI applications.
Advanced Chinese AI models with strong multilingual capabilities and long context support.
Access to multiple AI models through a single API, including Claude, GPT, and open-source models.
Model proxy service that provides access to Claude and other models. Registering grants $100 in free credits.
Chinese AI infrastructure platform providing access to various domestic and international models.
Supports OpenRouter and AnyRouter by default. You can use them immediately with your API key.
Self-deployment is required for DeepSeek, OpenAI, Kimi, SiliconFlow, and other providers; you must deploy your own instance to configure them.
Get the official Claude Code CLI tool from Anthropic.
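If you use npm, a typical install looks like the following (@anthropic-ai/claude-code is Anthropic's official CLI package; check Anthropic's docs if your setup differs):

```bash
# Install the Claude Code CLI globally
npm install -g @anthropic-ai/claude-code

# Confirm it is on your PATH
claude --version
```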
Sign up with any supported provider and obtain your API key:
Add these to your shell config (~/.bashrc or ~/.zshrc):
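A minimal sketch, assuming your proxy exposes an Anthropic-compatible endpoint and that Claude Code reads the standard ANTHROPIC_BASE_URL and ANTHROPIC_AUTH_TOKEN variables; the URL and key below are placeholders:

```bash
# Point Claude Code at the proxy instead of the default Anthropic endpoint
export ANTHROPIC_BASE_URL="https://your-proxy.example.workers.dev"

# The API key from your chosen provider, forwarded through the proxy
export ANTHROPIC_AUTH_TOKEN="sk-your-provider-api-key"
```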
Then reload your shell:
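For example, depending on which config file you edited:

```bash
# Apply the new environment variables to the current session
source ~/.bashrc   # or: source ~/.zshrc
```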
Run claude in your terminal and enjoy Claude with your preferred AI provider.
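A quick check, assuming the environment variables above are set; the -p/--print flag runs a single prompt non-interactively:

```bash
# Start an interactive session routed through the configured proxy
claude

# Or run a one-off prompt without entering the interactive UI
claude -p "Summarize the files in this directory"
```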
Self-deploy for maximum data security and to access all AI providers beyond OpenRouter.
Our shared instance only supports OpenRouter and AnyRouter. Self-deploy to use DeepSeek, OpenAI, Kimi, SiliconFlow, and more.
Your API keys and requests never pass through third-party servers when you control the infrastructure.
Configure any provider, custom model mappings, and deployment settings according to your needs.
No request logging or data retention when you deploy your own instance to Cloudflare Workers.
Recommended: Use Wrangler secrets for secure configuration:
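For example, assuming a Cloudflare Workers deployment; the secret names below (such as DEEPSEEK_API_KEY) are illustrative and must match whatever your wrangler.toml and provider configuration expect:

```bash
# Store provider keys as encrypted Worker secrets rather than plaintext vars
npx wrangler secret put DEEPSEEK_API_KEY
npx wrangler secret put OPENAI_API_KEY

# Deploy (or redeploy) the Worker; secrets persist across deployments
npx wrangler deploy
```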