Launch timeline · this launch window
Submitted: Apr 26
Approved: Apr 26
Live: now
Window closes (final rank locked): May 10
About this build
A lightweight proxy that routes Claude Code's Anthropic API calls to NVIDIA NIM (40 req/min free), OpenRouter (hundreds of models), DeepSeek (direct API), LM Studio (fully local), llama.cpp (local, with Anthropic-compatible endpoints), or Ollama (fully local, native Anthropic Messages).
- Transparent proxy: Claude Code sends standard Anthropic API requests; the proxy forwards them to your configured provider.
- Per-model routing: Opus / Sonnet / Haiku requests resolve to their model-specific backends, with MODEL as the fallback.
- Request optimization: 5 categories of trivial requests (quota probes, title generation
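The per-model routing above can be sketched as a small lookup: each Claude model family resolves to its own configured backend, falling back to MODEL when no family-specific override is set. This is a minimal illustration, not the proxy's actual implementation; the environment-variable names (OPUS_MODEL, SONNET_MODEL, HAIKU_MODEL) are assumptions.

```python
import os

# Hypothetical per-family configuration keys (illustrative names only).
FAMILY_VARS = {
    "opus": "OPUS_MODEL",
    "sonnet": "SONNET_MODEL",
    "haiku": "HAIKU_MODEL",
}

def resolve_backend_model(requested: str, env=os.environ) -> str:
    """Map an incoming Anthropic model name to a configured backend model.

    Family-specific overrides win; MODEL is the catch-all fallback; if
    neither is set, the requested name passes through unchanged.
    """
    for family, var in FAMILY_VARS.items():
        if family in requested.lower() and env.get(var):
            return env[var]
    return env.get("MODEL", requested)
```

With this shape, a request for any `claude-sonnet-*` model would be rewritten to whatever backend the SONNET_MODEL variable names before the proxy forwards the call to the configured provider.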
The stack · what it's built on
Models in rotation: opus 4.7 (primary)
Runtime (shipped on): claude-code