One local endpoint for Anthropic, Gemini, and OpenAI clients, backed by the provider of your choice.
Vekil is a Go reverse proxy that exposes Anthropic, Gemini, and OpenAI-compatible APIs behind one local endpoint. Run it in zero-config mode against GitHub Copilot, or route selected models to providers like Azure OpenAI and OpenAI Codex. The client-facing API surface stays the same while model ownership is configured behind the proxy.
Use your GitHub Copilot subscription with Claude Code, point the Codex CLI at Azure OpenAI, or send Gemini-CLI traffic through any OpenAI-compatible upstream, all without touching client config. Swap providers behind the proxy; your tools never notice.
- Anthropic Messages API: drop-in compatible with Claude clients
- Gemini API: Generate Content, Stream Generate Content, and Count Tokens
- OpenAI Chat Completions and Responses APIs, including Codex websocket bridging
- Multi-provider routing across GitHub Copilot, Azure OpenAI, and OpenAI Codex
- Codex compatibility shims for compaction and memory summarization
- Streaming, tool use, parallel tool calls, compressed request bodies, and auth/token caching
Grab a binary from GitHub Releases, or run the container from GHCR:

```shell
docker run -p 1337:1337 \
  -v ~/.config/vekil:/home/nonroot/.config/vekil \
  ghcr.io/sozercan/vekil:latest
```

On Apple Silicon Macs, install the native tray app via Homebrew:

```shell
brew install --cask sozercan/repo/vekil
```

The app is not signed; clear quarantine with `xattr -cr /Applications/Vekil.app`. Manual `vekil-macos-arm64.zip` downloads are also on Releases. See Tray App (macOS/Linux).
For explicit provider routing, start the proxy with `--providers-config /path/to/providers.{json,yaml}`.
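The real schema is documented in Configuration. Purely as an illustration, a providers file might map model IDs to upstreams along these lines (every field name below is hypothetical, not Vekil's actual schema):

```yaml
# Hypothetical sketch only; see the Configuration docs for the real schema.
providers:
  - name: azure-openai
    models: [gpt-5.5]
  - name: copilot
    models: ["*"]   # fallback for every other model ID
```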
First-run auth depends on your providers:
- Copilot: `vekil login` uses Vekil-managed GitHub device-code sign-in; first proxy startup starts the same flow when needed. To use your current GitHub CLI account instead, opt in with `vekil login --github-cli` (or `--gh`). `vekil logout` clears cached auth and disables future silent `gh` reuse until you opt in again. `COPILOT_GITHUB_TOKEN` remains the explicit non-interactive override.
- OpenAI Codex: requires `codex login` so `~/.codex/auth.json` exists. In Docker, mount your Codex home into `CODEX_HOME` (default `/home/nonroot/.codex`).
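For example, the container invocation from the install section extended with a Codex home mount might look like this (paths follow the defaults stated above):

```shell
docker run -p 1337:1337 \
  -v ~/.config/vekil:/home/nonroot/.config/vekil \
  -v ~/.codex:/home/nonroot/.codex \
  ghcr.io/sozercan/vekil:latest
```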
For full configuration and routing details, see Getting Started and Configuration.
Full documentation lives under docs/:
| Getting Started | Install, run, first auth |
| Configuration | Flags, env vars, provider routing |
| Client Examples | Copy-paste snippets per client |
| API Reference | Endpoint behavior and compatibility |
| Architecture | Package layout and design notes |
| Tray App | macOS/Linux menubar usage |
| Development | Build, test, benchmarks, CI |
Use any public model ID exposed by `/v1/models`; your client config is the same regardless of which provider owns the model upstream.
Claude Code:

```shell
env ANTHROPIC_BASE_URL=http://localhost:1337 \
  ANTHROPIC_API_KEY=dummy \
  claude --model claude-sonnet-4 --print --output-format text "Reply with exactly PROXY_OK"
```

Codex CLI:

```shell
env OPENAI_API_KEY=dummy \
  OPENAI_BASE_URL=http://localhost:1337/v1 \
  codex exec --skip-git-repo-check -m gpt-5.5 "Reply with exactly PROXY_OK"
```

Gemini CLI:

```shell
env GEMINI_API_KEY=dummy \
  GOOGLE_GEMINI_BASE_URL=http://localhost:1337 \
  GOOGLE_GENAI_API_VERSION=v1beta \
  GEMINI_CLI_NO_RELAUNCH=true \
  gemini -m gemini-2.5-pro -p "Reply with exactly PROXY_OK" -o json
```