Document Claude Code integration for Docker Model Runner #24243
@@ -258,6 +258,24 @@ print(response.text)

You can find more details in [this Docker Blog post](https://www.docker.com/blog/opencode-docker-model-runner-private-ai-coding/)

## Claude Code

[Claude Code](https://claude.com/product/claude-code) is [Anthropic's](https://www.anthropic.com/) command-line tool for agentic coding. It lives in your terminal, understands your codebase, executes routine tasks, explains complex code, and handles Git workflows through natural language commands.

### Configuration

1. Install Claude Code (see the [docs](https://code.claude.com/docs/en/quickstart#step-1-install-claude-code)).
2. Use the `ANTHROPIC_BASE_URL` environment variable to point Claude Code at DMR. For example, on Mac or Linux, to use the `gpt-oss:32k` model:
Contributor

It might be worth adding a tip about making the env var persistent so users don't have to set it every time:

> **Tip**
>
> To avoid setting the variable each time, add it to your shell profile:
>
> ```bash
> export ANTHROPIC_BASE_URL=http://localhost:12434
> ```
```bash
ANTHROPIC_BASE_URL=http://localhost:12434 claude --model gpt-oss:32k
```

Contributor

The `gpt-oss:32k` model may not be obvious to readers; it could be worth explaining where it comes from. Something like:

> **Tip**
>
> The default context size for `gpt-oss` …
>
> ```console
> $ docker model pull gpt-oss
> $ docker model package --from ai/gpt-oss --context-size 32000 gpt-oss:32k
> ```
>
> Alternatively, models like …

On Windows (PowerShell) you can do it like this:

```powershell
$env:ANTHROPIC_BASE_URL="http://localhost:12434"
claude --model gpt-oss:32k
```

You can find more details in [this Docker Blog post](https://www.docker.com/blog/run-claude-code-locally-docker-model-runner/)
## Common issues

### "Connection refused" errors
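For the "Connection refused" case, a quick local sanity check might look like the sketch below. It assumes the default host-side TCP port 12434; the `docker desktop enable` invocation applies to Docker Desktop only, and the exact flag spelling should be checked against your Docker Desktop version.

```shell
# Confirm Docker Model Runner is running
docker model status

# If host-side TCP access is disabled, enable it on the default port
# (Docker Desktop only; assumed flag spelling)
docker desktop enable model-runner --tcp 12434

# Verify the OpenAI-compatible endpoint responds
curl http://localhost:12434/engines/v1/models
```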
Since every other tool on this page uses the OpenAI-compatible endpoint, it might be worth a one-liner noting that DMR also exposes an Anthropic-compatible API, which is why this integration works differently. Otherwise readers might wonder why this section doesn't use `/engines/v1` like everything else.
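That Anthropic-compatible endpoint can be exercised directly with `curl`. A hedged sketch, assuming DMR serves the Anthropic Messages API under the same base URL that Claude Code uses (the `/v1/messages` path mirrors Anthropic's hosted API and is an assumption, not confirmed by this page):

```shell
# Hypothetical smoke test of DMR's Anthropic-compatible API
# (headers and body follow the Anthropic Messages API shape)
curl http://localhost:12434/v1/messages \
  -H "content-type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "gpt-oss:32k",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```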