Ollama #6
Replies: 6 comments
hey, thanks! glad you're trying it out. ollama isn't bundled in the container right now, but you can connect to it if you're running it on your host or in another container on the same network. just set the OLLAMA_HOST environment variable in your compose file to point to wherever ollama is running (e.g. your host machine's address and port). some of the bundled CLIs like taskmaster can use ollama models if you configure them to point at your ollama endpoint. adding native ollama support inside the container is something i could look into for a future release. i'll leave this open as a feature request. let me know how the install goes this weekend!
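for reference, a compose fragment along those lines might look like this (a sketch only: the `holyclaude` service name and the host address are assumptions, so point OLLAMA_HOST at wherever your Ollama instance actually listens):

```yaml
services:
  holyclaude:              # assumed service name
    # ...existing image/ports config...
    environment:
      # host.docker.internal resolves to the Docker host on Docker Desktop;
      # on plain Linux, use the host's LAN IP or an extra_hosts mapping instead
      - OLLAMA_HOST=http://host.docker.internal:11434
```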
Hello, your Holy Claude project is great! I'm trying to use it with Ollama, but I'm having trouble. I tried modifying the Docker Compose file by adding:

This doesn't work in the web interface; I get this error:

```
❯ /model qwen3.5:cloud
Unable to validate model: Could not resolve authentication method. Expected either apiKey or authToken to be set. Or for one of the "X-Api-Key" or "Authorization" headers to be explicitly omitted
```

However, if I go into the container with `docker exec -it holyclaude bash` and run `claude` (so inside the container), I can use Ollama just fine \o/

I also tried setting it up as you suggested:

There, `claude` doesn't even work in the container /o\

Could you please help us get the Holy Claude web interface working with Ollama? We're perfectly happy to modify docker-compose with environment variables; we don't need a native Ollama mode, but for now the Holy Claude web UI doesn't take them into account! Thanks again for your wonderful work!
couple things here. first, we did find that the s6 service script that launches CloudCLI strips environment variables: Docker Compose env vars exist in the container but never reach the CloudCLI process. fixing that in the next release so your compose env vars actually make it through.

second, on the ollama side specifically: Claude Code uses the Anthropic SDK, which speaks the Anthropic Messages API format, while Ollama exposes an OpenAI-compatible API. these are two different formats, so even if the env vars passed through correctly, pointing Claude Code straight at an OpenAI-style endpoint wouldn't work on its own. you mentioned it works when you exec into the container and run claude directly, so something in your setup must be bridging the two. the env var passthrough fix is coming either way, so things like Bedrock and Vertex will work through the web UI, but for the Ollama use case we need to understand your routing before we can say more.
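to make the mismatch concrete, here's a rough side-by-side of the two request shapes (illustrative payloads only, not this project's actual code; model names are just examples):

```python
import json

# Anthropic Messages API request (what the Anthropic SDK emits):
# POST /v1/messages with an "x-api-key" header; max_tokens is required.
anthropic_request = {
    "model": "claude-sonnet-example",       # illustrative model name
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}],
}

# OpenAI-compatible request (what Ollama's /v1/chat/completions expects):
# POST /v1/chat/completions with an "Authorization: Bearer ..." header.
openai_request = {
    "model": "qwen3.5:cloud",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Similar "messages" idea, but different endpoints, auth headers, and
# required fields: a client hardwired to one format can't talk to the other.
print(json.dumps(anthropic_request, indent=2))
print(json.dumps(openai_request, indent=2))
```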
Hello, thanks for replying so quickly! After analyzing your Dockerfile, I understood that it was the s6 service script starting CloudCLI that wasn't receiving the Docker environment variables. That's fixed now.

It's true that Ollama exposes, by default, an OpenAI-compatible API that isn't compatible with Anthropic's, but since the beginning of the year Ollama is also compatible with the Anthropic API \o/ https://ollama.com/blog/claude That's why, when I started Claude inside the Docker container, it worked fine with Ollama.

So now, thanks to your fix for passing variables from docker-compose to CloudCLI, Holy Claude can work without needing a Claude Pro or Max subscription. For those who don't have an Ollama server with a GPU but still want to try Holy Claude with Ollama, there's the free option of Ollama Cloud! You just need to install the Ollama application on your computer, log in with `ollama login`, and then modify the docker-compose.yaml file by adding the two ANTHROPIC_* environment variables with the IP address of the computer where the Ollama application is running:

```yaml
...
environment:
  # --- Required ---
```
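a sketch of what that compose fragment might look like (the host IP is a placeholder, and the exact variable values should be checked against the Ollama blog post; the token here is an assumption, since Ollama validates your `ollama login` session rather than this value):

```yaml
services:
  holyclaude:
    environment:
      # --- Required ---
      # Point Claude Code's Anthropic SDK at the machine running the
      # Ollama app (placeholder IP; Ollama listens on 11434 by default).
      - ANTHROPIC_BASE_URL=http://192.168.1.50:11434
      # Dummy token so the SDK's auth check passes (assumed value).
      - ANTHROPIC_AUTH_TOKEN=ollama
```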
Once Holy Claude has started, in the Claude window simply choose an Ollama Cloud model, for example: `/model qwen3.5:cloud`. If you have already used Holy Claude with an Anthropic subscription, it's best to start from scratch first by completely deleting the data folder!

Would it be a good idea to add this information to your excellent documentation?

By the way (I don't want to create an issue since it's specific to this context): when using Holy Claude with Ollama, and therefore without logging in to Claude, the 'Web Terminal' button in the top right corner is no longer accessible. That's unfortunate, because it would be nice to have easier access to the Holy Claude Docker container terminal instead of prefixing each command with a '!'. Thanks again for your fantastic work!
Great find on the Ollama Anthropic API support, didn't know they added that. Good to hear the env var fix is working for you. Will look into adding Ollama setup instructions to the docs since it's a legit use case now. On the Web Terminal button disappearing without a Claude login: that's a CloudCLI UI issue where it gates certain features behind authentication. Will look into it.
Hello, thank you so much for all the corrections! I noticed that when I use Ollama without being logged into Claude, I just need to refresh the browser page to get the 'Terminal' icon back in the top right corner, and it works perfectly. Thank you also for the excellent Ollama documentation you wrote! Best regards
Hey, thank you for your great work, I'll install it this weekend :)
Is there a way to play with ollama on this config?