Keep your coding agents running when you close the lid. Ship them to your team. Sell them to your customers. Humr gives Claude Code, Codex, Gemini CLI, or pi.dev an isolated Kubernetes pod, a credential-injecting proxy, a scheduler, and a Slack channel.
- **Isolated by design** — Each agent runs in its own sandbox: a separate pod, process, and filesystem. API keys live in a proxy; the agent never sees them. Network access is locked to destinations you've approved. That structurally contains two of the three big agent-security risks. Security model →
- **Always-on scheduling** — Cron lives on the platform, not on your laptop. From the agent's perspective, scheduled tasks look identical to human messages. Workspace and conversation history persist across restarts.
- **Built for team collaboration** — Until now, coding agents have been 1:1: you and your copilot. Humr unlocks N people collaborating with one or many agents, multi-tenant from the ground up.
- **Slack-native channels** — One Slack app, unlimited agents, per-thread routing. Your agents live where your team already works.
- **Bring your own agent** — Claude Code and pi.dev ship as built-in templates. Codex, Gemini CLI, or anything that speaks ACP works too. No lock-in to a single vendor's SDK or cloud.
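The credential-injecting proxy can be sketched in a few lines: the agent's outbound requests carry no secret, and a proxy the agent cannot bypass attaches the real API key before forwarding upstream. This is a minimal illustrative sketch, not Humr's implementation; the upstream URL, key value, and function names are assumptions:

```python
from urllib.request import Request, urlopen

UPSTREAM = "https://api.example-provider.com"  # hypothetical model-provider endpoint
API_KEY = "sk-held-by-proxy-only"              # in practice, loaded from a secret store

def inject_credentials(agent_headers: dict) -> dict:
    """Return a copy of the agent's headers with the real API key attached.

    Any Authorization header the agent sent is dropped, so the agent's
    process never needs to hold (or even see) the working credential.
    """
    forwarded = {k: v for k, v in agent_headers.items() if k.lower() != "authorization"}
    forwarded["Authorization"] = f"Bearer {API_KEY}"
    return forwarded

def forward(path: str, body: bytes, agent_headers: dict):
    """Forward one agent request upstream with credentials injected."""
    req = Request(UPSTREAM + path, data=body, headers=inject_credentials(agent_headers))
    return urlopen(req)
```

Because the key only exists inside the proxy process, a compromised or misbehaving agent can exfiltrate at most its own sandbox, never the credential itself.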
```shell
git clone https://github.com/kagenti/humr && cd humr
```

Open your favorite AI coding agent in the repo and try:
```
Walk me through how Humr works step by step. I want to do a demo for myself.
Explain how things work on the way. Help me connect a model provider, create
an instance, add a connection to GitHub, and chat with an agent.
```
Once you're comfortable, go deeper:
```
Now show me the advanced stuff. Set up a Slack channel integration, create a
scheduled job, build a long-living agent with a heartbeat, and wire up an
MCP server.
```
Your agent has full context of the codebase, architecture decisions, and cluster commands.
Prerequisites: mise, a Docker-compatible runtime (Docker Desktop, Rancher Desktop, etc.), macOS or Linux.
```shell
mise install              # install toolchain + deps
mise run cluster:install  # create local k3s cluster + deploy Humr
```

Open humr.localhost:4444 (login: dev / dev), create an instance from a template, and start chatting. See the guide for cluster commands, credential setup, and Slack integration.
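The scheduling model described above, where a cron-fired task reaches the agent as an ordinary message, can be illustrated with a small sketch. This is not Humr's code; the `Message` shape and the inbox queue are hypothetical stand-ins for the platform's internals:

```python
from dataclasses import dataclass

@dataclass
class Message:
    # The agent sees only role + text, so a cron-fired task is
    # indistinguishable from a message typed by a teammate.
    role: str
    text: str

inbox: list[Message] = []  # stands in for the agent's conversation queue

def human_says(text: str) -> None:
    inbox.append(Message(role="user", text=text))

def fire_scheduled_task(text: str) -> None:
    # Same constructor, same role: only the platform knows this
    # message came from a schedule rather than a person.
    inbox.append(Message(role="user", text=text))

human_says("Review the open PRs, please.")
fire_scheduled_task("Daily 09:00 task: summarize yesterday's CI failures.")
```

Because both paths produce identical messages, anything the agent can do in a live conversation it can also do unattended on a schedule, with the same persisted workspace and history.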
- Guide — credential setup, Slack integration, development workflow, architecture overview
- Security model — the three big risks when running AI agents, how Humr handles each, and what's still unsolved
- Why Humr exists — the three problems every agent hits in production, how Humr solves each, and a 5-minute walkthrough
