Improve global + per-instance LLM UX and secret handling #161

Open
bussyjd wants to merge 1 commit into integration-okr-1 from codex/llm-global-granular-ux
Conversation

@bussyjd bussyjd commented Feb 14, 2026

Summary

  • add obol llm status to inspect global llmspy provider state and key presence
  • make OpenClaw cloud setup fail fast when global llmspy configuration fails
  • add per-instance direct/custom provider override paths in interactive OpenClaw setup
  • stop writing literal provider/channel secrets into values-obol.yaml
  • persist instance secret material in values-obol.secrets.json (0600) and sync to a namespace secret on openclaw sync

Details

  • new interactive options for OpenClaw setup:
    • global routed via llmspy (Ollama/OpenAI/Anthropic)
    • direct OpenAI or Anthropic instance override
    • custom OpenAI-compatible endpoint instance override
  • overlays now use env-var wiring for credentials; secret values are injected through secrets.extraEnvFromSecrets
  • sync flow now ensures namespace exists and applies openclaw-user-secrets before helmfile sync
  • import parser now detects/propagates env-var key references and defaults provider env var names
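The env-var wiring for credentials might look roughly like the fragment below. This is an illustrative sketch only: `secrets.extraEnvFromSecrets` is named in this PR, but the other keys and the `openclaw-user-secrets` secret name are assumptions, not the chart's actual schema.

```yaml
# Illustrative overlay shape: the overlay references credentials by env-var
# name only; the values live in a namespace secret, never in this file.
llm:
  provider: openai
  apiKeyEnvVar: OPENAI_API_KEY   # reference only, no literal secret value
secrets:
  extraEnvFromSecrets:
    - openclaw-user-secrets      # applied from values-obol.secrets.json on sync
```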

Validation

  • go test ./...
