This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
OpenMind is a Tauri 2 desktop application providing an AI chat interface. It connects a React 19 frontend to local AI models served by Ollama at http://10.0.0.155:18080.
```bash
# Development (Tauri window with hot reload)
pnpm tauri dev

# Frontend only (no native window)
pnpm dev

# Production build
pnpm tauri build

# Verify Ollama/OpenCode integration
./diagnostics/test-opencode-ollama.sh

# Sync available Ollama models to OpenCode config
./diagnostics/sync-ollama-models.sh

# LAN discovery + connectivity verifier + curl command map
./diagnostics/curlllama.sh
```

There are no automated tests; the shell scripts serve as integration tests.
The project follows standard Tauri 2 architecture:
- `src/` — React/TypeScript frontend (rendered in a WebView)
- `src-tauri/` — Rust backend exposing Tauri commands to the frontend
- `App.tsx` — Root component. Manages: streaming chat, model selector, localStorage history, 15 s server polling, diagnostic banner, clear button.
- `lib/opencode-client.ts` — All Ollama HTTP calls:
  - `sendMessageStream()` — streams tokens via `POST /api/chat` with full conversation history; uses `ReadableStream`
  - `getServerStatus()` — health probe with 5 s timeout, returns latency + error detail
  - `getAvailableModels()` — fetches model list from `GET /api/tags`
- `components/` — Reusable UI components (`Layout`, `Button`, `Panel`, `QuitButton`), each with a co-located `.css` file, exported via `components/index.ts`.
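To make the streaming path concrete, here is a hedged sketch of how a client like `sendMessageStream()` could consume Ollama's NDJSON stream from `POST /api/chat` over a `ReadableStream`. The `ChatChunk` shape follows Ollama's documented streaming response; `extractTokens`, `streamChat`, and their signatures are illustrative names, not the repository's actual API.

```typescript
// Shape of one NDJSON chunk in Ollama's streaming POST /api/chat response.
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Pure helper: split a buffer into complete JSON lines, returning the tokens
// found plus any trailing partial line to carry over into the next read.
function extractTokens(buffer: string): { tokens: string[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? ""; // last element is an incomplete line (or "")
  const tokens = lines
    .filter((l) => l.trim().length > 0)
    .map((l) => (JSON.parse(l) as ChatChunk).message?.content ?? "");
  return { tokens, rest };
}

// Roughly how the streaming call could consume the response body, sending
// the full conversation history with each request.
async function streamChat(
  baseUrl: string,
  model: string,
  history: { role: string; content: string }[],
  onToken: (t: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: history, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`chat failed: ${res.status}`);
  const reader = res.body.getReader(); // ReadableStream reader
  const decoder = new TextDecoder();
  let carry = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const { tokens, rest } = extractTokens(
      carry + decoder.decode(value, { stream: true }),
    );
    carry = rest;
    tokens.forEach(onToken);
  }
}
```

The carry-over buffer matters because a network read can end mid-line; parsing only complete lines keeps `JSON.parse` from throwing on a truncated chunk.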
| State | Purpose |
|---|---|
| `messages` | Conversation history; persisted to localStorage under key `openmind-messages` |
| `selectedModel` | Active model; synced with model selector dropdown |
| `serverStatus` | `ServerStatus` from last health probe |
| `models` | Model list fetched from Ollama when connected |
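The `messages` persistence row can be sketched as a pair of pure helpers around the localStorage round-trip. Only the `openmind-messages` key comes from the table; the `Message` shape and the hook wiring in the comment are assumptions.

```typescript
// Storage key from the state table; the Message shape is an assumption.
const STORAGE_KEY = "openmind-messages";

interface Message {
  role: "user" | "assistant";
  content: string;
}

function serializeMessages(messages: Message[]): string {
  return JSON.stringify(messages);
}

function deserializeMessages(raw: string | null): Message[] {
  if (!raw) return [];
  try {
    return JSON.parse(raw) as Message[];
  } catch {
    return []; // corrupted history falls back to an empty conversation
  }
}

// In App.tsx this could plausibly be wired up as:
//   const [messages, setMessages] = useState(() =>
//     deserializeMessages(localStorage.getItem(STORAGE_KEY)));
//   useEffect(() => {
//     localStorage.setItem(STORAGE_KEY, serializeMessages(messages));
//   }, [messages]);
```

Keeping the (de)serialization pure makes the fallback behavior (missing or corrupted history) easy to test without a browser.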
Rust commands exposed via `invoke()` from `@tauri-apps/api/core`:

- `shutdown_app` — gracefully quits
- `greet` — test command
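Calling these from the frontend could look like the sketch below. In the real app `invoke` is imported from `@tauri-apps/api/core`; it is typed and injected here so the wrappers can be exercised outside the Tauri runtime. The `name` argument and return string of `greet` are assumptions.

```typescript
// Signature matching @tauri-apps/api/core's invoke; injected for testability.
type Invoke = (cmd: string, args?: Record<string, unknown>) => Promise<unknown>;

// `greet` is described as a test command; its argument shape is an assumption.
async function greet(invoke: Invoke, name: string): Promise<string> {
  return (await invoke("greet", { name })) as string;
}

// Gracefully quit the app via the shutdown_app command.
async function quitApp(invoke: Invoke): Promise<void> {
  await invoke("shutdown_app");
}
```

In `App.tsx` the real wiring would be `import { invoke } from "@tauri-apps/api/core";` followed by `greet(invoke, "OpenMind")` or `quitApp(invoke)`.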
- Vite dev server: port `1420` (required by Tauri)
- Tauri `productName`: `openmind`, window title: `OpenMind`
- Tauri app identifier: `boardroom.pythai.net`
- TypeScript: strict mode, ES2020 target
- Ollama server: `http://10.0.0.155:18080` (configured in `defaultConfig` in `opencode-client.ts`)
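A plausible shape for `defaultConfig` in `lib/opencode-client.ts`, collecting the values listed above in one place. Only the server URL, the 5 s probe timeout, and the 15 s polling interval come from this document; the field names and the `OpenCodeConfig` type are assumptions.

```typescript
// Hypothetical config shape; field names are assumptions.
interface OpenCodeConfig {
  baseUrl: string;         // Ollama server endpoint
  healthTimeoutMs: number; // 5 s health-probe timeout
  pollIntervalMs: number;  // 15 s server polling in App.tsx
}

const defaultConfig: OpenCodeConfig = {
  baseUrl: "http://10.0.0.155:18080",
  healthTimeoutMs: 5_000,
  pollIntervalMs: 15_000,
};
```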