A lightweight desktop app for chatting with local LLMs, built with Tauri + React.
- 🪶 Lightweight - Uses Tauri (Rust) instead of Electron
- 💾 Persistence - Conversations saved in local SQLite
- 🔌 Compatible - Works with any OpenAI-compatible API (Ollama, LM Studio, etc.)
- 🖥️ Cross-platform - macOS, Linux, and Windows
- 🤖 Model Manager - Create and manage custom Ollama models
```bash
# Install dependencies
npm install

# Run in development mode
npm run tauri dev

# Build for production
npm run tauri build
```

- Open the app
- Click on "Settings" (⚙️)
- Configure the API URL:
  - Ollama: `http://localhost:11434/v1`
  - LM Studio: `http://localhost:1234/v1`
  - OpenAI: `https://api.openai.com/v1`
- Select the model from the dropdown
- API Key is optional for local APIs
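All three backends above speak the same chat-completions protocol, which is why a single base URL setting is enough. As an illustration (not the app's actual code), a minimal TypeScript sketch of such a request; `baseUrl`, `apiKey`, and the model name are placeholders:

```typescript
// Minimal sketch of a request against an OpenAI-compatible
// /chat/completions endpoint (Ollama, LM Studio, OpenAI).
// baseUrl, apiKey, and the model name are illustrative placeholders.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body the /chat/completions endpoint expects.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

async function chat(
  baseUrl: string,
  apiKey: string | null,
  model: string,
  messages: ChatMessage[]
): Promise<string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  // An API key is only required for hosted services; local servers ignore it.
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;

  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers,
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```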
- Create a `.Modelfile` in the `models/` directory
- Example (`models/my-model.Modelfile`):

  ```
  FROM llama3.2
  PARAMETER temperature 0.7
  PARAMETER num_ctx 2048
  SYSTEM """
  You are a helpful AI assistant.
  """
  ```

- In the app, go to Settings > Manage Models
- Click "Create in Ollama" next to your model
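The Modelfile format is line-oriented: each line starts with a directive (`FROM`, `PARAMETER`, `SYSTEM`), and triple-quoted system prompts may span multiple lines. As a sketch of how such a file can be read (illustrative only, not the app's implementation):

```typescript
// Illustrative parser for the Modelfile directives used in the
// example above (FROM, PARAMETER, SYSTEM); not the app's actual code.

interface ParsedModelfile {
  from: string;
  parameters: Record<string, string>;
  system: string;
}

function parseModelfile(text: string): ParsedModelfile {
  const result: ParsedModelfile = { from: "", parameters: {}, system: "" };
  const lines = text.split("\n");
  for (let i = 0; i < lines.length; i++) {
    const line = lines[i].trim();
    if (line.startsWith("FROM ")) {
      result.from = line.slice(5).trim();
    } else if (line.startsWith("PARAMETER ")) {
      // PARAMETER lines are "PARAMETER <key> <value>".
      const [key, ...rest] = line.slice(10).trim().split(/\s+/);
      result.parameters[key] = rest.join(" ");
    } else if (line.startsWith("SYSTEM ")) {
      // Triple-quoted system prompts continue until the closing """.
      let body = line.slice(7).trim();
      if (body.startsWith('"""')) {
        body = body.slice(3);
        while (i + 1 < lines.length && !body.includes('"""')) {
          body += "\n" + lines[++i];
        }
        body = body.replace('"""', "").trim();
      }
      result.system = body;
    }
  }
  return result;
}
```

Equivalently, `ollama create my-model -f models/my-model.Modelfile` registers the same model from the command line.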
- `llama3.2` - Good balance of speed and quality
- `phi3` - Fast and efficient model
- `mistral` - Great for non-English languages
- Frontend: React + TypeScript + TailwindCSS
- Backend: Rust + Tauri 2.0
- Database: SQLite (via rusqlite)
- Icons: Lucide React
MIT