# qsp-mcp

**QSP — relay MCP tools to any OpenAI-compatible local LLM endpoint.**

Named after the Q-signal **QSP** ("Will you relay?"), qsp-mcp relays tool calls between a local LLM and MCP servers. Any model with function calling capability gains access to the full qso-graph tool ecosystem — zero cloud dependency.

```bash
pip install qsp-mcp
```

[GitHub](https://github.com/qso-graph/qsp-mcp) · [PyPI](https://pypi.org/project/qsp-mcp/)

---

## What It Does

qsp-mcp is **not** an MCP server — it's an MCP **client** that bridges the gap between local LLM inference and MCP tools. It connects to your configured MCP servers, translates their tool definitions into OpenAI `tools` format, and manages the conversation loop with your local model.

```
You ──> qsp-mcp ──> Local LLM (llama.cpp, Ollama, vLLM, SGLang)
                │
                ▼ function call
            qsp-mcp
                │
      ┌─────────┼─────────┐
      ▼         ▼         ▼
  solar-mcp  pota-mcp  ionis-mcp
      │         │         │
      ▼         ▼         ▼
 NOAA SWPC  POTA API  IONIS datasets
```
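
The translation step described above is mechanical: MCP describes each tool with `name`, `description`, and a JSON-Schema `inputSchema`, while the OpenAI `tools` format nests the same fields under a `function` key. A minimal sketch of that mapping (the helper name is illustrative, not qsp-mcp's actual API):

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Map one MCP tool definition onto the OpenAI `tools` schema.

    MCP exposes a tool as {name, description, inputSchema}; the OpenAI
    chat-completions API expects the same data nested under
    {"type": "function", "function": {...}}.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }
```

The resulting list is what gets sent as the `tools` parameter on each chat-completions request.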

---

## Quick Start

### Interactive Mode

```bash
qsp-mcp --config ~/.config/qsp-mcp/config.json
```

### Single Query

```bash
qsp-mcp --query "What bands are open from DN13 to JN48 right now?"
```

### Direct Endpoint

```bash
qsp-mcp --endpoint http://localhost:8000/v1/chat/completions --api-key sk-xxx
```

---

## Configuration

The config format is **Claude Desktop compatible** — copy your existing `mcpServers` block directly. The `bridge` section is qsp-mcp-specific.

```json
{
  "mcpServers": {
    "solar": {
      "command": "solar-mcp"
    },
    "pota": {
      "command": "pota-mcp"
    },
    "wspr": {
      "command": "wspr-mcp"
    }
  },
  "bridge": {
    "endpoint": "http://localhost:8000/v1/chat/completions",
    "api_key": "sk-xxx",
    "model": "your-model-name",
    "temperature": 0.3,
    "system_prompt": "You are an expert ham radio operator and RF engineer.",
    "max_tool_calls_per_turn": 5,
    "profiles": {
      "contest": {
        "servers": ["n1mm", "ionis", "solar", "wspr"],
        "temperature": 0.2,
        "system_prompt": "You are a contest advisor. Be concise."
      },
      "propagation": {
        "servers": ["ionis", "solar", "wspr"],
        "temperature": 0.3
      },
      "full": {
        "servers": "*",
        "temperature": 0.3
      }
    },
    "server_timeouts": {
      "solar": 10,
      "qrz": 5
    }
  }
}
```

Default config location: `~/.config/qsp-mcp/config.json`
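
The `max_tool_calls_per_turn` setting caps how many tool invocations the bridge will relay before ending the turn. A minimal sketch of such a loop, with the model call and the MCP dispatch injected as callbacks (all names here are illustrative assumptions, not qsp-mcp's internals):

```python
import json

def run_turn(messages, tools, complete, call_tool, max_tool_calls=5):
    """One conversation turn: ask the model, execute any tool calls it
    requests, feed the results back, and repeat until the model answers
    in plain text or the per-turn budget is spent.

    complete(messages, tools) -> assistant message dict (OpenAI format)
    call_tool(name, args)     -> result string from an MCP server
    """
    for _ in range(max_tool_calls):
        msg = complete(messages, tools)
        messages.append(msg)
        if not msg.get("tool_calls"):
            return msg["content"]  # plain-text answer: turn is done
        for tc in msg["tool_calls"]:
            result = call_tool(tc["function"]["name"],
                               json.loads(tc["function"]["arguments"]))
            # OpenAI format: tool results go back as role="tool" messages
            messages.append({"role": "tool",
                             "tool_call_id": tc["id"],
                             "content": result})
    raise RuntimeError("max_tool_calls_per_turn exceeded")
```

Because the loop is driven entirely through the two callbacks, it works unchanged against any OpenAI-compatible endpoint.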

---

## CLI Options

| Option | Description |
|--------|-------------|
| `-c, --config PATH` | Config file path |
| `-e, --endpoint URL` | LLM endpoint URL (overrides config) |
| `-k, --api-key KEY` | API key for the LLM endpoint |
| `-m, --model NAME` | Model name (overrides config) |
| `-p, --profile NAME` | Tool profile (contest, dx, propagation, full) |
| `-q, --query TEXT` | Single query mode — ask one question and exit |
| `--enable-writes` | Enable write-capable tools (disabled by default) |
| `--list-tools` | List available tools and exit |
| `--version` | Show version |

### Interactive Commands

| Command | Action |
|---------|--------|
| `/tools` | List available tools |
| `/help` | Show help |
| `quit` | Exit (also: `exit`, `q`, `73`) |

---

## Profiles

Profiles limit which servers and tools are available per session, reducing context size and improving tool selection accuracy on smaller models.

| Profile | Servers | Use Case |
|---------|---------|----------|
| `contest` | n1mm, ionis, solar, wspr | Contest operation — band advice, conditions |
| `dx` | ionis, solar, qrz, pota, sota, hamqth | DX hunting — lookups, spots, propagation |
| `propagation` | ionis, solar, wspr | Propagation analysis — conditions, forecasts |
| `full` | All servers | Everything available |

Select a profile: `qsp-mcp --profile contest`

---

## Supported LLM Endpoints

Any endpoint implementing the OpenAI chat completions API with function calling:

| Engine | Tested | Notes |
|--------|--------|-------|
| [llama.cpp](https://github.com/ggml-org/llama.cpp) | Yes | `--api-key` flag for auth |
| [Ollama](https://ollama.ai) | Compatible | OpenAI-compatible endpoint |
| [vLLM](https://github.com/vllm-project/vllm) | Compatible | Prefix caching recommended |
| [SGLang](https://github.com/sgl-project/sglang) | Compatible | Prefix caching recommended |

---

## Security

- **Write protection**: Write-capable tools are disabled by default. Use `--enable-writes` to opt in per session.
- **Credential isolation**: Credentials stay inside MCP servers (OS keyring). qsp-mcp never sees or handles credentials for external services.
- **No shell execution**: Tool arguments are never passed through a shell — no eval, no command injection surface.
- **Audit log**: Every tool call is logged with timestamp, tool name, and result status.

---

## Design Principles

qsp-mcp is a **strict, stateless pipe**:

- No caching, no shared state, no health polling
- All state lives in MCP servers
- All inference optimization lives in the inference server (prefix caching, KV-cache)
- MCP servers handle their own degradation — qsp-mcp passes results blindly
- Multiple qsp-mcp instances can point at a single inference server

---

## Dependencies

- `mcp>=1.0` — MCP client SDK
- `httpx>=0.27` — HTTP client
- Python 3.10+
- No torch, no numpy, no heavy dependencies