3 changes: 3 additions & 0 deletions .gitignore
@@ -13,3 +13,6 @@ typescript/*.js.map

# TypeDoc generated documentation
typescript/docs/

# JetBrains IDE
.idea
1 change: 1 addition & 0 deletions docs/get-started/agents.mdx
@@ -12,6 +12,7 @@ The following agents can be used with an ACP Client:
- [Claude Agent](https://platform.claude.com/docs/en/agent-sdk/overview) (via [Zed's SDK adapter](https://github.com/zed-industries/claude-agent-acp))
- [Cline](https://cline.bot/)
- [Codex CLI](https://developers.openai.com/codex/cli) (via [Zed's adapter](https://github.com/zed-industries/codex-acp))
- [OpenAI Agent](https://developers.openai.com/apps-sdk/quickstart) (via [stdio Bus worker](https://github.com/stdiobus/workers-registry/tree/main/workers-registry/openai-agent))
- [Code Assistant](https://github.com/stippi/code-assistant?tab=readme-ov-file#configuration)
- [Cursor](https://cursor.com/docs/cli/acp)
- [Docker's cagent](https://github.com/docker/cagent)
1 change: 1 addition & 0 deletions docs/get-started/clients.mdx
@@ -79,3 +79,4 @@ These connectors bridge ACP into other environments and transport layers:

- [Aptove Bridge](https://github.com/aptove/bridge) — bridges stdio-based ACP agents to the Aptove mobile client over WebSocket
- [OpenClaw](https://docs.openclaw.ai/cli/acp) — through the [`openclaw acp`](https://docs.openclaw.ai/cli/acp) bridge to an OpenClaw Gateway
- [stdio Bus](https://stdiobus.com) — deterministic stdio-based kernel providing transport-level routing for ACP/MCP-style agent protocols
25 changes: 25 additions & 0 deletions docs/get-started/registry.mdx
@@ -1339,6 +1339,31 @@ Visit [the registry repository on GitHub](https://github.com/agentclientprotocol
<code>0.3.67</code>
</p>
</Card>
<Card
title="stdio Bus"
href="https://github.com/stdiobus/stdiobus"
icon={
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 594.46 144.18">
<g id="Layer_1-2" data-name="Layer 1">
<path style={{fill: '#fff'}} d="M194.62,137.45c-11.94-6.43-19.77-17.16-23.22-29.87-2.21-8.16-2.37-16.32-1.22-24.65,2.35-17.07,13.12-31.21,28.95-37.96,10.79-4.27,22.35-5.12,33.61-2.28,10.62,2.67,18.71,8.95,25.39,18.02l.05-60.37,13.62-.02v142.39s-13.08.12-13.08.12l-.13-19.68c-13.26,21.72-42.7,25.77-63.97,14.32ZM255.34,108.6c4.12-10.59,4.06-22.73-.19-33.2-5.64-13.89-18.4-21.85-33.11-22.36-17.11-.59-32.28,9.58-37.04,26.27-5.25,18.39.84,39.98,18.9,48.58,18.84,8.98,43.22,1.83,51.44-19.29Z"/>
<path style={{ stroke: "#000", strokeMiterlimit: 10, strokeWidth: ".25px", fill: "#ffffff" }} d="M451.92,92.33c0,28.57-23.16,51.73-51.73,51.73s-51.73-23.16-51.73-51.73,23.16-51.73,51.73-51.73,51.73,23.16,51.73,51.73ZM435.96,105.09c2.41-8.46,2.35-17.34.03-25.65-4.46-15.98-18.16-25.88-34.43-26.44-18.78-.65-34.85,11.6-38.13,30.36-1.56,8.91-1.08,18.09,2.65,26.41,3.96,8.87,10.71,15.79,19.86,19.24,20.33,7.68,43.63-1.51,50.03-23.93Z"/>
<path style={{fill: '#fff'}} d="M82.69,112.79c1.04,17.11-11.7,26.82-27.04,29.57s-30.48,1.27-44.56-4.65c-4.08-1.72-7.49-3.71-10.92-6.52l5.97-10.64c14.88,10.19,33.21,13.88,50.65,9.58,6.41-1.58,11.38-6.48,12.27-12.72,1.55-10.93-8.08-14.92-17-16.71l-23.38-4.68c-5.55-1.11-10.9-3.29-15.42-6.36-5.98-4.06-8.91-10.03-9.47-16.99-1.36-17,11.2-27.52,26.7-30.44,15.35-2.89,35.81-.65,48.87,8.31l-6,10.77c-5.05-3.48-10.65-5.91-16.85-7.09-10.19-1.94-24.32-2.78-33.1,3.52-6.5,4.67-8.51,14.39-2.77,20.45,3.52,3.71,8.66,5.46,13.62,6.48l23.68,4.87c5.09,1.05,9.93,3.03,14.3,5.55,6.6,3.82,10,10.21,10.46,17.7Z"/>
<path style={{fill: '#fff'}} d="M154.96,127.01l4.74,9.68c-8.4,7.66-24.46,8.53-34.42,4.8-11.43-4.28-16.34-14.97-16.34-26.76l-.02-61.41-17.98-.06v-11.34s17.95-.02,17.95-.02l.07-22.11h13.59s0,22.1,0,22.1l30.72.04.03,11.34-30.78.02.05,60.95c0,5.01,1.47,9.73,4.75,13.24,6.36,6.8,20.8,5.52,27.63-.47Z"/>
<rect style={{ stroke: "#000", strokeMiterlimit: 10, strokeWidth: ".25px", fill: "#ffffff" }} x="308.54" y="42.03" width="13.62" height="100.76"/>
<path style={{ stroke: "#000", strokeMiterlimit: 10, strokeWidth: ".25px", fill: "#ffffff" }} d="M520.13,130.54c-.07,9.38-8.53,12.22-16.08,12.23h-22.42s.01-44.78.01-44.78h20.71c2.58,0,5.3.59,7.74,1.21,4.36,1.66,7.26,4.88,7.65,9.26.39,4.43-1.3,8.54-5.95,11.07,5,1.7,8.38,5.39,8.33,11.01ZM509.71,114.77c2.12-2.17,2.08-5.36.84-7.77-2.99-5.76-16.13-3.24-22.53-3.8v14.27s14.57-.04,14.57-.04c2.42,0,5.39-.88,7.12-2.66ZM511.66,135.11c2.89-2.58,2.72-7.51,0-9.95-1.75-1.57-4.67-2.39-7.07-2.4l-16.58-.08v14.88s16.16-.03,16.16-.03c2.45,0,5.51-.66,7.48-2.42Z"/>
<path style={{ stroke: "#000", strokeMiterlimit: 10, strokeWidth: ".25px", fill: "#ffffff" }} d="M559.61,108.94v33.8s-5.71.05-5.71.05l-.4-4.6c-3.76,4.25-8.63,5.36-14.01,4.73-6.55-.77-11.63-5.52-11.67-12.4l-.14-21.56h6.11s.13,20.28.13,20.28c.04,5.33,4.15,8.47,9.07,8.43,5.77-.05,10.33-3.99,10.38-10.05l.15-18.66,6.09-.02Z"/>
<path style={{ stroke: "#000", strokeMiterlimit: 10, strokeWidth: ".25px", fill: "#ffffff" }} d="M588.6,141.37c-7.45,3.05-15.8,2-22.83-1.99l2.52-4.96c4.55,3.4,19.51,5.94,19.93-.89.1-1.62-1.25-3.41-3.07-3.79l-10.82-2.29c-2.03-.43-4.33-1.71-5.69-3.27-1.98-2.28-2.27-5.59-1.24-8.67,2.84-8.55,19.05-8.28,25.82-3.95l-2.45,4.79c-4.4-2.97-17.87-4.46-17.83,2.2,0,1.4,1.39,3.28,3.13,3.66l11.05,2.46c4.13.92,6.97,3.74,7.22,7.7.22,3.64-1.6,7.31-5.71,9Z"/>
<circle style={{ stroke: "#000", strokeMiterlimit: 10, strokeWidth: ".25px", fill: "#ffffff" }} cx="315.49" cy="9.98" r="9.86"/>
</g>
</svg>
}
>
Deterministic stdio-based kernel in C providing transport-level routing for
ACP/MCP-style agent protocols
      <p className="text-xs mt-3">
<code>2.0.3</code>
</p>
</Card>
</CardGroup>

## Using the Registry
303 changes: 303 additions & 0 deletions docs/rfds/stdio-bus-agents-transport.mdx
@@ -0,0 +1,303 @@
---
title: "Add stdio Bus Transport Layer and Universal OpenAI Agent"
---

Author(s): [Raman Marozau](https://github.com/morozow)

## Elevator pitch

> What are you proposing to change?

Introduce stdio Bus — a deterministic stdio-based kernel written in C — as a new transport layer for the ACP ecosystem, along with its flagship OpenAI Agent worker that bridges ACP to any OpenAI Chat Completions API-compatible endpoint.

This addition provides:
- A low-level, deterministic routing kernel for stdio-based ACP/MCP protocols
- A universal agent that works with OpenAI, AWS Bedrock, Azure OpenAI, Ollama, vLLM, LiteLLM, and any other Chat Completions API-compatible provider
- A reference implementation demonstrating how to build production-grade ACP agents with proper session management, streaming, and cancellation

## Status quo

> How do things work today and what problems does this cause? Why would we change things?

Currently, ACP agents and clients communicate through various transport mechanisms (stdio, WebSocket, HTTP). However:

1. **No standardized low-level routing kernel**: Each agent implementation handles its own stdio communication, leading to duplicated effort and inconsistent behavior across implementations.

2. **Fragmented LLM provider support**: Developers who want to connect ACP clients to different LLM providers (OpenAI, Bedrock, Azure, local models) must either:
- Use provider-specific agents (if they exist)
- Build custom bridging solutions from scratch
- Maintain multiple agent configurations for different providers

3. **Missing reference implementation**: The ecosystem lacks a well-documented, production-tested example of how to build an ACP agent with proper:
- Multi-turn conversation state management
- SSE streaming with cancellation support
- Graceful shutdown handling
- Comprehensive error handling for various HTTP failure modes

4. **Local model friction**: Running local models (Ollama, vLLM) with ACP clients requires custom integration work, limiting adoption for developers who want to experiment without API costs.

## What we propose to do about it

> What are you proposing to improve the situation?

### 1. Add stdio Bus kernel to the Registry

stdio Bus is a deterministic stdio-based kernel written in C that provides transport-level routing for ACP/MCP-style agent protocols. It acts as a process supervisor and message router, handling:

- Agent process lifecycle management
- NDJSON message framing and routing
- Session ID mapping between clients and agents
- Pool-based agent instance management

### 2. Add OpenAI Agent to the Agents list

The OpenAI Agent is a TypeScript worker that translates ACP protocol messages into OpenAI Chat Completions API calls. It supports any endpoint implementing the Chat Completions API:

| Provider | Base URL |
|----------|----------|
| OpenAI | `https://api.openai.com/v1` (default) |
| AWS Bedrock | `https://{region}.bedrock.amazonaws.com/openai/v1` |
| Azure OpenAI | `https://{resource}.openai.azure.com/openai/deployments/{deployment}` |
| Ollama | `http://localhost:11434/v1` |
| vLLM | `http://localhost:8000/v1` |
| LiteLLM | `http://localhost:4000/v1` |
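
Because every provider in the table speaks the same API shape, switching providers is only a base-URL change. The following is an illustrative sketch (the function name `buildChatRequest` and the exact request shape are assumptions, not the worker's actual API):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Sketch: the same request works against any Chat Completions-compatible
// endpoint -- only baseUrl (and possibly the key) changes.
function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[],
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    // Trim trailing slashes so the path joins cleanly.
    url: `${baseUrl.replace(/\/+$/, "")}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Local servers (Ollama, vLLM) typically ignore the key entirely.
        ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
      },
      body: JSON.stringify({ model, messages, stream: true }),
    },
  };
}

// Pointing at Ollama instead of OpenAI is only a baseUrl change:
const req = buildChatRequest("http://localhost:11434/v1", "", "llama3", [
  { role: "user", content: "hello" },
]);
```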

### 3. Add stdio Bus to Connectors

stdio Bus serves as a transport-level connector, bridging stdio-based agents to various client environments through its TCP interface and pool management.

## Shiny future

> How will things play out once this feature exists?

Once stdio Bus and the OpenAI Agent are part of the ACP ecosystem:

1. **Unified LLM access**: Developers can connect any ACP client (Zed, JetBrains, neovim, etc.) to any OpenAI-compatible API through a single, well-tested agent. Switching from OpenAI to Ollama or Bedrock requires only changing environment variables.

2. **Local development simplified**: Running `ollama serve` + stdio Bus gives developers a fully local ACP setup with no API keys or cloud dependencies.

3. **Production-ready reference**: The OpenAI Agent serves as a reference implementation for building ACP agents, demonstrating:
- Proper ACP protocol handling (initialize, session/new, session/prompt, cancel)
- Streaming response delivery via SSE
- Per-session conversation history
- Graceful cancellation with AbortController
- Comprehensive error handling and logging

4. **Infrastructure foundation**: stdio Bus provides a stable foundation for building more complex agent topologies — multiple agents, load balancing, failover — without modifying individual agent implementations.

## Implementation details and plan

> Tell me more about your implementation. What is your detailed implementation plan?

### Architecture

```mermaid
flowchart TB
subgraph kernel["stdio Bus Kernel (C)"]
direction TB
K1[Process Supervisor]
K2[NDJSON Router]
K3[Pool Manager]
K1 --> K2
K2 --> K3
end

subgraph agent["OpenAI Agent (TypeScript)"]
direction TB

subgraph transport["Transport Layer"]
SR[SessionIdRouter]
ASC[AgentSideConnection<br/>ACP JSON-RPC 2.0]
SR --> ASC
end

subgraph protocol["Protocol Layer"]
OA[OpenAIAgent]

subgraph handlers["Method Handlers"]
H1[initialize]
H2[session/new]
H3[session/prompt]
H4[cancel]
end

OA --> handlers
end

subgraph state["State Management"]
SM[SessionManager]
S1[(Session 1<br/>history + AbortController)]
S2[(Session 2<br/>history + AbortController)]
S3[(Session N<br/>history + AbortController)]
SM --> S1
SM --> S2
SM --> S3
end

subgraph http["HTTP Client Layer"]
CCC[ChatCompletionsClient]
SSE[SSEParser<br/>line-by-line]
CCC --> SSE
end

transport --> protocol
protocol --> state
protocol --> http
end

subgraph providers["OpenAI-Compatible Providers"]
direction LR
P1[OpenAI<br/>api.openai.com]
P2[AWS Bedrock<br/>bedrock.amazonaws.com]
P3[Azure OpenAI<br/>openai.azure.com]
P4[Ollama<br/>localhost:11434]
P5[vLLM<br/>localhost:8000]
P6[LiteLLM<br/>localhost:4000]
end

Client([ACP Client<br/>Zed / JetBrains / neovim]) <-->|TCP| kernel
kernel <-->|"stdin/stdout<br/>NDJSON"| agent
http -->|"POST /chat/completions<br/>stream: true"| providers
providers -->|"SSE<br/>delta.content"| http

style kernel fill:#1a1a2e,stroke:#16213e,color:#eee
style agent fill:#0f3460,stroke:#16213e,color:#eee
style transport fill:#1a1a4e,stroke:#16213e,color:#eee
style protocol fill:#1a1a4e,stroke:#16213e,color:#eee
style state fill:#1a1a4e,stroke:#16213e,color:#eee
style http fill:#1a1a4e,stroke:#16213e,color:#eee
style handlers fill:#252550,stroke:#16213e,color:#eee
style providers fill:#533483,stroke:#16213e,color:#eee

```

### Message Flow

1. Registry Launcher spawns `openai-agent` as a child process with environment variables
2. stdin receives NDJSON messages from stdio Bus kernel
3. SessionIdRouter strips sessionId from incoming messages, restores it on outgoing
4. AgentSideConnection + ndJsonStream handle JSON-RPC 2.0 framing
5. OpenAIAgent dispatches to the appropriate handler (initialize, newSession, prompt, cancel)
6. On prompt: ACP content blocks are converted to OpenAI messages, ChatCompletionsClient sends POST `{baseUrl}/chat/completions` with `stream: true`
7. SSE chunks are parsed line-by-line; `delta.content` tokens are forwarded via `sessionUpdate()`
8. On stream completion (`data: [DONE]`), the full response is saved to session history
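
Step 7 above hinges on a stateless, line-at-a-time SSE parser. A minimal sketch, assuming the simplified return shape below (the real `parseLine` signature may differ):

```typescript
type ParsedLine =
  | { kind: "content"; content: string }
  | { kind: "done" }
  | { kind: "skip" };

// Pure function: no buffering state, so each line can be tested in isolation.
function parseLine(line: string): ParsedLine {
  const trimmed = line.trim();
  // Blank lines and SSE comments (e.g. ": keep-alive") carry no data.
  if (trimmed === "" || trimmed.startsWith(":")) return { kind: "skip" };
  if (!trimmed.startsWith("data:")) return { kind: "skip" };
  const payload = trimmed.slice("data:".length).trim();
  if (payload === "[DONE]") return { kind: "done" };
  try {
    const chunk = JSON.parse(payload);
    const content = chunk?.choices?.[0]?.delta?.content;
    return typeof content === "string"
      ? { kind: "content", content }
      : { kind: "skip" };
  } catch {
    // Invalid JSON: skip the chunk and keep the stream alive.
    return { kind: "skip" };
  }
}
```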

### ACP Protocol Support

| Method | Description | Status |
|--------|-------------|--------|
| `initialize` | Returns agent name, version, capabilities | ✅ Implemented |
| `session/new` | Creates session with unique ID and empty history | ✅ Implemented |
| `session/load` | Not supported (returns error) | ✅ Implemented |
| `authenticate` | No-op (returns void) | ✅ Implemented |
| `session/prompt` | Converts content → OpenAI messages, streams response | ✅ Implemented |
| `cancel` | Aborts in-flight HTTP request via AbortController | ✅ Implemented |

### Agent Capabilities

```json
{
"protocolVersion": "2025-03-26",
"agentInfo": { "name": "openai-agent", "version": "1.0.0" },
"agentCapabilities": {
"promptCapabilities": { "embeddedContext": true }
},
"authMethods": []
}
```

### Content Block Conversion

| ACP Block Type | OpenAI Conversion |
|----------------|-------------------|
| `text` | Text content directly |
| `resource_link` | `[Resource: {name}] {uri}` |
| `resource` | `[Resource: {uri}]\n{text}` |
| `image` | `[Image: {mimeType}]` |
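
The table above can be sketched as a single switch over block types. The block shapes here are simplified from the ACP schema, and the function name is illustrative:

```typescript
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "resource_link"; name: string; uri: string }
  | { type: "resource"; resource: { uri: string; text: string } }
  | { type: "image"; mimeType: string };

function blockToText(block: ContentBlock): string {
  switch (block.type) {
    case "text":
      return block.text;
    case "resource_link":
      return `[Resource: ${block.name}] ${block.uri}`;
    case "resource":
      return `[Resource: ${block.resource.uri}]\n${block.resource.text}`;
    case "image":
      // Image bytes are not forwarded; only a placeholder reaches the model.
      return `[Image: ${block.mimeType}]`;
  }
}
```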

### Configuration

All configuration via environment variables:

| Variable | Default | Description |
|----------|---------|-------------|
| `OPENAI_BASE_URL` | `https://api.openai.com/v1` | Base URL of the Chat Completions API endpoint |
| `OPENAI_API_KEY` | `''` (empty) | API key for authentication |
| `OPENAI_MODEL` | `gpt-4o` | Model identifier |
| `OPENAI_SYSTEM_PROMPT` | (unset) | Optional system prompt prepended to every conversation |
| `OPENAI_MAX_TOKENS` | (unset) | Optional max tokens limit |
| `OPENAI_TEMPERATURE` | (unset) | Optional temperature (float) |
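
Reading the table's variables can be sketched as below. This is a hedged approximation: the worker's actual config module may differ in naming and validation, but the defaults follow the table:

```typescript
interface AgentConfig {
  baseUrl: string;
  apiKey: string;
  model: string;
  systemPrompt?: string;
  maxTokens?: number;
  temperature?: number;
}

// Takes env as a parameter (rather than process.env) to keep it testable.
function loadConfig(env: Record<string, string | undefined>): AgentConfig {
  // Unset or non-numeric values fall back to undefined (provider default).
  const num = (v: string | undefined): number | undefined => {
    const n = v === undefined ? NaN : Number(v);
    return Number.isFinite(n) ? n : undefined;
  };
  return {
    baseUrl: env.OPENAI_BASE_URL ?? "https://api.openai.com/v1",
    apiKey: env.OPENAI_API_KEY ?? "",
    model: env.OPENAI_MODEL ?? "gpt-4o",
    systemPrompt: env.OPENAI_SYSTEM_PROMPT,
    maxTokens: num(env.OPENAI_MAX_TOKENS),
    temperature: num(env.OPENAI_TEMPERATURE),
  };
}
```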

### Error Handling

All errors are delivered as `agent_message_chunk` session updates followed by `{ stopReason: 'end_turn' }`:

| Condition | Error Message Pattern |
|-----------|----------------------|
| HTTP 401/403 | `Authentication error (HTTP {status}) calling {url}. Check your OPENAI_API_KEY.` |
| HTTP 429 | `Rate limit exceeded (HTTP 429) calling {url}. Please retry later.` |
| HTTP 500+ | `Server error (HTTP {status}) from {url}.` |
| Network failure | `Network error connecting to {url}: {message}` |
| Invalid SSE JSON | Logged to stderr, chunk skipped, stream continues |
| Unknown sessionId | JSON-RPC error response via ACP SDK |
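
The status-to-message mapping above is a plain range check. A sketch (the function name `classifyHttpError` is illustrative, not the worker's actual API):

```typescript
// Maps an HTTP status to the user-facing message patterns in the table.
function classifyHttpError(status: number, url: string): string {
  if (status === 401 || status === 403) {
    return `Authentication error (HTTP ${status}) calling ${url}. Check your OPENAI_API_KEY.`;
  }
  if (status === 429) {
    return `Rate limit exceeded (HTTP 429) calling ${url}. Please retry later.`;
  }
  if (status >= 500) {
    return `Server error (HTTP ${status}) from ${url}.`;
  }
  // Fallback for other 4xx statuses not named in the table.
  return `HTTP error ${status} calling ${url}.`;
}
```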

### Key Design Decisions

- **Zero HTTP dependencies**: Uses native `fetch()` (Node.js 20+) instead of axios/node-fetch
- **Stateless SSE parser**: `parseLine()` is a pure function — no buffering state, easy to test
- **Per-session AbortController**: Each prompt gets a fresh AbortController via `resetCancellation()`
- **Partial responses discarded on cancel**: Incomplete assistant responses are not saved to history
- **All logging to stderr**: stdout is reserved exclusively for NDJSON protocol messages
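
The per-session AbortController decision can be sketched as follows. Names follow the text above (`resetCancellation`), but the session shape is a simplified assumption:

```typescript
class Session {
  readonly history: { role: string; content: string }[] = [];
  private controller = new AbortController();

  // Called at the start of each prompt: abort any in-flight request,
  // then hand out a fresh signal for the new one.
  resetCancellation(): AbortSignal {
    this.controller.abort();
    this.controller = new AbortController();
    return this.controller.signal;
  }

  // `cancel` aborts the current signal; the fetch wired to it rejects,
  // and the partial response is discarded rather than saved to history.
  cancel(): void {
    this.controller.abort();
  }
}
```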

### Test Coverage

The test suite includes 11 test files covering 5 unit test suites and 6 property-based test suites:

**Unit tests:**
- `config.test.ts` — default values, env var reading, numeric parsing
- `session.test.ts` — session creation, history management, cancellation lifecycle
- `sse-parser.test.ts` — SSE line parsing (data, done, skip, comments, invalid JSON)
- `agent.test.ts` — initialize, newSession, loadSession, authenticate, prompt
- `client.test.ts` — HTTP error classification, network errors, stream completion, cancellation

**Property-based tests (fast-check, 100+ iterations each):**
- `config.property.test.ts` — configuration round-trip, numeric env var parsing
- `session.property.test.ts` — session uniqueness, history order preservation
- `sse-parser.property.test.ts` — SSE line classification, content round-trip
- `conversion.property.test.ts` — content block conversion, request construction
- `error-handling.property.test.ts` — HTTP error classification across status code ranges
- `agent.property.test.ts` — initialize response field validation

### Documentation Changes

1. **Registry** (`docs/get-started/registry.mdx`): Add stdio Bus card with SVG logo and version `2.0.3`
2. **Clients** (`docs/get-started/clients.mdx`): Add stdio Bus to Connectors section
3. **Agents** (`docs/get-started/agents.mdx`): Add OpenAI Agent via stdio Bus worker adapter

## Frequently asked questions

> What questions have arisen over the course of authoring this document or during subsequent discussions?

### Why a separate kernel instead of integrating routing into agents?

Separation of concerns. The kernel handles process management, message routing, and transport — concerns that are orthogonal to agent logic. This allows agents to focus purely on protocol handling and LLM interaction, making them simpler to develop and test.

### Why support so many LLM providers through one agent?

The OpenAI Chat Completions API has become a de facto standard. Most providers (including Anthropic via proxies, AWS Bedrock, Azure, and local inference servers) implement this API. A single universal agent reduces maintenance burden and provides consistent behavior across providers.

### What about providers that don't support the Chat Completions API?

Additional workers can be added to the workers-registry for providers with different APIs. The architecture supports multiple agent types running under the same stdio Bus kernel.

### How does this compare to existing ACP agents like Claude Agent or Codex CLI?

Those agents are provider-specific and often include additional features (tool use, code execution). The OpenAI Agent is intentionally minimal — it's a reference implementation and a universal bridge, not a full-featured coding assistant.

### What about authentication?

The agent currently uses API key authentication via environment variables. For production deployments with multiple users, the stdio Bus kernel can be extended to support per-request authentication headers.

## Revision history

- Initial RFD creation with stdio Bus v2.0.3 and OpenAI Agent v1.4.12