Commit 6c60aae

Merge pull request #3221 from ably/feat/add-ai-transport-mentions
docs: add AI Transport cross-references to key pages
2 parents: d80e402 + 15337a8

7 files changed

Lines changed: 32 additions & 6 deletions


data/onPostBuild/llmstxt.ts

Lines changed: 1 addition & 1 deletion
```diff
@@ -13,7 +13,7 @@ const LLMS_TXT_PREAMBLE = `# Ably Documentation
 - **Global Edge Network**: Ultra-low latency realtime messaging delivered through a globally distributed edge network
 - **Enterprise Scale**: Built to handle millions of concurrent connections with guaranteed message delivery
-- **Multiple Products**: Pub/Sub, Chat, LiveSync, LiveObjects and Spaces
+- **Multiple Products**: Pub/Sub, AI Transport, Chat, LiveSync, LiveObjects and Spaces
 - **Developer-Friendly SDKs**: SDKs available for JavaScript, Node.js, Java, Python, Go, Objective-C, Swift, Csharp, PHP, Flutter, Ruby, React, React Native, and Kotlin
 `;
```
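The changed constant above is the preamble for the generated llms.txt file. As a rough sketch of how a post-build step might combine such a preamble with per-product sections (the `buildLlmsTxt` helper and the section shape are hypothetical, not the repo's actual code):

```typescript
// Hypothetical sketch only: assemble an llms.txt document from a
// preamble and a list of product sections. Names are illustrative.
interface ProductSection {
  title: string;
  summary: string;
}

const PREAMBLE = `# Ably Documentation

- **Multiple Products**: Pub/Sub, AI Transport, Chat, LiveSync, LiveObjects and Spaces
`;

function buildLlmsTxt(preamble: string, sections: ProductSection[]): string {
  // Each product becomes a level-2 heading followed by its summary.
  const body = sections
    .map((s) => `## ${s.title}\n\n${s.summary}`)
    .join('\n\n');
  return `${preamble}\n${body}\n`;
}

const txt = buildLlmsTxt(PREAMBLE, [
  { title: 'AI Transport', summary: 'Token streaming and session management for AI apps.' },
]);
```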

src/pages/docs/platform/ai-llms/index.mdx

Lines changed: 10 additions & 5 deletions
```diff
@@ -6,6 +6,10 @@ meta_keywords: "Ably LLM, AI documentation, CLAUDE.md, AGENTS.md, Cursor rules,
 
 Ably documentation is designed to be LLM-friendly, making it easy to use AI assistants like Claude, ChatGPT, or Cursor to help you build realtime applications.
 
+<Aside data-type='note'>
+If you're building AI-powered applications, see the [AI Transport](/docs/ai-transport) documentation for purpose-built infrastructure covering token streaming, session management, and human-in-the-loop workflows.
+</Aside>
+
 ## Available resources
 
 Ably provides two key resources optimized for LLM consumption:
@@ -126,11 +130,11 @@ Use **product and feature docs** for:
 When a product-specific abstraction exists, **always prefer its documentation over generic Pub/Sub docs**, even if the underlying concepts overlap.
 
 Priority order:
-1. **Chat SDK docs** – for chat, rooms, messages, reactions, typing, moderation
-2. **Spaces SDK docs** – for collaborative cursors, avatars, presence in UI
-3. **LiveObjects docs** – for realtime shared state (maps, counters)
-4. **LiveSync docs** – for database-to-client synchronization
-5. **AI Transport docs** – for AI token streaming, durable sessions, bi-directional AI communication
+1. **AI Transport docs** – for AI/LLM token streaming, sessions, human-in-the-loop, tool calls, multi-device
+2. **Chat SDK docs** – for chat, rooms, messages, reactions, typing, moderation
+3. **Spaces SDK docs** – for collaborative cursors, avatars, presence in UI
+4. **LiveObjects docs** – for realtime shared state (maps, counters)
+5. **LiveSync docs** – for database-to-client synchronization
 6. **Generic Pub/Sub docs** – only when no higher-level abstraction applies
 
 Do not rebuild Chat, Spaces, LiveObjects, or AI Transport behavior directly on raw channels unless explicitly requested.
@@ -183,6 +187,7 @@ Use placeholders for secrets:
 
 Explicitly choose the correct interface and explain why:
+- **AI Transport**: realtime AI/LLM token streaming, resumable sessions, human-in-the-loop, multi-device continuity
 - **Realtime SDK**: realtime pub/sub, presence, collaboration
 - **REST SDK**: server-side publishing, token creation, history, stats
 - **Chat SDK**: structured chat features and moderation
```
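The priority order in the guidance above is mechanical enough to express as code. A hedged sketch of a resolver that picks the preferred docs set for a described feature (the keyword lists and the `preferredDocs` function are invented for illustration; they are not part of the docs tooling):

```typescript
// Illustrative only: map feature keywords to the preferred docs set,
// following the priority order from the guidance above.
const DOCS_PRIORITY: Array<{ docs: string; keywords: string[] }> = [
  { docs: 'AI Transport docs', keywords: ['token streaming', 'llm', 'tool calls', 'human-in-the-loop'] },
  { docs: 'Chat SDK docs', keywords: ['chat', 'rooms', 'reactions', 'typing', 'moderation'] },
  { docs: 'Spaces SDK docs', keywords: ['cursors', 'avatars', 'presence in ui'] },
  { docs: 'LiveObjects docs', keywords: ['shared state', 'maps', 'counters'] },
  { docs: 'LiveSync docs', keywords: ['database sync', 'database-to-client'] },
];

function preferredDocs(feature: string): string {
  const f = feature.toLowerCase();
  // First matching entry wins, mirroring the numbered priority list.
  for (const entry of DOCS_PRIORITY) {
    if (entry.keywords.some((k) => f.includes(k))) return entry.docs;
  }
  // Fall through to generic docs when no higher-level abstraction applies.
  return 'Generic Pub/Sub docs';
}
```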

src/pages/docs/platform/ai-llms/llms-txt.mdx

Lines changed: 3 additions & 0 deletions
```diff
@@ -32,6 +32,9 @@ Core platform documentation including account management, architecture, pricing,
 ### Pub/Sub
 Documentation for Ably's core realtime messaging capabilities: channels, messages, presence, authentication, connections, and protocols.
 
+### AI Transport
+Documentation for Ably's AI Transport product covering token streaming, sessions and identity, messaging features such as human-in-the-loop and tool calls, and getting started guides for OpenAI, Anthropic, Vercel AI SDK, and LangGraph.
+
 ### Chat
 The Ably Chat product documentation covering rooms, messages, reactions, typing indicators, and moderation features.
```

src/pages/docs/platform/architecture/connection-recovery.mdx

Lines changed: 2 additions & 0 deletions
```diff
@@ -13,6 +13,8 @@ Ably minimizes the impact of these disruptions by providing an effective recover
 
 Applications built with Ably will continue to function normally during disruptions. They will maintain their state and all messages will be received by the client in the correct order. This is particularly important for applications where message delivery guarantees are crucial, such as in applications where client state is hydrated and maintained incrementally by messages.
 
+Connection recovery is especially important for AI applications, where a network interruption during token streaming can disrupt the user experience. Ably [AI Transport](/docs/ai-transport) builds on this mechanism to enable [resumable token streaming](/docs/ai-transport/token-streaming) from language models, ensuring users can reconnect mid-stream and continue from where they left off.
+
 Ably achieves a reliable connection recovery mechanism with the following:
 
 * [Connection states](#connection-states)
```
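The resumable-streaming idea added in this hunk can be illustrated with a toy model: the server keeps every token with a serial, the client records the serial of the last token it received, and on reconnect it asks for replay from that point. This is a sketch of the principle only, not the AI Transport API:

```typescript
// Toy model of resumable token streaming: a server-side buffer keeps
// every token with a serial, and a client resumes from its last serial.
class TokenStream {
  private tokens: string[] = [];

  append(token: string): number {
    this.tokens.push(token);
    return this.tokens.length - 1; // serial of the appended token
  }

  // Replay everything after the given serial (e.g. after a reconnect).
  replayAfter(lastSerial: number): string[] {
    return this.tokens.slice(lastSerial + 1);
  }
}

const stream = new TokenStream();
['Hello', ',', ' world'].forEach((t) => stream.append(t));

// The client saw serials 0..1, then its connection dropped;
// on reconnect it resumes from serial 1 and receives only the rest:
const missed = stream.replayAfter(1); // [' world']
```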

src/pages/docs/platform/index.mdx

Lines changed: 8 additions & 0 deletions
```diff
@@ -115,6 +115,14 @@ Use Ably [LiveSync](/docs/livesync) to synchronize changes between your database
 
 LiveSync automatically streams changes you make in your database to clients to keep them in sync with the source of truth in your database.
 
+### Ably AI Transport <a id="ai-transport"/>
+
+Use Ably [AI Transport](/docs/ai-transport) as a drop-in infrastructure layer that upgrades your AI streams into bi-directional, stateful experiences. It provides resumable token streaming, multi-device continuity, human-in-the-loop workflows, and session management that works with any AI model or framework.
+
+AI Transport is built on Ably Pub/Sub. It utilizes Ably's platform to benefit from all of the same performance guarantees and scaling potential.
+
+AI Transport is effective for use cases such as multi-turn conversational AI applications, AI agent coordination, live steering with human takeover, and any scenario where reliable LLM token delivery and session resumability are critical.
+
 ## Next steps
 
 * For guidance on choosing between Ably's interfaces and products, see [Product guidance](/docs/platform/products).
```
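The multi-device continuity described in this hunk boils down to several subscribers attached to the same session receiving the same ordered stream, with late joiners catching up on history first. A minimal in-memory sketch of that behavior (not the AI Transport API; `Session` and its methods are invented for illustration):

```typescript
// In-memory sketch of session fan-out: every device subscribed to a
// session receives each published token, in order; late joiners
// replay history before receiving live tokens.
type Listener = (token: string) => void;

class Session {
  private listeners: Listener[] = [];
  readonly history: string[] = [];

  subscribe(listener: Listener): void {
    // New devices first catch up on history, then go live.
    this.history.forEach((t) => listener(t));
    this.listeners.push(listener);
  }

  publish(token: string): void {
    this.history.push(token);
    this.listeners.forEach((l) => l(token));
  }
}

const session = new Session();
const phone: string[] = [];
session.subscribe((t) => phone.push(t));
session.publish('The answer');

// A laptop joins mid-stream and catches up before receiving live tokens:
const laptop: string[] = [];
session.subscribe((t) => laptop.push(t));
session.publish(' is 42.');
```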

src/pages/docs/protocols/sse.mdx

Lines changed: 4 additions & 0 deletions
```diff
@@ -38,6 +38,10 @@ SSE is an excellent alternative to Ably SDK in memory-limited environments.
 * Access to a comprehensive range of features including, but not limited to, [publishing](/docs/push/publish), [presence](/docs/presence-occupancy/presence), [history](/docs/storage-history/history), [push notifications](/docs/push), [automatic payload encoding](/docs/channels/options/encryption), and [symmetric encryption](/docs/channels/options/encryption).
 * Optimal compatibility with browsers via the WebSocket protocol.
 
+<Aside data-type="note">
+If you are streaming LLM responses to users, [AI Transport](/docs/ai-transport) provides purpose-built token streaming with resumable sessions and multi-device continuity, addressing the limitations of HTTP-based streaming.
+</Aside>
+
 ## Configuration <a id="config"/>
 
 The following code sample provides an example of how to use SSE with Ably:
```
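The hunk ends before the code sample it introduces. For context, consuming an Ably channel over SSE uses the standard `EventSource` interface against Ably's event-stream endpoint; here is a sketch of how the endpoint URL is composed (treat the exact parameters as an assumption to verify against the SSE docs; `API_KEY` is a placeholder):

```typescript
// Sketch: build an Ably SSE (event-stream) endpoint URL for a set of
// channels. 'API_KEY' is a placeholder; 'v' is the protocol version.
function sseUrl(channels: string[], key: string): string {
  const params = new URLSearchParams({
    channels: channels.join(','),
    v: '1.2',
    key,
  });
  return `https://realtime.ably.io/event-stream?${params.toString()}`;
}

const url = sseUrl(['myChannel'], 'API_KEY');

// In a browser, the stream would then be consumed with:
//   const source = new EventSource(url);
//   source.onmessage = (event) => console.log(JSON.parse(event.data));
```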

src/pages/docs/pub-sub/index.mdx

Lines changed: 4 additions & 0 deletions
```diff
@@ -438,6 +438,10 @@ if err := channel.Publish(context.Background(), "example", "message data"); err
 </Code>
 
 
+<Aside data-type="note">
+If you are publishing messages to stream LLM responses to users, [AI Transport](/docs/ai-transport) provides purpose-built token streaming with resumable sessions, multi-device continuity, and human-in-the-loop workflows on top of pub-sub.
+</Aside>
+
 <Aside data-type="further-reading">
 You can find out more detail about how [channels](/docs/channels) and [messages](/docs/messages) work.
```

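The aside added in this hunk sits next to a publish example. Streaming an LLM response over raw pub/sub typically means publishing each token as its own message with an explicit sequence number so subscribers can detect gaps and reorder; this is a hedged sketch of that pattern with a mocked channel, not the Ably SDK API:

```typescript
// Sketch of streaming an LLM response over pub/sub: each token is
// published as its own message with a sequence number. The publish
// function is a mock standing in for a real channel.
interface TokenMessage {
  seq: number;
  token: string;
}

function publishTokens(
  publish: (name: string, data: TokenMessage) => void,
  tokens: string[],
): void {
  tokens.forEach((token, seq) => publish('token', { seq, token }));
}

// Mock channel that records published messages.
const received: TokenMessage[] = [];
publishTokens((_name, data) => received.push(data), ['Stream', 'ing', ' works']);

// A subscriber reassembles the response in sequence order:
const text = [...received]
  .sort((a, b) => a.seq - b.seq)
  .map((m) => m.token)
  .join('');
```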