Ably documentation is designed to be LLM-friendly, making it easy to use AI assistants like Claude, ChatGPT, or Cursor to help you build realtime applications.
<Aside data-type='note'>
If you're building AI-powered applications, see the [AI Transport](/docs/ai-transport) documentation for purpose-built infrastructure covering token streaming, session management, and human-in-the-loop workflows.
</Aside>
## Available resources
Ably provides two key resources optimized for LLM consumption:
When a product-specific abstraction exists, **always prefer its documentation over generic Pub/Sub docs**, even if the underlying concepts overlap.
Documentation for Ably's core realtime messaging capabilities: channels, messages, presence, authentication, connections, and protocols.
### AI Transport
Documentation for Ably's AI Transport product covering token streaming, sessions and identity, messaging features such as human-in-the-loop and tool calls, and getting started guides for OpenAI, Anthropic, Vercel AI SDK, and LangGraph.
### Chat
The Ably Chat product documentation covering rooms, messages, reactions, typing indicators, and moderation features.
File: src/pages/docs/platform/architecture/connection-recovery.mdx
Ably minimizes the impact of these disruptions by providing an effective recovery mechanism.
Applications built with Ably will continue to function normally during disruptions. They will maintain their state, and all messages will be received by the client in the correct order. This is particularly important for applications where message delivery guarantees are crucial, such as those where client state is hydrated and maintained incrementally by messages.
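The "state hydrated and maintained incrementally by messages" pattern can be sketched as a reducer applied over an ordered message stream. This is a hypothetical illustration of the concept, not an Ably API; the message shape and `applyMessage` function are invented for the example:

```javascript
// Hypothetical illustration: client state is rebuilt by applying each
// message, in order, to the previous state. Ordering guarantees matter
// because applying the same messages in a different order (or dropping
// one) would leave the client's state diverged from the server's.
function applyMessage(state, msg) {
  if (msg.type === "set") {
    return { ...state, [msg.key]: msg.value };
  }
  if (msg.type === "delete") {
    const next = { ...state };
    delete next[msg.key];
    return next;
  }
  return state; // ignore unknown message types
}

const messages = [
  { type: "set", key: "status", value: "typing" },
  { type: "set", key: "status", value: "online" },
  { type: "delete", key: "status" },
];

// Replaying the same ordered stream always reproduces the same state.
const state = messages.reduce(applyMessage, {});
```

Because every client applies the same ordered stream, they all converge on the same state without ever exchanging full snapshots.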
Connection recovery is especially important for AI applications, where a network interruption during token streaming can disrupt the user experience. Ably [AI Transport](/docs/ai-transport) builds on this mechanism to enable [resumable token streaming](/docs/ai-transport/token-streaming) from language models, ensuring users can reconnect mid-stream and continue from where they left off.
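The resume-from-where-you-left-off behaviour can be sketched independently of any SDK: the client tracks the serial of the last token it processed, and on reconnect replays only the retained messages published after that point. The function and message shape below are assumptions for illustration, not the AI Transport API:

```javascript
// Hypothetical sketch of resumable streaming: the server retains an
// ordered buffer of tokens, and a reconnecting client replays only the
// tokens with a serial greater than the last one it processed.
function resumeStream(retained, lastSerial) {
  return retained.filter((msg) => msg.serial > lastSerial);
}

const retained = [
  { serial: 1, data: "Hello" },
  { serial: 2, data: ", " },
  { serial: 3, data: "world" },
];

// The client dropped its connection after processing serial 1, so on
// reconnect it receives only the tokens it missed, still in order.
const missed = resumeStream(retained, 1);
// missed.map((m) => m.data) → [", ", "world"]
```

A fully caught-up client (last serial 3) would receive nothing on reconnect, so resuming is idempotent rather than replaying the whole response.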
Ably achieves a reliable connection recovery mechanism with the following:
File: src/pages/docs/platform/index.mdx
Use Ably [LiveSync](/docs/livesync) to synchronize changes between your database and your clients.
LiveSync automatically streams changes you make in your database to clients to keep them in sync with the source of truth in your database.
### Ably AI Transport <a id="ai-transport"/>
Use Ably [AI Transport](/docs/ai-transport) as a drop-in infrastructure layer that upgrades your AI streams into bi-directional, stateful experiences. It provides resumable token streaming, multi-device continuity, human-in-the-loop workflows, and session management that works with any AI model or framework.
AI Transport is built on Ably Pub/Sub, so it benefits from the same performance guarantees and scaling potential as the rest of Ably's platform.
AI Transport is effective for use cases such as multi-turn conversational AI applications, AI agent coordination, live steering with human takeover, and any scenario where reliable LLM token delivery and session resumability are critical.
## Next steps
* For guidance on choosing between Ably's interfaces and products, see [Product guidance](/docs/platform/products).
File: src/pages/docs/protocols/sse.mdx
SSE is an excellent alternative to the Ably SDK in memory-limited environments.
* Access to a comprehensive range of features including, but not limited to, [publishing](/docs/push/publish), [presence](/docs/presence-occupancy/presence), [history](/docs/storage-history/history), [push notifications](/docs/push), [automatic payload encoding](/docs/channels/options/encryption), and [symmetric encryption](/docs/channels/options/encryption).
* Optimal compatibility with browsers via the WebSocket protocol.
<Aside data-type="note">
If you are streaming LLM responses to users, [AI Transport](/docs/ai-transport) provides purpose-built token streaming with resumable sessions and multi-device continuity, addressing the limitations of HTTP-based streaming.
</Aside>
## Configuration <a id="config"/>
The following code sample provides an example of how to use SSE with Ably:
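A minimal sketch of the idea, assuming an Ably API key, a channel named `myChannel`, and Ably's `event-stream` SSE endpoint; treat it as an illustration rather than the authoritative sample:

```javascript
// Build the SSE subscription URL. The channel name and placeholder key
// are assumptions for illustration.
const url = new URL("https://realtime.ably.io/event-stream");
url.searchParams.set("channels", "myChannel");
url.searchParams.set("v", "1.2");
url.searchParams.set("key", "YOUR_ABLY_API_KEY");

// In a browser, subscribe with the native EventSource API:
// const eventSource = new EventSource(url.toString());
// eventSource.onmessage = (event) => {
//   const message = JSON.parse(event.data);
//   console.log("Received:", message.data);
// };
```

No SDK is required: any environment with an SSE client can consume the stream, which is what makes this approach suitable for memory-limited targets.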
<Aside data-type="note">
If you are publishing messages to stream LLM responses to users, [AI Transport](/docs/ai-transport) provides purpose-built token streaming with resumable sessions, multi-device continuity, and human-in-the-loop workflows on top of pub/sub.
</Aside>
<Aside data-type="further-reading">
You can find out more about how [channels](/docs/channels) and [messages](/docs/messages) work.
</Aside>