diff --git a/data/onPostBuild/llmstxt.ts b/data/onPostBuild/llmstxt.ts
index a19bd614e0..3b13359fe5 100644
--- a/data/onPostBuild/llmstxt.ts
+++ b/data/onPostBuild/llmstxt.ts
@@ -13,7 +13,7 @@ const LLMS_TXT_PREAMBLE = `# Ably Documentation
 
 - **Global Edge Network**: Ultra-low latency realtime messaging delivered through a globally distributed edge network
 - **Enterprise Scale**: Built to handle millions of concurrent connections with guaranteed message delivery
-- **Multiple Products**: Pub/Sub, Chat, LiveSync, LiveObjects and Spaces
+- **Multiple Products**: Pub/Sub, AI Transport, Chat, LiveSync, LiveObjects and Spaces
 - **Developer-Friendly SDKs**: SDKs available for JavaScript, Node.js, Java, Python, Go, Objective-C, Swift, Csharp, PHP, Flutter, Ruby, React, React Native, and Kotlin
 `;
diff --git a/src/pages/docs/platform/ai-llms/index.mdx b/src/pages/docs/platform/ai-llms/index.mdx
index 74d43acf7f..94920a60b1 100644
--- a/src/pages/docs/platform/ai-llms/index.mdx
+++ b/src/pages/docs/platform/ai-llms/index.mdx
@@ -6,6 +6,10 @@ meta_keywords: "Ably LLM, AI documentation, CLAUDE.md, AGENTS.md, Cursor rules,
 
 Ably documentation is designed to be LLM-friendly, making it easy to use AI assistants like Claude, ChatGPT, or Cursor to help you build realtime applications.
 
+
+
 ## Available resources
 
 Ably provides two key resources optimized for LLM consumption:
@@ -126,11 +130,11 @@ Use **product and feature docs** for:
 
 When a product-specific abstraction exists, **always prefer its documentation over generic Pub/Sub docs**, even if the underlying concepts overlap. Priority order:
 
-1. **Chat SDK docs** – for chat, rooms, messages, reactions, typing, moderation
-2. **Spaces SDK docs** – for collaborative cursors, avatars, presence in UI
-3. **LiveObjects docs** – for realtime shared state (maps, counters)
-4. **LiveSync docs** – for database-to-client synchronization
-5. **AI Transport docs** – for AI token streaming, durable sessions, bi-directional AI communication
+1. **AI Transport docs** – for AI/LLM token streaming, sessions, human-in-the-loop, tool calls, multi-device
+2. **Chat SDK docs** – for chat, rooms, messages, reactions, typing, moderation
+3. **Spaces SDK docs** – for collaborative cursors, avatars, presence in UI
+4. **LiveObjects docs** – for realtime shared state (maps, counters)
+5. **LiveSync docs** – for database-to-client synchronization
 6. **Generic Pub/Sub docs** – only when no higher-level abstraction applies
 
 Do not rebuild Chat, Spaces, LiveObjects, or AI Transport behavior directly on raw channels unless explicitly requested.
@@ -183,6 +187,7 @@ Use placeholders for secrets:
 
 Explicitly choose the correct interface and explain why:
 
+- **AI Transport**: realtime AI/LLM token streaming, resumable sessions, human-in-the-loop, multi-device continuity
 - **Realtime SDK**: realtime pub/sub, presence, collaboration
 - **REST SDK**: server-side publishing, token creation, history, stats
 - **Chat SDK**: structured chat features and moderation
diff --git a/src/pages/docs/platform/ai-llms/llms-txt.mdx b/src/pages/docs/platform/ai-llms/llms-txt.mdx
index 1c3624fc83..fbf848fffb 100644
--- a/src/pages/docs/platform/ai-llms/llms-txt.mdx
+++ b/src/pages/docs/platform/ai-llms/llms-txt.mdx
@@ -32,6 +32,9 @@ Core platform documentation including account management, architecture, pricing,
 
 ### Pub/Sub
 Documentation for Ably's core realtime messaging capabilities: channels, messages, presence, authentication, connections, and protocols.
 
+### AI Transport
+Documentation for Ably's AI Transport product covering token streaming, sessions and identity, messaging features such as human-in-the-loop and tool calls, and getting started guides for OpenAI, Anthropic, Vercel AI SDK, and LangGraph.
+
 ### Chat
 The Ably Chat product documentation covering rooms, messages, reactions, typing indicators, and moderation features.
diff --git a/src/pages/docs/platform/architecture/connection-recovery.mdx b/src/pages/docs/platform/architecture/connection-recovery.mdx
index c18f6b7b00..c02bb43774 100644
--- a/src/pages/docs/platform/architecture/connection-recovery.mdx
+++ b/src/pages/docs/platform/architecture/connection-recovery.mdx
@@ -13,6 +13,8 @@ Ably minimizes the impact of these disruptions by providing an effective recover
 
 Applications built with Ably will continue to function normally during disruptions. They will maintain their state and all messages will be received by the client in the correct order. This is particularly important for applications where messages delivery guarantees are crucial, such as in applications where client state is hydrated and maintained incrementally by messages.
 
+Connection recovery is especially important for AI applications, where a network interruption during token streaming can disrupt the user experience. Ably [AI Transport](/docs/ai-transport) builds on this mechanism to enable [resumable token streaming](/docs/ai-transport/token-streaming) from language models, ensuring users can reconnect mid-stream and continue from where they left off.
+
 Ably achieves a reliable connection recovery mechanism with the following:
 
 * [Connection states](#connection-states)
diff --git a/src/pages/docs/platform/index.mdx b/src/pages/docs/platform/index.mdx
index bc7ab24408..9b8edd0f01 100644
--- a/src/pages/docs/platform/index.mdx
+++ b/src/pages/docs/platform/index.mdx
@@ -115,6 +115,14 @@ Use Ably [LiveSync](/docs/livesync) to synchronize changes between your database
 
 LiveSync automatically streams changes you make in your database to clients to keep them in sync with the source of truth in your database.
 
+### Ably AI Transport
+
+Use Ably [AI Transport](/docs/ai-transport) as a drop-in infrastructure layer that upgrades your AI streams into bi-directional, stateful experiences. It provides resumable token streaming, multi-device continuity, human-in-the-loop workflows, and session management that works with any AI model or framework.
+
+AI Transport is built on Ably Pub/Sub, so it benefits from the same performance guarantees and scale as the rest of Ably's platform.
+
+AI Transport is well suited to use cases such as multi-turn conversational AI applications, AI agent coordination, live steering with human takeover, and any scenario where reliable LLM token delivery and session resumability are critical.
+
 ## Next steps
 
 * For guidance on choosing between Ably's interfaces and products, see [Product guidance](/docs/platform/products).
diff --git a/src/pages/docs/protocols/sse.mdx b/src/pages/docs/protocols/sse.mdx
index 4d36c09dcf..d0df184782 100644
--- a/src/pages/docs/protocols/sse.mdx
+++ b/src/pages/docs/protocols/sse.mdx
@@ -38,6 +38,10 @@ SSE is an excellent alternative to Ably SDK in memory-limited environments.
 
 * Access to a comprehensive range of features including, but not limited to, [publishing](/docs/push/publish), [presence](/docs/presence-occupancy/presence), [history](/docs/storage-history/history), [push notifications](/docs/push), [automatic payload encoding](/docs/channels/options/encryption), and [symmetric encryption](/docs/channels/options/encryption).
 * Optimal compatibility with browsers via the WebSocket protocol.
 
+
+
 ## Configuration
 
 The following code sample provides an example of how to use SSE with Ably:
diff --git a/src/pages/docs/pub-sub/index.mdx b/src/pages/docs/pub-sub/index.mdx
index 84f20a549d..1399bb0a2f 100644
--- a/src/pages/docs/pub-sub/index.mdx
+++ b/src/pages/docs/pub-sub/index.mdx
@@ -438,6 +438,10 @@ if err := channel.Publish(context.Background(), "example", "message data"); err
+
+