
docs: add sampling example and documentation #2194

Closed
ayzmkk-rongbi wants to merge 1 commit into modelcontextprotocol:main from ayzmkk-rongbi:feat/sampling-example

Conversation

@ayzmkk-rongbi

Summary

Add a complete sampling example (server + client) and documentation page to help users understand and use the MCP sampling feature.

Sampling allows servers to request LLM completions from connected clients, effectively "borrowing" the client's language model. While the SDK supports this feature, there was no complete working example or dedicated documentation page.
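The round trip described above can be sketched in plain Python. This is a conceptual simulation only: the real SDK exchanges typed `sampling/createMessage` requests over a `ServerSession`/`ClientSession` pair, and every name below is an illustrative stand-in, not an SDK API.

```python
# Minimal simulation of the sampling flow: the server "borrows" the
# client's LLM by sending a createMessage-style request back to the
# client instead of calling a model itself. All names are stand-ins.

def client_llm(prompt: str) -> str:
    # Stand-in for the client's actual language model.
    return f"LLM output for: {prompt}"

def client_handle_create_message(request: dict) -> dict:
    # Client-side handler for an incoming sampling request.
    prompt = request["messages"][-1]["content"]["text"]
    return {
        "role": "assistant",
        "content": {"type": "text", "text": client_llm(prompt)},
    }

def server_tool_summarize(text: str) -> str:
    # Server-side tool: builds a sampling request and uses the
    # client's completion as its own result.
    request = {
        "messages": [
            {"role": "user",
             "content": {"type": "text", "text": f"Summarize: {text}"}}
        ],
        "maxTokens": 200,
    }
    result = client_handle_create_message(request)
    return result["content"]["text"]

print(server_tool_summarize("MCP lets servers reuse client models."))
```

The key design point this mirrors is that the server never holds LLM credentials; the client stays in control of which model runs and what it is allowed to see.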

Changes

Server Example (examples/servers/simple-sampling/)

  • summarize tool: sends text to client's LLM for summarization via sampling
  • analyze_sentiment tool: requests sentiment analysis via sampling
  • Demonstrates ctx.session.create_message() with parameters (max_tokens, temperature)
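The two tools differ mainly in the request they send. As a rough sketch (plain dicts standing in for the SDK's message types; the field names mirror the `max_tokens` and `temperature` parameters mentioned above, and the prompt wording is invented for illustration):

```python
# Illustrative request builders for the two example tools. In the real
# server these values are passed to ctx.session.create_message(); the
# dict shape here is a simplification, not the SDK's wire format.

def summarize_request(text: str) -> dict:
    return {
        "messages": [
            {"role": "user",
             "content": {"type": "text",
                         "text": f"Summarize the following text:\n\n{text}"}}
        ],
        "max_tokens": 200,    # cap the completion length
        "temperature": 0.3,   # low temperature for faithful summaries
    }

def sentiment_request(text: str) -> dict:
    return {
        "messages": [
            {"role": "user",
             "content": {"type": "text",
                         "text": ("Classify the sentiment as positive, "
                                  f"negative, or neutral:\n\n{text}")}}
        ],
        "max_tokens": 10,     # a single label needs few tokens
        "temperature": 0.0,   # deterministic classification
    }
```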

Client Example (examples/clients/simple-sampling-client/)

  • Provides a sampling_callback that handles sampling/createMessage requests
  • Includes commented code showing real LLM integration (OpenAI)
  • Follows the same structure as existing examples (pyproject.toml, README, click CLI)
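The callback's job can be shown with a self-contained stub. In the real client the callback receives `CreateMessageRequestParams` and returns a `CreateMessageResult` (both from `mcp.types`); here plain dicts stand in for those types and a canned string stands in for the LLM call:

```python
import asyncio

# Sketch of a client-side sampling callback. The commented line marks
# where a real integration would call OpenAI/Anthropic; the dict shapes
# are illustrative stand-ins for the SDK's typed params and result.

async def sampling_callback(params: dict) -> dict:
    user_text = params["messages"][-1]["content"]["text"]
    # Real integration point: send user_text to an LLM provider here.
    reply = f"(stub completion for: {user_text})"
    return {
        "role": "assistant",
        "content": {"type": "text", "text": reply},
        "model": "stub-model",   # report which model produced the reply
        "stopReason": "endTurn",
    }

result = asyncio.run(sampling_callback(
    {"messages": [{"role": "user",
                   "content": {"type": "text", "text": "Hello"}}]}))
print(result["content"]["text"])
```

Returning the `model` and `stopReason` fields matters because the requesting server has no other way to learn which model actually served its request.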

Documentation (docs/sampling.md)

  • Explains the sampling flow with an ASCII sequence diagram
  • Documents create_message parameters in a table
  • Shows client-side callback setup (both ClientSession and Client)
  • Includes LLM provider integration examples (OpenAI, Anthropic)
  • Covers model preferences (ModelPreferences, ModelHint)
  • References the snippet from examples/snippets/servers/sampling.py
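Per the MCP specification, model hints are substrings that the client matches, in order, against the model names it has available; priorities are advisory and the client makes the final choice. A minimal sketch of that hint-matching behavior (model names and the fallback are invented for illustration):

```python
# Illustrative client-side hint matching: try each hint in order and
# return the first available model whose name contains it, falling back
# to a default. The model list here is hypothetical.

AVAILABLE = ["gpt-4o-mini", "claude-3-5-sonnet", "llama-3.1-8b"]

def pick_model(hints: list[str], default: str = "gpt-4o-mini") -> str:
    for hint in hints:
        for name in AVAILABLE:
            if hint in name:
                return name
    return default

print(pick_model(["claude-3"]))  # matches "claude-3-5-sonnet"
print(pick_model(["gemini"]))    # no match, falls back to the default
```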

Navigation (mkdocs.yml)

  • Added "Sampling" page under Documentation section

Testing

  • All existing tests pass (pytest tests/client/test_sampling_callback.py)
  • ruff check and ruff format pass on all new files
  • No changes to SDK source code

Fixes #1205

Add a complete sampling example with both server and client, plus a
documentation page explaining the sampling feature.

Server (examples/servers/simple-sampling):
- Exposes summarize and analyze_sentiment tools that use sampling
- Demonstrates ctx.session.create_message() for server-side LLM requests

Client (examples/clients/simple-sampling-client):
- Provides a sampling_callback to handle server LLM requests
- Shows how to integrate with real LLM providers (OpenAI, Anthropic)

Documentation (docs/sampling.md):
- Explains the sampling flow with a sequence diagram
- Documents create_message parameters
- Shows client-side callback setup
- Includes LLM provider integration examples (OpenAI, Anthropic)
- Covers model preferences

Fixes modelcontextprotocol#1205
@ayzmkk-rongbi force-pushed the feat/sampling-example branch from 51cbe1b to 9b2defc on March 2, 2026 06:11
@ayzmkk-rongbi marked this pull request as draft on March 2, 2026 06:23

Successfully merging this pull request may close these issues.

Controlling Context in Client for Sampling Requests