RubyLLM::MCP is a Ruby client for the Model Context Protocol (MCP), built to work cleanly with RubyLLM and aiming for full spec compliance.
Use MCP tools, resources, and prompts from your RubyLLM chats over stdio, streamable HTTP, or SSE.
Protocol support: Fully supports MCP spec 2025-06-18 (stable), with draft spec 2026-01-26 available.
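Under the hood, the protocol version is negotiated during the MCP initialize handshake: the client proposes a version in a JSON-RPC 2.0 request and the server answers with the version it will speak. A minimal sketch of that message (illustrative only, not this client's internals; the `clientInfo` values are placeholders):

```ruby
require "json"

# JSON-RPC 2.0 "initialize" request proposing the stable protocol version.
# The server's response carries the version it actually agrees to use.
initialize_request = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2025-06-18",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.0.1" } # placeholder values
  }
}

puts JSON.generate(initialize_request)
```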
Our goal is to make plugging MCP into Ruby/RubyLLM apps as easy as possible. RubyLLM::MCP gives you:
- Ruby-first API for using MCP tools, resources, and prompts directly in RubyLLM chat workflows
- Stable protocol track by default (2025-06-18), with opt-in draft track (2026-01-26)
- Built-in notification and response handlers for real-time and interactive workflows
- MCP OAuth 2.1 authentication support (PKCE, dynamic registration, discovery, and automatic token refresh)
- OAuth setup paths for Rails apps (per-user connections) and browser-based CLI flows
- Straightforward integration for any Ruby app, background job, or Rails project using RubyLLM
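The OAuth support listed above relies on PKCE (RFC 7636), which pairs a random code verifier with its SHA-256 challenge. The gem handles this for you; purely as a standalone sketch of the mechanics (stdlib only, not the gem's internals):

```ruby
require "securerandom"
require "digest"
require "base64"

# PKCE: the client keeps the verifier secret, sends only the challenge in the
# authorization request, and later proves possession of the verifier when
# exchanging the code for a token.
code_verifier  = SecureRandom.urlsafe_base64(64) # 86 unreserved chars, within the 43-128 allowed
code_challenge = Base64.urlsafe_encode64(
  Digest::SHA256.digest(code_verifier), padding: false
)

puts code_challenge # SHA-256 in unpadded base64url is always 43 chars
```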
# Basic setup

```ruby
require "ruby_llm/mcp"

RubyLLM.configure do |config|
  config.openai_api_key = ENV.fetch("OPENAI_API_KEY")
end

client = RubyLLM::MCP.client(
  name: "filesystem",
  transport_type: :stdio,
  config: {
    command: "bunx",
    args: ["@modelcontextprotocol/server-filesystem", Dir.pwd]
  }
)

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
puts chat.ask("Find Ruby files modified today and summarize what changed.")
```

# Resources
```ruby
resource = client.resource("release_notes")

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_resource(resource)
puts chat.ask("Summarize release notes for the team in 5 bullet points.")
```

# Prompts
```ruby
prompt = client.prompt("code_review")

chat = RubyLLM.chat(model: "gpt-4.1-mini")
response = chat.ask_prompt(
  prompt,
  arguments: {
    language: "ruby",
    focus: "security"
  }
)
puts response
```

# Handlers (response + notifications)
```ruby
client.on_progress do |progress|
  puts "Progress: #{progress.progress}% - #{progress.message}"
end

client.on_logging do |logging|
  puts "[#{logging.level}] #{logging.message}"
end

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
chat.ask("Run a repository scan and summarize risks.") do |chunk|
  print chunk.content
end
```

# OAuth setup (Rails and CLI)
```ruby
# Rails: per-user OAuth client (after running rails generate ruby_llm:mcp:oauth:install User)
client = current_user.mcp_client

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
puts chat.ask("What changed in my connected repos this week?")
```
```ruby
# CLI: browser-based OAuth flow
cli_client = RubyLLM::MCP.client(
  name: "oauth-server",
  transport_type: :streamable,
  start: false,
  config: {
    url: ENV.fetch("MCP_SERVER_URL"),
    oauth: { scope: "mcp:read mcp:write" }
  }
)

cli_client.oauth(type: :browser).authenticate
cli_client.start

puts RubyLLM.chat(model: "gpt-4.1-mini").with_tools(*cli_client.tools).ask("List my open tasks.")

cli_client.stop
```

- Tools: Convert MCP tools into RubyLLM-compatible tools
- Resources: Work with resources and resource templates in chat context
- Prompts: Execute server prompts with typed arguments
- Transports: `:stdio`, `:streamable`, and `:sse`
- Client capabilities: Sampling, roots, progress tracking, and elicitation
- Handlers: Built-in notification and response handlers for real-time and interactive workflows
- MCP Authentication: OAuth 2.1 support with PKCE, dynamic registration, discovery, and automatic token refresh
- OAuth setup paths: Rails per-user OAuth setup and browser-based OAuth for CLI tools
- Extensions: Global/per-client extension negotiation, including MCP Apps
- Multi-client support: Manage multiple MCP servers in one workflow
- Protocol control: Stable default with explicit draft opt-in
- Adapters: Native `:ruby_llm` adapter (full feature set) and optional `:mcp_sdk`
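For multi-client workflows, one pattern is simply composing the tool sets of several clients into a single chat. A hedged sketch using only the calls shown elsewhere in this README (the server names and configs are illustrative):

```ruby
require "ruby_llm/mcp"

# Two illustrative servers over different transports.
fs_client = RubyLLM::MCP.client(
  name: "filesystem",
  transport_type: :stdio,
  config: { command: "bunx", args: ["@modelcontextprotocol/server-filesystem", Dir.pwd] }
)
web_client = RubyLLM::MCP.client(
  name: "web",
  transport_type: :streamable,
  config: { url: ENV.fetch("MCP_SERVER_URL") }
)

# One chat can draw on tools from both servers at once.
chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*(fs_client.tools + web_client.tools))
puts chat.ask("Cross-reference local files against the remote task list.")
```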
Add to your Gemfile:

```ruby
gem "ruby_llm-mcp"
```

Then run:

```bash
bundle install
```

If you want the official SDK adapter, also add:

```ruby
gem "mcp", "~> 0.7"
```

For Rails apps, run the installer:

```bash
rails generate ruby_llm:mcp:install
```

For OAuth-based user connections:

```bash
rails generate ruby_llm:mcp:oauth:install User
```

OAuth quick example:
```ruby
client = RubyLLM::MCP.client(
  name: "oauth-server",
  transport_type: :streamable,
  start: false,
  config: {
    url: ENV.fetch("MCP_SERVER_URL"),
    oauth: { scope: "mcp:read mcp:write" }
  }
)

client.oauth(type: :browser).authenticate
client.start

chat = RubyLLM.chat(model: "gpt-4.1-mini")
chat.with_tools(*client.tools)
puts chat.ask("What should I prioritize today?")

client.stop
```

Then use explicit connection blocks in jobs/controllers/services:
```ruby
RubyLLM::MCP.establish_connection do |clients|
  chat = RubyLLM.chat(model: "gpt-4.1-mini")
  chat.with_tools(*clients.tools)
  chat.ask("Analyze this pull request and list risks.")
end
```

Issues and pull requests are welcome at patvice/ruby_llm-mcp.
Released under the MIT License.