
feat: add OpenAI Responses routing for chat (#770)

Open
afurm wants to merge 1 commit into crmne:main from afurm:af/openai-responses-routing

Conversation

@afurm afurm commented May 12, 2026

What this does

Adds OpenAI Responses API support behind the existing RubyLLM.chat interface.

Chat Completions remains the default path in :auto mode; native OpenAI requests are routed to Responses only when the request requires it: native Responses tools such as web_search, deep-research models, Responses-only parameters, or GPT-5 tool calls with reasoning enabled.
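The :auto routing decision above can be sketched as a predicate over the request. This is an illustration only, not the PR's actual code: `ResponsesRouting`, `responses_required?`, and the constant lists are hypothetical names standing in for whatever the implementation uses.

```ruby
# Hypothetical sketch of the :auto routing rule described above.
module ResponsesRouting
  # Tools that only exist on the Responses API (illustrative list).
  NATIVE_RESPONSES_TOOLS = %w[web_search].freeze
  # Parameters that Chat Completions does not accept (illustrative list).
  RESPONSES_ONLY_PARAMS = %i[reasoning previous_response_id].freeze

  module_function

  # True when the request can only be served by the Responses API.
  def responses_required?(model:, params: {}, tools: [], reasoning: false)
    return true if tools.any? { |t| NATIVE_RESPONSES_TOOLS.include?(t.to_s) }
    return true if model.include?('deep-research')
    return true if (params.keys.map(&:to_sym) & RESPONSES_ONLY_PARAMS).any?
    return true if model.start_with?('gpt-5') && reasoning && tools.any?

    false
  end

  # :auto defers to the predicate; explicit modes are honored as-is.
  def route(mode, **request)
    case mode
    when :responses        then :responses
    when :chat_completions then :chat_completions
    else responses_required?(**request) ? :responses : :chat_completions
    end
  end
end
```

Explicit modes short-circuit the predicate, so forcing :chat_completions wins even for a deep-research model.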

The change also adds:

  • config.openai_api_mode = :auto | :chat_completions | :responses
  • per-request override via with_params(openai_api_mode: :responses)
  • Responses payload rendering for messages, files, schemas, reasoning, Ruby tools, and native OpenAI tools
  • Responses parsing for output text, tool calls, reasoning summaries, and usage
  • semantic Responses streaming event handling
  • docs for routing and native tool usage
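For reference, the global and per-request controls listed above would be used roughly like this (a sketch based on the option names in this PR; exact call sites may differ):

```ruby
# Global default: keep Chat Completions unless a request needs Responses.
RubyLLM.configure do |config|
  config.openai_api_mode = :auto # or :chat_completions, :responses
end

# Per-request override: force the Responses API for this chat only.
chat = RubyLLM.chat(model: 'gpt-5')
              .with_params(openai_api_mode: :responses)
```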

Closes #213.

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Required for new features

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

Validation run:

  • bundle exec rspec spec/ruby_llm/providers
  • focused OpenAI Responses/config/streaming specs
  • bundle exec rubocop --ignore-unrecognized-cops ...
  • git diff --check

VCR cassettes were not re-recorded in this branch.

AI-generated code

  • I used AI tools to help write this code
  • I have reviewed and understand all generated code (required if above is checked)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Adds a new OpenAI configuration option, openai_api_mode, and a provider-specific per-request routing param with the same name.
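Since the option exists at both levels, a natural precedence rule is that the per-request param overrides the global config. A minimal sketch of that resolution, assuming this precedence (the helper name `effective_api_mode` is illustrative, not the gem's API):

```ruby
# Illustrative precedence sketch: a per-request openai_api_mode
# override takes priority over the global configuration value.
GLOBAL_DEFAULT = :auto

def effective_api_mode(config_mode: GLOBAL_DEFAULT, request_params: {})
  # fetch falls back to the config-level mode when no override is given.
  request_params.fetch(:openai_api_mode, config_mode)
end
```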


codecov Bot commented May 12, 2026

Codecov Report

❌ Patch coverage is 90.29536% with 23 lines in your changes missing coverage. Please review.
✅ Project coverage is 87.25%. Comparing base (4942d6c) to head (a84517d).
⚠️ Report is 1 commit behind head on main.

Files with missing lines Patch % Lines
lib/ruby_llm/providers/openai/responses.rb 86.51% 12 Missing ⚠️
lib/ruby_llm/providers/openai.rb 92.85% 5 Missing ⚠️
lib/ruby_llm/providers/openai/streaming.rb 92.85% 3 Missing ⚠️
lib/ruby_llm/providers/openai/tools.rb 88.88% 2 Missing ⚠️
lib/ruby_llm/stream_accumulator.rb 83.33% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #770      +/-   ##
==========================================
+ Coverage   87.05%   87.25%   +0.19%     
==========================================
  Files         119      120       +1     
  Lines        5594     5820     +226     
  Branches     1407     1475      +68     
==========================================
+ Hits         4870     5078     +208     
- Misses        724      742      +18     

☔ View full report in Codecov by Sentry.


Development

Successfully merging this pull request may close these issues.

OpenAI Responses API support
