[BUG] Anthropic "prompt is too long" error not classified as ContextLengthExceededError#769

Open
yasming wants to merge 2 commits into
crmne:mainfrom
yasming:fix-prompt-is-too-long-exception
Conversation

@yasming

@yasming yasming commented May 12, 2026

What this does

Maps Anthropic's "prompt is too long: <N> tokens > <max> maximum" 400 error
to RubyLLM::ContextLengthExceededError so callers can rescue context-limit
overflows uniformly across providers instead of catching a generic
BadRequestError. Adds the matching regex to the existing context-length
pattern list in lib/ruby_llm/error.rb and a spec covering Anthropic's
exact message shape.

Fixes #755
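
The change described above can be sketched as follows. This is a minimal illustration of the pattern-list approach, not the actual contents of lib/ruby_llm/error.rb — the constant and method names (CONTEXT_LENGTH_PATTERNS, context_length_error?) are assumptions for the example:

```ruby
# Hypothetical sketch of the error-mapping approach described in this PR.
# Names here are illustrative, not the real lib/ruby_llm/error.rb source.
CONTEXT_LENGTH_PATTERNS = [
  /context length/i,                                # generic context-limit wording
  /prompt is too long: \d+ tokens > \d+ maximum/i   # Anthropic's 400 message (added by this PR)
].freeze

# Returns true when a provider error message indicates a context-limit overflow.
def context_length_error?(message)
  CONTEXT_LENGTH_PATTERNS.any? { |pattern| message.match?(pattern) }
end
```

With a mapping like this in place, callers can rescue RubyLLM::ContextLengthExceededError uniformly instead of pattern-matching provider-specific BadRequestError messages themselves.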

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Required for new features

N/A — bug fix.

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name] (not needed — error-mapping change exercised via unit spec, no live HTTP)
    • All tests pass: bundle exec rspec
  • I updated documentation if needed (not needed — no user-facing docs cover context-length error mapping)
  • I didn't modify auto-generated files manually (models.json, aliases.json)

AI-generated code

  • I used AI tools to help write this code
  • I have reviewed and understand all generated code (required if above is checked)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

A few things to confirm/edit before posting:

  • Linked issue — paste the issue number you mentioned (the one already reported in the repo).
  • AI-generated code — I left both unchecked; check the first (and then the second) if AI helped you write this.
  • overcommit / rspec — I marked these checked assuming you've run them; uncheck if not.

@codecov

codecov Bot commented May 12, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 87.19%. Comparing base (5bdda1a) to head (e3549ac).

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #769      +/-   ##
==========================================
- Coverage   87.21%   87.19%   -0.02%     
==========================================
  Files         121      121              
  Lines        5703     5703              
  Branches     1442     1442              
==========================================
- Hits         4974     4973       -1     
- Misses        729      730       +1     

☔ View full report in Codecov by Sentry.