fix: generalize RateLimitError handling to support Anthropic models #3958

Open

r266-tech wants to merge 1 commit into camel-ai:master from r266-tech:fix/rate-limit-error-generalization
Conversation

@r266-tech

Problem

The retry logic in _get_model_response and _aget_model_response only catches openai.RateLimitError, causing Anthropic (and potentially other provider) rate limit errors to propagate as unhandled exceptions.

Fixes #3882

Solution

  • Import anthropic.RateLimitError (as AnthropicRateLimitError) via try/except to handle cases where the anthropic package is not installed
  • Build a _RATE_LIMIT_ERRORS tuple containing both the OpenAI and Anthropic rate limit exceptions
  • Replace the bare openai.RateLimitError handler with this tuple in both the sync and async methods
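
The pattern the PR applies can be sketched as follows. This is a simplified illustration, not the PR's actual code: OpenAIRateLimitError is a stand-in class so the snippet runs without the openai SDK, and call_with_retry is a hypothetical stand-in for the agent's retry loop.

```python
import time

class OpenAIRateLimitError(Exception):
    """Stand-in for openai.RateLimitError so the sketch runs anywhere."""

try:
    from anthropic import RateLimitError as AnthropicRateLimitError
    _RATE_LIMIT_ERRORS: tuple = (OpenAIRateLimitError, AnthropicRateLimitError)
except ImportError:
    # anthropic not installed: fall back to the OpenAI error alone
    _RATE_LIMIT_ERRORS = (OpenAIRateLimitError,)

def call_with_retry(fn, max_retries=3, base_delay=0.5):
    """Simplified stand-in for the agent's retry loop: retry fn on any
    known provider rate-limit error, with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return fn()
        except _RATE_LIMIT_ERRORS:
            if attempt == max_retries - 1:
                raise  # out of retries: let the rate-limit error propagate
            time.sleep(base_delay * 2 ** attempt)
```

Because Python's except clause accepts a tuple of exception types, the handler catches whichever provider errors made it into _RATE_LIMIT_ERRORS at import time.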

Changes

  • : +10 lines, -2 lines
    • Added try/except import block for AnthropicRateLimitError
    • Updated the two exception handlers (sync + async) to use the tuple

Testing

The fix is backward-compatible: when anthropic is not installed, it falls back to catching only openai.RateLimitError, as before.
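
One way to exercise that fallback path in a test (a sketch, not code from the PR) is to force the anthropic import to fail and confirm the tuple shrinks. RateLimitError here is a stand-in for openai.RateLimitError, and build_rate_limit_errors mirrors the PR's try/except import block.

```python
import builtins
import sys

class RateLimitError(Exception):
    """Stand-in for openai.RateLimitError."""

def build_rate_limit_errors():
    # Mirrors the PR's try/except import block.
    try:
        from anthropic import RateLimitError as AnthropicRateLimitError
        return (RateLimitError, AnthropicRateLimitError)
    except ImportError:
        return (RateLimitError,)

def build_without_anthropic():
    """Simulate 'anthropic not installed' by making its import fail."""
    real_import = builtins.__import__

    def fake_import(name, *args, **kwargs):
        if name == "anthropic":
            raise ImportError("simulated: anthropic not installed")
        return real_import(name, *args, **kwargs)

    saved = sys.modules.pop("anthropic", None)  # bypass the module cache
    builtins.__import__ = fake_import
    try:
        return build_rate_limit_errors()
    finally:
        builtins.__import__ = real_import
        if saved is not None:
            sys.modules["anthropic"] = saved
```

With the import blocked, build_without_anthropic() returns the single-element tuple, matching the behavior described above.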

The retry logic in _get_model_response and _aget_model_response only
caught openai.RateLimitError, causing Anthropic rate limit errors to
propagate as unhandled exceptions. This fix adds anthropic.RateLimitError
to the caught exception tuple via a try/except import that gracefully
falls back when anthropic is not installed.

Fixes camel-ai#3882


Comment on lines +58 to +63
```python
try:
    from anthropic import RateLimitError as AnthropicRateLimitError

    _RATE_LIMIT_ERRORS: tuple = (RateLimitError, AnthropicRateLimitError)
except ImportError:
    _RATE_LIMIT_ERRORS: tuple = (RateLimitError,)  # type: ignore[misc]
```
Member
I think we also need to consider cases that use neither the OpenAI nor the Anthropic client.
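
One possible direction for the reviewer's concern (purely illustrative, not part of this PR) is to collect a rate-limit error class from whichever provider SDKs happen to be installed, rather than hard-coding two vendors. The _CANDIDATES list below is a hypothetical example.

```python
import importlib

# (module, attribute) pairs for provider SDKs; this list is illustrative
# and would need to cover whatever providers the agent actually supports.
_CANDIDATES = [
    ("openai", "RateLimitError"),
    ("anthropic", "RateLimitError"),
]

def collect_rate_limit_errors():
    """Gather a rate-limit error class from every installed provider SDK."""
    errors = []
    for module_name, attr in _CANDIDATES:
        try:
            module = importlib.import_module(module_name)
            errors.append(getattr(module, attr))
        except (ImportError, AttributeError):
            continue  # SDK missing (or class renamed): skip it
    # An empty tuple in an `except` clause matches nothing, so this
    # degrades safely even when no SDK is installed.
    return tuple(errors)
```

This keeps the except clause a tuple, so the existing handlers would not need to change as providers are added.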


Development

Successfully merging this pull request may close these issues.

[BUG] RateLimitError not generalizable
