
fix: handle empty xAI streams without crashing#4015

Open
shaun0927 wants to merge 2 commits into camel-ai:master from shaun0927:fix/xai-empty-stream

Conversation

@shaun0927

Related Issue

Fixes #4013

Description

XAIModel._stream_to_chunks() references response after the streaming loop even when the iterator yields no chunks.

This PR initializes the accumulated response state before iteration and keeps the post-loop bookkeeping guarded so empty streams return cleanly instead of crashing with UnboundLocalError.

Changes:

  • initialize response = None before iterating
  • guard the post-loop block with if response is not None:
  • add a regression test for the empty-stream path
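The two changes above can be sketched as a minimal, self-contained pattern. The names below (`stream_to_chunks`, the tuple bookkeeping) are illustrative stand-ins for the real `XAIModel._stream_to_chunks()` in `camel/models/xai_model.py`, not its actual implementation:

```python
def stream_to_chunks(stream):
    """Accumulate streaming chunks while tolerating an empty iterator."""
    response = None  # initialized before the loop, so the name always exists
    chunks = []
    for response in stream:
        chunks.append(response)
    # Post-loop bookkeeping is guarded: an empty stream skips it cleanly
    # instead of raising UnboundLocalError on an unbound `response`.
    if response is not None:
        chunks.append(("final", response))
    return chunks
```

Without the `response = None` initialization, `for response in stream` never binds the name when `stream` is empty, and the post-loop access crashes.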

What is the purpose of this pull request?

  • Bug fix
  • New Feature
  • Documentation update
  • Other

Checklist

  • I have read and agree to the AI-Generated Code Policy (required)
  • I have linked this PR to an issue (required)
  • I have checked if any dependencies need to be added or updated in pyproject.toml and run uv lock
  • I have updated the tests accordingly (required for a bug fix or a new feature)
  • I have updated the documentation if needed
  • I have added examples if this is a new feature

Test evidence

pytest -q test/models/test_xai_model.py

Passed locally:

  • 1 passed

The xAI streaming adapter referenced the loop variable after iteration even when the iterator yielded no chunks. Initializing the accumulated response state avoids an UnboundLocalError and locks the behavior with a regression test.

Constraint: Keep the existing streaming protocol unchanged for non-empty streams
Rejected: Catch UnboundLocalError at call sites | hides the adapter bug instead of fixing it
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Streaming helpers should tolerate empty iterators and post-loop bookkeeping must handle that path explicitly
Tested: pytest -q test/models/test_xai_model.py
Not-tested: Live xAI SDK streaming against a real service

@claude claude bot left a comment


Claude Code Review

This pull request is from a fork — automated review is disabled. A repository maintainer can comment @claude review to run a one-time review.

@coderabbitai
Contributor

coderabbitai bot commented Apr 16, 2026

Important

Review skipped

Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 108e8bee-e1df-41a9-b03e-1fd87c2b0cf3

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.


Comment @coderabbitai help to get the list of available commands and usage tips.

Constraint: Preserve the behavior fix exactly while satisfying repository formatting hooks
Rejected: Skip formatter and rely on maintainers | leaves the PR blocked on avoidable CI noise
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Run Ruff formatting on touched Python files before pushing small bug-fix PRs
Tested: ruff check camel/models/xai_model.py test/models/test_xai_model.py; pytest -q test/models/test_xai_model.py
Not-tested: Live xAI SDK integration


Development

Successfully merging this pull request may close these issues.

[BUG] XAIModel._stream_to_chunks crashes on empty stream

1 participant