
[Bug] OpenAI Responses API 400 Bad Request: 'Missing reasoning item' with Reasoning Models #11994

@xxsLuna

Description


While using reasoning-enabled models (such as o1-preview or o3-mini) through the OpenAI Responses API (via openai-adapters), users frequently encounter a 400 Bad Request error:

Item 'msg_...' was provided without its required 'reasoning' item: 'rs_...'.

Root Cause Analysis

The OpenAI Responses API enforces a strict pairing protocol for reasoning items. Within a single assistant turn, a reasoning item must be immediately followed by its associated output item (usually the assistant message).

Currently, the toResponsesInput function in core/llm/openaiTypeConverters.ts emits function_call items before the text message item. For complex agentic turns involving both tool calls and a commentary message, this interleaving separates the reasoning block from the message item, violating the API's sequencing requirements and triggering the validation failure.
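To make the adjacency rule concrete, here is a minimal sketch modeling the constraint. The item shapes and the `reasoningIsPaired` checker are illustrative, not the actual SDK types or the API's real validator; they only demonstrate why the current emission order fails.

```typescript
// Illustrative item shapes loosely modeled on Responses API input items;
// field names here are assumptions, not the exact SDK definitions.
type Item =
  | { type: "reasoning"; id: string }
  | { type: "message"; role: "assistant"; content: string }
  | { type: "function_call"; name: string; arguments: string };

// Models the API's rule: every reasoning item must be immediately
// followed by its paired output item (the assistant message).
function reasoningIsPaired(items: Item[]): boolean {
  for (let i = 0; i < items.length; i++) {
    if (items[i].type === "reasoning") {
      const next = items[i + 1];
      if (!next || next.type !== "message") return false;
    }
  }
  return true;
}

// Current (buggy) order: a function_call is interleaved between the
// reasoning item and the message, breaking adjacency.
const broken: Item[] = [
  { type: "reasoning", id: "rs_1" },
  { type: "function_call", name: "search", arguments: "{}" },
  { type: "message", role: "assistant", content: "Here is what I found." },
];

// Required order: the message directly follows its reasoning item.
const fixed: Item[] = [
  { type: "reasoning", id: "rs_1" },
  { type: "message", role: "assistant", content: "Here is what I found." },
  { type: "function_call", name: "search", arguments: "{}" },
];

console.log(reasoningIsPaired(broken)); // false -> the API rejects this with 400
console.log(reasoningIsPaired(fixed));  // true
```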

Proposed Solution

Adjust the item emission order in toResponsesInput to ensure the assistant's text message item is pushed before any function_call items within the same turn. This maintains the required adjacency between the reasoning item and the text response.
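A rough sketch of the proposed reordering, under the assumption that the converter builds items from an assistant turn holding optional text content and tool calls. `emitAssistantItems`, `AssistantTurn`, and the field names are hypothetical; the real `toResponsesInput` in core/llm/openaiTypeConverters.ts handles many more cases.

```typescript
// Hypothetical shape of one assistant turn; illustrative only.
interface AssistantTurn {
  content?: string;
  toolCalls?: { id: string; name: string; arguments: string }[];
}

type OutputItem =
  | { type: "message"; role: "assistant"; content: string }
  | { type: "function_call"; call_id: string; name: string; arguments: string };

function emitAssistantItems(turn: AssistantTurn): OutputItem[] {
  const items: OutputItem[] = [];
  // Push the text message first so it stays adjacent to any preceding
  // reasoning item for this turn.
  if (turn.content) {
    items.push({ type: "message", role: "assistant", content: turn.content });
  }
  // function_call items now follow the message instead of preceding it.
  for (const tc of turn.toolCalls ?? []) {
    items.push({
      type: "function_call",
      call_id: tc.id,
      name: tc.name,
      arguments: tc.arguments,
    });
  }
  return items;
}
```

The fix is purely an ordering change: turns with only tool calls, or only text, produce the same items as before.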

Metadata

Labels: area:tools (Relates to tool usage), kind:bug (Indicates an unexpected problem or unintended behavior)
