
fix: export thoughts_token_count to OpenTelemetry trace spans#4835

Open
brucearctor wants to merge 3 commits into google:main from brucearctor:fix/thinking-tokens-in-traces

Conversation

@brucearctor

Description

Fixes #4829

ADK's OpenTelemetry tracing does not export thoughts_token_count to span attributes. When using Gemini models with ThinkingConfig, the usage_metadata in LlmResponse correctly contains thoughts_token_count, but this field is never written to spans by trace_generate_content_result() or trace_inference_result().

Interestingly, trace_call_llm() already exports this field (as gen_ai.usage.experimental.reasoning_tokens). This PR adds the same export to the two remaining functions that were missing it.

Changes

src/google/adk/telemetry/tracing.py

  • Added a thoughts_token_count → gen_ai.usage.experimental.reasoning_tokens span attribute export in trace_generate_content_result() (~line 746)
  • Added the same export in trace_inference_result() (~line 789)
  • Uses the same try/except AttributeError guard pattern as trace_call_llm() for backward compatibility with older SDK versions

tests/unittests/telemetry/test_spans.py

  • Added test_trace_inference_result_with_thinking_tokens — verifies the attribute is exported when thoughts_token_count is non-None
  • Added test_trace_inference_result_without_thinking_tokens — verifies no attribute is set when thoughts_token_count is None
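The export described above reuses the guard pattern already present in trace_call_llm(). A minimal sketch of that pattern follows; the span and usage_metadata objects are stubbed with plain classes here, since the actual ADK internals are not reproduced in this PR:

```python
class FakeSpan:
  """Minimal stand-in that records attributes like an OTel span."""

  def __init__(self):
    self.attributes = {}

  def set_attribute(self, key, value):
    self.attributes[key] = value


def export_reasoning_tokens(span, usage_metadata):
  """Export thoughts_token_count as a span attribute, if available."""
  # Older SDK versions may not define thoughts_token_count at all, so
  # the access is wrapped in try/except AttributeError, matching the
  # existing pattern in trace_call_llm().
  try:
    thoughts = usage_metadata.thoughts_token_count
  except AttributeError:
    return
  if thoughts is not None:
    span.set_attribute(
        'gen_ai.usage.experimental.reasoning_tokens', thoughts)
```

Wrapping the attribute access in try/except keeps the tracing code compatible with older SDK versions whose usage-metadata type predates thoughts_token_count, while the None check ensures no attribute is emitted when the model returned no thinking tokens.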

Testing Plan

Unit Tests

All 23 telemetry tests pass:

$ pytest tests/unittests/telemetry/test_spans.py -v
23 passed in 1.08s

New tests specifically verify:

  1. thoughts_token_count=50 → span attribute gen_ai.usage.experimental.reasoning_tokens=50 is set
  2. thoughts_token_count=None → no gen_ai.usage.experimental.reasoning_tokens attribute on span

Verification

Before fix — Event.usage_metadata.thoughts_token_count is non-zero but Cloud Trace spans only show gen_ai.usage.input_tokens and gen_ai.usage.output_tokens.

After fix — gen_ai.usage.experimental.reasoning_tokens appears alongside the existing token attributes in all three tracing functions.

Add thoughts_token_count as gen_ai.usage.experimental.reasoning_tokens
span attribute in trace_generate_content_result() and
trace_inference_result(), matching the existing pattern in
trace_call_llm().

Fixes google#4829

@adk-bot added the tracing label ([Component] This issue is related to OpenTelemetry tracing) Mar 14, 2026
@rohityan self-assigned this Mar 17, 2026
@rohityan added the request clarification label ([Status] The maintainer needs clarification or more information from the author) Mar 17, 2026
@rohityan
Collaborator

Hi @brucearctor, thank you for your contribution! We appreciate you taking the time to submit this pull request. Please fix the mypy-diff errors.

@brucearctor
Author

@rohityan -- will do

- Refactor trace_inference_result() to use otel_span local variable,
  eliminating all 4 union-attr mypy errors (1 new + 3 pre-existing)
- Add tests for trace_generate_content_result() thinking tokens
- Import trace_generate_content_result in test_spans.py
@brucearctor
Author

@rohityan going to let it run, but I think I addressed the diff/new error, and solved a couple of others :-)

Let me know if other concerns. Cheers -



Development

Successfully merging this pull request may close these issues.

Thinking tokens not in traces

3 participants