Fix Python API example showing incorrect max_tokens kwarg#4

Merged
btucker merged 1 commit into main from claude/fix-max-tokens-docs-zeGsy on Jan 2, 2026
Conversation

btucker (Owner) commented on Jan 2, 2026

The old example passed max_tokens=500 and system_prompt="..." as
direct kwargs to prompt(), which was misleading. max_tokens is a
model-specific option (passed via -o max_tokens on the CLI), not a
built-in LLM parameter.

Updated the example to show only the system prompt, using the
'system' parameter that LLM supports directly.
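As a sketch of the corrected usage (the model name and prompt text here are illustrative, not taken from the PR), the `system` keyword goes straight to `prompt()`, while `max_tokens` stays a model-specific option rather than a Python kwarg. This assumes the `llm` library (llm.datasette.io) is installed and an API key is configured:

```python
# Sketch of the corrected call, assuming `llm` is installed
# and a key for the chosen model is configured.
import llm

model = llm.get_model("gpt-4o-mini")  # model name is illustrative

# 'system' is a parameter prompt() supports directly:
response = model.prompt(
    "Summarize the release notes",
    system="You are a concise technical writer",
)
print(response.text())
```

On the command line, by contrast, the model-specific option is spelled `-o max_tokens 500`, per the PR description.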

btucker merged commit 6d5c1d1 into main on Jan 2, 2026
5 checks passed
btucker deleted the claude/fix-max-tokens-docs-zeGsy branch on January 2, 2026 at 17:23
