
Append Anthropic path prefix when writing ANTHROPIC_BASE_URL #5160

Closed
jerm-dro wants to merge 3 commits into main from
llm-setup-anthropic-path-prefix

Conversation


@jerm-dro jerm-dro commented May 1, 2026

Summary

  • Envoy AI Gateway serves native-Anthropic traffic at /anthropic/v1/messages, but thv llm setup writes ANTHROPIC_BASE_URL=<gateway-url> verbatim. Claude Code's requests fall through to Envoy's catchall and the user sees "model may not exist or you may not have access" while gateway logs show 404s with auth headers populated — auth works, the path is wrong.
  • Add a persisted llm.anthropic_path_prefix setting (typed as a nullable string so nil and "" are distinguishable) and an --anthropic-path-prefix flag on both thv llm setup and thv llm config set.
  • Default the prefix to /anthropic (Envoy AI Gateway is the primary target). Users on LiteLLM or direct Anthropic opt out with --anthropic-path-prefix="". The CLI uses cmd.Flags().Changed() so an explicit empty override is persisted instead of being discarded as a zero value, mirroring the existing --tls-skip-verify pattern.
  • Proxy-mode tools (Cursor, VS Code) are unaffected because they forward OpenAI-shaped traffic through thv llm proxy on a separate route.
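The nullable-prefix resolution described above can be sketched as follows. This is a minimal illustration with assumed names (`effectivePrefix`, `anthropicBaseURL`), not the actual code in `pkg/llm`:

```go
package main

import (
	"fmt"
	"strings"
)

// defaultAnthropicPathPrefix matches Envoy AI Gateway's native-Anthropic route.
const defaultAnthropicPathPrefix = "/anthropic"

// effectivePrefix resolves the nullable setting: nil means "use the default",
// while a pointer to "" is an explicit opt-out (LiteLLM / direct Anthropic).
func effectivePrefix(p *string) string {
	if p == nil {
		return defaultAnthropicPathPrefix
	}
	return *p
}

// anthropicBaseURL appends the resolved prefix to the gateway URL, trimming
// a trailing slash so the result never contains "//anthropic".
func anthropicBaseURL(gatewayURL string, prefix *string) string {
	return strings.TrimRight(gatewayURL, "/") + effectivePrefix(prefix)
}

func main() {
	empty := ""
	fmt.Println(anthropicBaseURL("https://gw.example.com", nil))     // default applied
	fmt.Println(anthropicBaseURL("https://gw.example.com/", &empty)) // explicit opt-out
}
```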

Closes #5158

Type of change

  • Bug fix
  • New feature
  • Refactoring (no behavior change)
  • Dependency update
  • Documentation
  • Other (describe):

Test plan

  • Unit tests (task test)
  • Linting (task lint-fix)
  • Manual testing (describe below)

Manual: ran go run ./cmd/thv llm setup --help and go run ./cmd/thv llm config set --help to confirm the flag advertises the /anthropic default and the opt-out instruction on both commands.

New unit tests cover (red→green TDD):

  • pkg/client/llm_gateway_test.go::TestConfigureLLMGateway_AnthropicPathPrefix — empty ApplyConfig.AnthropicPathPrefix produces a bare ANTHROPIC_BASE_URL (regression guard); /anthropic produces <gateway>/anthropic.
  • pkg/llm/config_test.go::TestConfig_EffectiveAnthropicPathPrefix — nil resolves to /anthropic (default), explicit empty stays empty, explicit value passes through.
  • pkg/llm/manage_test.go::TestConfig_SetFields — pointer cases assert that &"/anthropic" is accepted, &"" opts out, and bare values / query strings / shell metacharacters are rejected.
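The three resolution cases those tests pin down can be sketched as a small table-driven check. The helper name `effective` is assumed for illustration; the real tests exercise `EffectiveAnthropicPathPrefix()`:

```go
package main

import "fmt"

const defaultPrefix = "/anthropic"

// effective resolves the nullable setting the tests above exercise:
// nil → default, pointer-to-"" → explicit opt-out, other values pass through.
func effective(p *string) string {
	if p == nil {
		return defaultPrefix
	}
	return *p
}

func main() {
	empty, custom := "", "/claude"
	cases := []struct {
		name string
		in   *string
		want string
	}{
		{"nil resolves to default", nil, "/anthropic"},
		{"explicit empty stays empty", &empty, ""},
		{"explicit value passes through", &custom, "/claude"},
	}
	for _, c := range cases {
		if got := effective(c.in); got != c.want {
			fmt.Printf("FAIL %s: got %q, want %q\n", c.name, got, c.want)
			continue
		}
		fmt.Printf("ok   %s\n", c.name)
	}
}
```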

API Compatibility

  • This PR does not break the v1beta1 API, OR the api-break-allowed label is applied and the migration guidance is described above.

Does this introduce a user-facing change?

Yes. After upgrade, a fresh thv llm setup writes ANTHROPIC_BASE_URL=<gateway>/anthropic by default — the right value for Envoy AI Gateway, which is the primary deployment target. Users on LiteLLM or direct Anthropic must now pass --anthropic-path-prefix="" (or set llm.anthropic_path_prefix: "" in ~/.toolhive/config.yaml). thv llm config show reports the resolved value with a (default) or (none — direct Anthropic / LiteLLM) annotation so the active mode is visible at a glance.

Implementation plan

Approved implementation plan

Approach

Persist the prefix in LLM config (it's a property of the deployed gateway, not a per-tool setting), expose it as a flag on thv llm setup and thv llm config set, and append it only when writing ANTHROPIC_BASE_URL for direct-mode tools. Default to /anthropic to optimise for the most common deployment (Envoy AI Gateway). The optional auto-probe from the issue is intentionally skipped — the issue marks it optional, and an HTTP probe over an OIDC-protected endpoint adds meaningful complexity (timeouts, TLS skip handling, false negatives) for marginal benefit when the flag is explicit.

Changes

  1. pkg/llm/config.go — add nullable AnthropicPathPrefix *string to Config, DefaultAnthropicPathPrefix = "/anthropic", EffectiveAnthropicPathPrefix(); validate the prefix in ValidatePartial (must start with /, no query/fragment, length cap, no shell metacharacters).
  2. pkg/llm/manage.go — SetOptions.AnthropicPathPrefix *string; Show annotates (default) and (none) modes.
  3. pkg/llmgateway/config.go — add AnthropicPathPrefix string to ApplyConfig (resolved value).
  4. pkg/llm/setup.go — pipe EffectiveAnthropicPathPrefix() into configureDetectedTools.
  5. pkg/client/llm_gateway.go — new AnthropicBaseURL ValueField returns cfg.GatewayURL + cfg.AnthropicPathPrefix.
  6. pkg/client/config.go — Claude Code's /env/ANTHROPIC_BASE_URL spec uses the new ValueField.
  7. cmd/thv/app/llm.go — register --anthropic-path-prefix on both setup and config set; flag default is /anthropic; use Flags().Changed() to persist explicit empty overrides.
  8. Tests as described in Test plan.

Out of scope (intentional)

  • Auto-probe — marked optional in the issue.
  • Codex direct-mode — not currently implemented in the codebase.
  • Proxy-mode upstream — issue confirms proxy-mode tools route through thv llm proxy and don't need the prefix in tool config; the proxy forwards OpenAI-shaped traffic on a separate Envoy route.

Special notes for reviewers

  • This PR contains two commits: the first introduces the flag with no default; the second flips the default to /anthropic and switches the field to *string so the empty-override case can be persisted. Both commits keep tests green.
  • The shell-metacharacter check on the prefix is defensive — the value flows into a Node-process env var, not a shell command — but it keeps surprises out of any future caller that does invoke a shell.
  • LLM commands are Hidden: true in the CLI tree, so task docs does not need to regenerate anything for this PR.
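The defensive prefix validation mentioned above can be sketched like this. The function name and exact limits are assumptions for illustration, not the actual `ValidatePartial` code:

```go
package main

import (
	"fmt"
	"strings"
)

// validatePrefix mirrors the checks described above: empty opts out; otherwise
// the prefix must start with "/", stay short, and carry no query/fragment or
// shell metacharacters. The 64-character cap is an assumed limit.
func validatePrefix(p string) error {
	if p == "" {
		return nil // explicit opt-out
	}
	switch {
	case !strings.HasPrefix(p, "/"):
		return fmt.Errorf("prefix %q must start with /", p)
	case len(p) > 64:
		return fmt.Errorf("prefix %q exceeds the length cap", p)
	case strings.ContainsAny(p, "?#"):
		return fmt.Errorf("prefix %q must not contain a query or fragment", p)
	case strings.ContainsAny(p, "$`;&|<>\"'\\ "):
		return fmt.Errorf("prefix %q contains shell metacharacters", p)
	}
	return nil
}

func main() {
	for _, p := range []string{"/anthropic", "", "anthropic", "/a?b=c"} {
		fmt.Printf("%q -> %v\n", p, validatePrefix(p))
	}
}
```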

Generated with Claude Code

Envoy AI Gateway routes native-Anthropic traffic at /anthropic/v1/messages,
not /v1/messages. Without the prefix, Claude Code's requests fall through
to the catchall route and the user sees "model may not exist or you may
not have access" while gateway logs show 404s.

Add a persisted llm.anthropic_path_prefix setting and an
--anthropic-path-prefix flag on both "thv llm setup" and "thv llm config
set". When non-empty, the prefix is appended to the gateway URL before
ANTHROPIC_BASE_URL is written into Claude Code's settings file. The
proxy-mode upstream is unaffected because proxy-mode tools forward
OpenAI-shaped traffic on a separate route.

Closes #5158

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@github-actions github-actions Bot added the size/S Small PR: 100-299 lines changed label May 1, 2026
@codecov

codecov Bot commented May 1, 2026

Codecov Report

❌ Patch coverage is 85.71429% with 6 lines in your changes missing coverage. Please review.
✅ Project coverage is 67.53%. Comparing base (dc74588) to head (2a0d83a).
⚠️ Report is 3 commits behind head on main.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| pkg/llm/manage.go | 66.66% | 4 Missing ⚠️ |
| pkg/llm/config.go | 90.00% | 1 Missing and 1 partial ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #5160      +/-   ##
==========================================
- Coverage   67.56%   67.53%   -0.04%     
==========================================
  Files         601      601              
  Lines       61093    61130      +37     
==========================================
+ Hits        41280    41283       +3     
- Misses      16697    16730      +33     
- Partials     3116     3117       +1     


Most ToolHive deployments target Envoy AI Gateway, which routes
native-Anthropic traffic at /anthropic. Make that the default so users
do not have to know to pass --anthropic-path-prefix=/anthropic.

To distinguish "not set, use default" from "explicitly empty" (LiteLLM
or direct Anthropic), Config.AnthropicPathPrefix becomes a *string and
SetOptions.AnthropicPathPrefix follows the same TLSSkipVerify pointer
pattern. Add EffectiveAnthropicPathPrefix() to resolve nil → default.
The CLI flags use cmd.Flags().Changed() so --anthropic-path-prefix=""
correctly persists an empty override instead of being discarded as a
zero value.

Show now reports the effective value with (default) or (none —
direct Anthropic / LiteLLM) annotations so users can tell at a glance
which mode they are in.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@github-actions github-actions Bot added the size/M Medium PR: 300-599 lines changed label and removed the size/S Small PR: 100-299 lines changed label May 1, 2026
- Copy *string value in SetFields rather than aliasing the pointer, per
  the copy-before-mutating-caller-input rule
- Note in the AnthropicPathPrefix field comment that omitempty on a
  *string applies only to nil — a non-nil pointer to "" is persisted,
  which is required for the "explicit no prefix" opt-out to survive
  restarts
- Refactor Show to compute the display string via an IIFE before
  passing it to writef, matching the immutable-assignment pattern
- Expand the ValueField enum comment in LLMGatewayKeySpec to include
  "AnthropicBaseURL"; move it to a pre-field block comment to avoid
  the continuation line dangling between two struct fields
- Fix llmValueForSpec comment style to match the adjacent "For ..." form

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

jerm-dro commented May 4, 2026

Closing in favor of https://github.com/stacklok/toolhive/pull/5174/changes

@jerm-dro jerm-dro closed this May 4, 2026

Labels

size/M Medium PR: 300-599 lines changed


Development

Successfully merging this pull request may close these issues.

thv llm setup doesn't append provider path prefix for Envoy AI Gateway
