
LLMProxy #271

Merged
andri-coral merged 10 commits into master from feat/proxy
Apr 16, 2026

Conversation

@andri-coral
Collaborator

No description provided.

@andri-coral andri-coral requested a review from a team March 31, 2026 23:04
@andri-coral andri-coral changed the base branch from master to feat/prototype-runtime March 31, 2026 23:04
@andri-coral andri-coral requested review from CaelumF and seafraf March 31, 2026 23:05
Comment thread src/main/kotlin/org/coralprotocol/coralserver/llmproxy/LlmProviderStrategy.kt Outdated
Comment thread src/main/kotlin/org/coralprotocol/coralserver/routes/api/v1/LlmProxyApi.kt Outdated
@CaelumF
Collaborator

CaelumF commented Apr 2, 2026

left some comments, I wonder if we should leave more things to koog rather than doing it ourselves? we could reuse a lot of their testing+fixing that way.

also do you have any thoughts on the string typing and null safety here? feels maybe brittle but I wonder what you think @andri-coral

@andri-coral
Copy link
Copy Markdown
Collaborator Author

left some comments, I wonder if we should leave more things to koog rather than doing it ourselves? we could reuse a lot of their testing+fixing that way.

the proxy serves all runtimes (python, docker, etc), not just prototype/koog. for prototype agents there's some overlap in token tracking but they serve different purposes (server-side telemetry vs agent-side budget)

@andri-coral andri-coral requested a review from CaelumF April 8, 2026 11:01
@andri-coral
Collaborator Author

andri-coral commented Apr 8, 2026

also do you have any thoughts on the string typing and null safety here? feels maybe brittle but I wonder what you think @andri-coral

improved

@seafraf
Collaborator

seafraf commented Apr 8, 2026

Not a big deal, but could you (for consistency):

  • Replace call.respond instances with throw RouteException
  • Use lowerCamelCase in test names
  • Not use infix notation for kotest validators (e.g. a.shouldBe(b) instead of a shouldBe b)
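The last two asks can be illustrated in one place. A minimal sketch (not code from this PR; the test class and its assertion are hypothetical) showing a lowerCamelCase test name and kotest's shouldBe called as a method rather than infix:

```kotlin
import io.kotest.core.spec.style.FunSpec
import io.kotest.matchers.shouldBe

// Hypothetical test illustrating the requested conventions:
// lowerCamelCase test names, and matchers invoked as method calls.
class LlmProxyStyleExampleTest : FunSpec({
    test("proxyNameIsUppercaseAlphabetic") {
        val name = "MAIN"
        // Method-call form instead of `result shouldBe true`
        name.all { it.isLetter() && it.isUpperCase() }.shouldBe(true)
    }
})
```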

@seafraf
Collaborator

seafraf commented Apr 8, 2026

Following our conversation yesterday, it seems best if we avoided doing URL construction on the agent side and instead communicated via the coral-agent.toml what the agent would like to see for the CORAL_PROXY_URL environment variable.

To allow the agent to request proxy endpoints in a specific format, I am thinking they could specify it like this:

[[llm.proxies]]
name = "MAIN"
format = "openai"
model = "gpt-4.1"

[[llm.proxies]]
name = "ANTHROPIC"
format = "anthropic"
model = "claude-something"

The agent would then be launched with:

CORAL_PROXY_URL_MAIN = https://<address>:<port>/api/v1/llm-proxy/<secret>/openai/...
CORAL_PROXY_URL_ANTHROPIC = https://<address>:<port>/api/v1/llm-proxy/<secret>/anthropic/...

name would have to be unique, alphabetic and uppercase
format would be openai/anthropic/something else in the future
model would allow the agent to express a desired model, but in the future we may allow the server owner (and maybe even the app dev) to configure an override

If an agent has no proxies listed, no proxy URL will be passed. To discourage people from omitting this proxies list, a warning should be placed in the marketplace view for an agent (which will also be available for all agents in coral console in the future) stating that the agent does not support LLM proxying and this will disable certain features (e.g. telemetry).
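The convention above can be sketched server-side. This is a hypothetical illustration, not code from this PR: ProxyConfig, buildProxyEnv, and the exact URL shape are assumptions based on the naming rules and example URLs described in this comment.

```kotlin
// Sketch: validate [[llm.proxies]] entries and derive the
// CORAL_PROXY_URL_<NAME> environment variables for agent launch.
data class ProxyConfig(val name: String, val format: String, val model: String)

// "unique, alphabetic and uppercase" from the proposal
private val NAME_PATTERN = Regex("[A-Z]+")

fun buildProxyEnv(
    proxies: List<ProxyConfig>,
    baseUrl: String, // assumed: https://<address>:<port>/api/v1/llm-proxy
    secret: String,
): Map<String, String> {
    require(proxies.map { it.name }.toSet().size == proxies.size) {
        "proxy names must be unique"
    }
    return proxies.associate { proxy ->
        require(NAME_PATTERN.matches(proxy.name)) {
            "proxy name '${proxy.name}' must be uppercase alphabetic"
        }
        "CORAL_PROXY_URL_${proxy.name}" to "$baseUrl/$secret/${proxy.format}"
    }
}
```

An agent with no entries simply yields an empty map, matching the "no proxy URL will be passed" behaviour.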

@andri-coral andri-coral changed the base branch from feat/prototype-runtime to master April 9, 2026 15:50
Collaborator

@CaelumF CaelumF left a comment


looks good, let's go! :shipit:

@andri-coral andri-coral merged commit b557ced into master Apr 16, 2026
2 checks passed
