Conversation
Left some comments. I wonder if we should leave more things to koog rather than doing it ourselves? We could reuse a lot of their testing and fixing that way. Also, do you have any thoughts on the string typing and null safety here? It feels maybe brittle, but I wonder what you think @andri-coral
The proxy serves all runtimes (Python, Docker, etc.), not just prototype/koog. For prototype agents there is some overlap in token tracking, but the two serve different purposes (server-side telemetry vs. agent-side budget).
Improved.
Not a big deal, but could you (for consistency):
Following our conversation yesterday, it seems best if we avoided doing URL construction on the agent side and instead communicated via the … To allow the agent to request proxy endpoints in a specific format, I am thinking they could specify it like this:

```toml
[[llm.proxies]]
name = "MAIN"
format = "openai"
model = "gpt-4.1"

[[llm.proxies]]
name = "ANTHROPIC"
format = "anthropic"
model = "claude-something"
```

The agent would then be launched with:
If an agent has no proxies listed, no proxy URL will be passed. To discourage people from leaving this proxies list unconfigured, a warning should be placed in the marketplace view for an agent (which will also be available for all agents in the Coral console in the future) stating that the agent does not support LLM proxying and that this disables certain features (e.g. telemetry).
Co-authored-by: Séafra <Forder>
No description provided.