Describe the bug
I run into this issue whenever I try to chat with my agent using the Ollama config.
My config (default):

It works in the terminal but not in the desktop app.
ollama serve is running and the model is loaded.
Reproduce
- Download Ollama
- Run ollama serve
- Download the desktop app, follow the onboarding, and enable Ollama in it
- Try to chat with the agent
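Before opening the desktop app, it may help to confirm the Ollama server is actually reachable over HTTP (this is a diagnostic sketch, assuming Ollama's default listen address of localhost:11434; it is not part of the original report):

```shell
# Start the Ollama server in the background (default: 127.0.0.1:11434)
ollama serve &

# Ask the API which models are available; a JSON response here
# means the server is up and the desktop app should be able to reach it
curl http://localhost:11434/api/tags
```

If curl succeeds but the desktop app still cannot connect, that points at the app's Ollama configuration rather than the server itself.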
Expected behavior
Being able to chat with my agent from the desktop app.
Screenshots
No response
Open Interpreter version
0.4.3
Python version
3.12.6
Operating System name and version
macOS Tahoe
Additional context
No response