Does a Docker image exist for OpenDevin with Ollama integration?

Hi everyone,

I’m currently trying to set up OpenDevin with Ollama (specifically with local models like mistral or codellama). I have OpenDevin running via Docker, but I haven’t found any documentation or a Docker image that natively supports Ollama as an LLM provider.
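
For context, this is roughly how I’m launching it (the image name, tag, and port are what I took from the OpenDevin README when I set this up, so treat them as placeholders for whatever your setup uses):

# --add-host lets the container reach services on the host via
# host.docker.internal (needed on Linux; Docker Desktop adds it automatically)
docker run -it --rm \
    --add-host host.docker.internal:host-gateway \
    -v /var/run/docker.sock:/var/run/docker.sock \
    --env-file .env \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main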

In the web interface, the available providers are OpenAI, Anthropic, and so on, but there’s no option for Ollama. I did add the following to my .env file:

MODEL_PROVIDER=ollama
OLLAMA_BASE_URL=http://host.docker.internal:11434

Still no luck: Ollama doesn’t show up in the frontend. Am I missing a build step, or is Ollama support only partial or still experimental?
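
In case it’s a networking issue, this is the check I’ve been using to confirm the container can reach Ollama at all (the container name "opendevin" is just a placeholder for yours, and the second command assumes curl exists inside the image):

# from the host: confirm Ollama is up and the models are pulled
curl http://localhost:11434/api/tags

# from inside the OpenDevin container: confirm host.docker.internal
# resolves and Ollama answers ("opendevin" = your container name)
docker exec -it opendevin curl http://host.docker.internal:11434/api/tags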

So, my questions:

  1. Is there any prebuilt Docker image or fork of OpenDevin that comes with Ollama support fully integrated?
  2. If not, what steps would you recommend for manually adding Ollama support to an existing OpenDevin setup? (My current guess is sketched below.)
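
On question 2, my working guess (from reading that OpenDevin routes model calls through LiteLLM, so please correct me if that’s wrong) is that Ollama is addressed with an ollama/ model prefix rather than a separate provider setting, i.e. something like this in .env:

# guessed LiteLLM-style settings; variable names unverified on my end
LLM_MODEL="ollama/mistral"
LLM_BASE_URL="http://host.docker.internal:11434"
LLM_API_KEY="na"    # dummy value; Ollama doesn't check API keys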

I’m fairly comfortable with Docker and Linux but still learning the internals of OpenDevin and LLM integrations, so any guidance or working examples would be really appreciated!

Thanks in advance 🙏

This is more of an application question than a Docker question, so you’ll have better luck asking in the OpenDevin community (the project’s GitHub repository, for example).