GenAI-Stack - Connection Error with host.docker.internal port 11434

I am trying to run the new GenAI-Stack that uses Streamlit, Neo4j, LangChain, etc. Everything is fine and dandy (loading data, exploring data) until I try to query the database in the chat window.

requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8f69137410>: Failed to establish a new connection: [Errno 111] Connection refused'))

I've done extensive research online and cannot find a solution to this. The most relevant thing I found was to try adding the following parameter to the bot container in the compose YAML:

bot:
  extra_hosts:
    host.docker.internal: host-gateway
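For context, a minimal compose sketch of what that would look like (the service name, image name, and `OLLAMA_BASE_URL` variable here are assumptions based on the error above; adjust them to match your actual GenAI-Stack compose file):

```yaml
# Hypothetical sketch, not the actual GenAI-Stack compose file.
services:
  bot:
    image: genai-stack/bot          # assumed image name
    environment:
      # The app reaches Ollama on the host via this URL (port taken from the error above)
      OLLAMA_BASE_URL: http://host.docker.internal:11434
    extra_hosts:
      # Maps host.docker.internal to the host's gateway IP (needed on Linux)
      host.docker.internal: host-gateway
```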

Unless someone comes here who uses the same software on the same platform trying to connect the same way, you will have to share more details.

  • How are you running the application that tries to connect, and
  • where is the service running to which it needs to connect?
  • Is the exception coming from the client side or server side?

To me, it looks like a Python error. Am I right?

Some additional questions:

  • Can you ping host.docker.internal from the container?
  • On what IP is the service listening?

Note that if you want to use host.docker.internal, the service on the host (assuming it runs on the Windows host) must listen on localhost (127.0.0.1).
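To check from inside the container whether anything is actually listening on that host and port, a small Python probe can reproduce the `[Errno 111]` condition without the rest of the stack (the hostname and port below are just the ones from the traceback):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers ConnectionRefusedError ([Errno 111]) as well as DNS failures
        return False
```

Calling `can_connect("host.docker.internal", 11434)` inside the bot container should return `True` once Ollama is reachable; `False` matches the connection-refused error in the traceback.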

Sharing your compose file or the command you used to run the container can also help us understand the issue. When you share code, terminal output, or logs, please use code blocks. More information in the following guide: How to format your forum posts

IMHO he's trying to run GenAI-Stack with a local AI using Ollama, but I am not sure GenAI-Stack can handle the schema as is. I suspect he should either set up a REST API to expose his Ollama URL in the right format, or use vLLM; otherwise GenAI-Stack is not going to parse his 11434/api/chat URL natively.
It's a stab in the dark, mind you.
(Hunch: maybe using ollama-webui as middleware could solve the problem.)
