How to link Ollama with my app in Docker

I have built an app that uses Llama 3.1 through Ollama. It works fine on my laptop, where Ollama is installed natively, but when I dockerize it with Docker Compose, both containers start, yet my app cannot reach the Ollama container and I get this error:
```
2024-10-29 01:48:43 streamlit_app-1 | requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate/api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f7e42d76f80>: Failed to establish a new connection: [Errno 111] Connection refused'))
```
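For reference, the app calls Ollama's HTTP API roughly like this (a minimal sketch, not the exact code; the real app goes through LangChain, and `OLLAMA_BASE_URL` is a placeholder name introduced here). Note the doubled `/api/generate/api/generate` in the traceback, which suggests the path is being appended to a base URL that already ends in `/api/generate`:

```python
import os
import requests

# Placeholder: inside the Compose network, "localhost" refers to the
# streamlit_app container itself, so the base URL must point at the
# "ollama" service name instead (e.g. http://ollama:11434).
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

def generate(prompt: str) -> str:
    """Call Ollama's /api/generate endpoint and return the response text."""
    resp = requests.post(
        f"{OLLAMA_BASE_URL}/api/generate",
        json={"model": "llama3.1", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```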

Also, when I run `docker ps`, it shows my Ollama container as unhealthy:

```
79db655beaed   ollama/ollama   "/bin/ollama serve"   11 minutes ago   Up 11 minutes (unhealthy)   0.0.0.0:11434->11434/tcp   dockerize-ollama-1
```
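One thing worth noting about the healthcheck: the stock `ollama/ollama` image does not appear to ship `curl`, so a `CMD`-style curl healthcheck can fail (and mark the container unhealthy) even when the server is fine. A sketch of an alternative that relies only on the bundled CLI:

```yml
healthcheck:
  # "ollama list" talks to the local server and exits non-zero if it
  # is unreachable, so it works without curl being installed
  test: ["CMD", "ollama", "list"]
  interval: 30s
  timeout: 10s
  retries: 5
```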

My docker-compose.yml:

```yml
version: "3"

services:
  streamlit_app:
    build: ./  # Assuming your Dockerfile for Streamlit is in the current directory
    ports:
      - "8501:8501"  # Streamlit's default port
    volumes:
      - ./:/app  # Map the current directory to /app in the container
    networks:
      - tutorial-net
    environment:
      - LANGCHAIN_TRACING_V2=true
      - LANGCHAIN_API_KEY=${LANGCHAIN_API_KEY}  # Assuming you have this in a .env file
    depends_on:
      - ollama  # Ensures Ollama starts before the Streamlit app
    entrypoint: ["streamlit", "run", "locallama.py", "--server.port=8501", "--server.enableCORS=false"]

  ollama:
    image: ollama/ollama  # Assuming you already have an Ollama image
    ports:
      - "11434:11434"  # Expose Ollama's API port
    volumes:
      - tutorial-vol:/ollama
    networks:
      - tutorial-net
    command: serve  # Ensure Ollama is running its service in the container
    healthcheck:
      test: ["CMD", "curl", "-f", "http://host.docker.internal:11434"]
      interval: 30s
      timeout: 10s
      retries: 5

networks:
  tutorial-net:
    driver: bridge

volumes:
  tutorial-vol:
    driver: local
```
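For completeness, one way to wire the two containers together without hardcoding hosts would be to hand the app the Ollama URL through the environment (a sketch that pairs with the `OLLAMA_BASE_URL` placeholder above; the hostname `ollama` resolves via Compose's DNS on `tutorial-net`):

```yml
services:
  streamlit_app:
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # service name, not localhost
```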
