I’m trying to use `docker-compose` to run a Python script in one container that populates a database in a separate container. My problem is that the script launches before the database is ready to accept connections. Is there a way to avoid this and still use `docker-compose`?

My other alternative is to create a shell script that fires each of the `docker` commands serially, but I would rather use `docker-compose` if possible.
Here is the `docker-compose.yml`:
```yaml
etl:
  build: ./etl
  links:
    - mysql

mysql:
  image: mariadb
  environment:
    MYSQL_DATABASE: my_db
    MYSQL_ROOT_PASSWORD: a_password
```
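One possible approach, assuming you can move to a newer Compose file format (2.1+ or the current Compose specification), is to declare a `healthcheck` on the database service and have the ETL service wait on it via `depends_on` with `condition: service_healthy`. This is a sketch, not a drop-in replacement for the v1 file above: the service names and credentials are carried over from it, and the healthcheck command assumes the `mariadb` image ships `mysqladmin`:

```yaml
services:
  etl:
    build: ./etl
    depends_on:
      mysql:
        condition: service_healthy   # do not start etl until the healthcheck passes
  mysql:
    image: mariadb
    environment:
      MYSQL_DATABASE: my_db
      MYSQL_ROOT_PASSWORD: a_password
    healthcheck:
      # mysqladmin ping succeeds only once the server accepts connections
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost", "-pa_password"]
      interval: 5s
      retries: 10
```

With this in place, `docker-compose up` holds back the `etl` container until MariaDB reports healthy, which is exactly the ordering the question asks for.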
Here’s my work-around shell script:
```bash
#!/bin/bash
docker run --name mariadb -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mariadb:latest
docker build -t etl ./etl
docker run -it --rm --name my-etl --link mariadb:mysql etl
```

(Note the `--name` flag on the last `docker run`; a single-dash `-name` is rejected by current Docker clients.)
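Even the serial shell script has the same race: `docker run -d mariadb` returns as soon as the container starts, not when MariaDB is listening. A small retry helper can gate the final `docker run` until a probe command succeeds. This is a minimal sketch; the probe shown in the comment (`mysqladmin ping` inside the `mariadb` container) is an assumption about the image, so substitute whatever readiness check fits your setup:

```shell
#!/bin/sh
# retry ATTEMPTS DELAY CMD...  -- run CMD until it succeeds, up to ATTEMPTS
# times, sleeping DELAY seconds between tries; exit status 1 on give-up.
retry() {
  attempts=$1; shift
  delay=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    if [ "$i" -ge "$attempts" ]; then
      echo "retry: command failed after $attempts attempts" >&2
      return 1
    fi
    sleep "$delay"
  done
}

# Hypothetical usage in the work-around script, before launching the ETL:
# retry 30 2 docker exec mariadb mysqladmin ping -h localhost -pmy-secret-pw
```

The same helper works with any probe (e.g. `nc -z host 3306`), so the ETL container only starts once the database actually accepts connections.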