What's the best way to synchronize a Python script with a Docker database container?

I’m trying to use docker-compose to run a Python script in one container that populates a database in a separate container. My problem is that the script launches before the database is ready to accept connections. Is there a way to avoid this and still use docker-compose?

My other alternative is to create a shell script that runs each of the docker commands serially, but I would rather use docker-compose if possible.

Here is the docker-compose.yml file:

etl:
  build: ./etl
  links:
    - mysql
mysql:
  image: mariadb
  environment:
    MYSQL_DATABASE: my_db
    MYSQL_ROOT_PASSWORD: a_password

Here’s my work-around shell script:

#!/bin/bash
docker run --name mariadb -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mariadb:latest
docker build -t etl ./etl
docker run -it --rm --name my-etl --link mariadb:mysql etl

Kelsey Hightower wrote a good article about this recently. The gist is that instead of trying to serialize startup, applications should include retry logic so that each component handles its own life cycle and simply waits until its dependencies are available.

Instead of trying to bring up the components in a fixed order, add retry logic like the sketch below to your Python project and use docker-compose normally.
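
Here is a minimal sketch of that retry loop, assuming the etl container uses PyMySQL and can reach the database at the `mysql` link alias from your compose file; the environment variable names and retry counts are just illustrative, so adapt them to however you pass credentials into the etl container:

import os
import sys
import time

import pymysql  # assumes PyMySQL is installed in the etl image


def wait_for_mysql(retries=30, delay=2):
    """Retry the connection until MariaDB accepts it or we give up."""
    for attempt in range(1, retries + 1):
        try:
            return pymysql.connect(
                host=os.environ.get("MYSQL_HOST", "mysql"),   # link alias from docker-compose.yml
                user="root",
                password=os.environ["MYSQL_ROOT_PASSWORD"],   # assumes this is set in the etl environment
                database=os.environ.get("MYSQL_DATABASE", "my_db"),
            )
        except pymysql.err.OperationalError:
            print(f"Database not ready (attempt {attempt}/{retries}), retrying...")
            time.sleep(delay)
    sys.exit("Gave up waiting for the database")


conn = wait_for_mysql()
# ... run the ETL work against conn ...

With this in place it no longer matters which container docker-compose starts first; the script just keeps retrying until MariaDB is ready, and fails loudly only if the database never comes up.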

Hope that helps!