
Database for a Dockerized application


(Negw) #1

Hi, I’m dockerizing an application that will use Amazon Aurora in both staging and production.
I want to run the tests and do local testing inside the Docker image, and since this is my first time using Docker I’d really appreciate a suggestion from more experienced users about how to set up the database locally. Considering that in production I’ll just pass the credentials for the Aurora instance, is it acceptable to use my local database, or should I use a Docker image? What’s the best practice? Should I use docker-compose?

Thanks in advance,
ngw


(David Maze) #2

Is it acceptable to use your local database? Yes, though getting a consistent host name or IP address for the host from inside a container is tricky.

Should you use a Docker image instead? If you’d like. It is probably easier in a couple of respects if you don’t already have a database lying around.

Make the database location configurable, preferably via an environment variable.
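For example, in a Compose file the connection details can come from the environment, so the same image points at a local container in development and at Aurora in staging/production. This is only a sketch; the variable name DATABASE_URL, the "db" host, and the credentials are illustrative, not anything your app requires:

    # docker-compose.yml fragment (names are hypothetical)
    services:
      app:
        build: .
        environment:
          # For local development this points at a "db" container;
          # in staging/production the same variable would carry the Aurora endpoint instead.
          DATABASE_URL: postgres://app:secret@db:5432/appdb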

If you run the database in Docker, use an unmodified mysql or postgres image, but use the docker run -v or Docker Compose volumes: directive to inject first-boot configuration into the container.

Don’t try to create an image with a preloaded database. But you can inject a database dump into the standard database images and have it restored at first startup.
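A sketch covering the last two points, assuming a Postgres image and a dump file named seed.sql (both are just examples). The official postgres and mysql images run any *.sql or *.sh files they find in /docker-entrypoint-initdb.d the first time the data directory is initialized, so a mounted dump is restored exactly once, at first startup:

    # docker-compose.yml fragment (file and credential names are hypothetical)
    services:
      db:
        image: postgres:15
        environment:
          POSTGRES_USER: app
          POSTGRES_PASSWORD: secret
          POSTGRES_DB: appdb
        volumes:
          # Mounted, not baked into the image; restored on first initialization only.
          - ./seed.sql:/docker-entrypoint-initdb.d/seed.sql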

Use Docker’s built-in DNS service if you run both parts in Docker. Never use docker inspect to find a container’s internal IP address; it is not useful outside of Docker, and it will change if a container is ever deleted and recreated.

Your containers will be deleted eventually. Plan ahead to not lose data when this happens.

Should you use docker-compose? If you’d like. It’s convenient, particularly since running containers tend to need many options, and the one YAML file is a handy place to list all of them. Just remember that it’s not the only game in town: Kubernetes has a different orchestration setup, and since you’re on Amazon, ECS is different again.
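Putting the pieces above together, a minimal docker-compose.yml for local development might look like the sketch below. Service names, credentials, and the seed.sql file are all illustrative; for staging/production you would drop the db service entirely and point DATABASE_URL at the Aurora endpoint instead:

    # docker-compose.yml (sketch for local development only)
    version: "3.8"
    services:
      app:
        build: .
        environment:
          # "db" resolves via Docker's built-in DNS to the database service
          # below -- no docker inspect or hard-coded IP addresses needed.
          DATABASE_URL: postgres://app:secret@db:5432/appdb
        depends_on:
          - db
      db:
        image: postgres:15
        environment:
          POSTGRES_USER: app
          POSTGRES_PASSWORD: secret
          POSTGRES_DB: appdb
        volumes:
          # Named volume so the data survives the container being deleted and recreated.
          - db-data:/var/lib/postgresql/data
          # Dump restored on first initialization, per the earlier point.
          - ./seed.sql:/docker-entrypoint-initdb.d/seed.sql
    volumes:
      db-data: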


(Negw) #3

About the IP, I’m using “docker.for.mac.localhost”.
It’s working pretty well, thanks for the help.
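For reference, that special name resolves to the macOS host from inside a container, so a host-installed database can be used during local development. A sketch, with the variable name and credentials again purely illustrative:

    # docker-compose.yml fragment for local development against a database on the host
    services:
      app:
        build: .
        environment:
          # docker.for.mac.localhost resolves to the macOS host from inside the
          # container (newer Docker Desktop releases also accept host.docker.internal).
          DATABASE_URL: postgres://app:secret@docker.for.mac.localhost:5432/appdb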