My use case is that I want an easy way to set up Postgres and its various extensions (PostGIS being the most important), with the data persisting between container launches. I've found the easiest solution to be running Postgres.app at the version my apps use.
In the past, I've tried using shared volumes to persist the data, but Postgres has strict rules about the ownership of its data directory, and I could never make them work with a host mount. I've tried configurations from host -> vm -> container down to just vm -> container, and none worked as expected.
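For reference, the shape of the setup I kept fighting looks roughly like the first command below, and the named-volume variant the official postgres image docs suggest is the second. This is a sketch of the problem, not my final setup; the image tag, password, and paths are placeholders:

```sh
# Bind-mounting a host directory is where the ownership fight happens:
# initdb insists the data directory be owned by the postgres user, and
# ownership changes often don't survive the host -> vm shared filesystem.
docker run -d --name pg \
  -e POSTGRES_PASSWORD=devpassword \
  -v "$HOME/pgdata":/var/lib/postgresql/data \
  -p 5432:5432 \
  postgis/postgis:15-3.4

# As an alternative, a Docker-managed named volume sidesteps that,
# since ownership inside the volume is handled by the image's entrypoint:
docker volume create pgdata
docker run -d --name pg \
  -e POSTGRES_PASSWORD=devpassword \
  -v pgdata:/var/lib/postgresql/data \
  -p 5432:5432 \
  postgis/postgis:15-3.4
```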
Making this a bit more complicated, some of the containers need to share data in the database: they use the same tables, and the database is their integration point. That's far from ideal, and frankly a bad design, but it's what I have to work with at the moment.
Then there is the issue of tooling. It's straightforward to dump a database remotely and reload it locally, and there's plenty of documentation on how. On top of that, most of my data is on Heroku, and they provide a command for exactly that scenario which just works. Adding Docker to the mix makes it that much more complicated and leaves me to figure it out, and I'm no sysadmin or database admin.
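The Heroku command in question is pg:pull, which dumps the remote database and loads it into a local one in a single step. Doing the same against a container by hand would look roughly like the pipeline below; the container, database, and app names are placeholders, and this is a sketch rather than a recipe:

```sh
# With Postgres running on the host (e.g. Postgres.app), Heroku does it all:
heroku pg:pull DATABASE_URL my_local_db --app my-heroku-app

# Against a containerized Postgres, the rough hand-rolled equivalent:
pg_dump "$REMOTE_DATABASE_URL" -Fc -f latest.dump   # custom-format dump
docker exec pg createdb -U postgres my_local_db      # target db must exist
docker exec -i pg pg_restore -U postgres -d my_local_db \
  --no-owner < latest.dump                           # restore via stdin
```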
Just to top it off, the database I need to have on hand is ~25GB, so doing a fresh load on every container startup isn't practical.
Finally, whatever solution I land on needs to be easy for my entire team to replicate so they can get their work done as well.
So really, it came down to this: do I spend an unknown amount of time setting up Postgres in Docker (when I don't manage databases in production; I pay someone to deal with that pain), or do I spend ten minutes downloading Postgres.app, running the Heroku command they provide, and pointing the containers at the host machine?
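That last step, containers hitting the host's Postgres, is the one Docker-specific detail left. On a reasonably recent Docker for Mac, containers can reach the host through the special hostname host.docker.internal; a minimal sketch, with the image name and credentials as placeholders:

```sh
# Postgres.app listens on localhost:5432 on the host. From inside a
# container on Docker for Mac, host.docker.internal resolves back to
# the host, so the app only needs DATABASE_URL pointed at it.
docker run --rm \
  -e DATABASE_URL=postgres://myuser@host.docker.internal:5432/my_local_db \
  my-app-image
```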