Ubuntu 16.04 and Docker v1.12.1
In my setup there is a container populating a SQLite db (in /tmp/mydb.db3), and a service in a Swarm should use this db. The db changes a few times a day.
What do you think is the best way to do this?
I was thinking about using volumes, which works well for sharing the db between the various instances inside the local Docker instance, but how should I sync it with the other workers in the Swarm? I mean, what is the right way?
Thank you in advance,
In general in Docker, it’s much easier to share things by communicating across the network than by trying to share files. That’s doubly true if multiple hosts can be involved.
I’d probably just add a “real” database to your setup (if it were me, postgresql:9.5, but mariadb and mysql are also readily available) and work with the single shared database. That largely (*) gets around the shared-network-volume-driver problem. If you do wind up with multiple processes trying to update the database at once, SQLite has a pretty blunt concurrency model and you might get better performance from something else.
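To illustrate what I mean by “blunt”: SQLite allows only one writer at a time, so a second connection trying to start a write transaction just errors out once its busy timeout expires. A minimal sketch (throwaway temp file, short timeout just for the demo):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "mydb.db3")

# first connection takes the write lock and holds it
writer = sqlite3.connect(path, timeout=0.1, isolation_level=None)
writer.execute("CREATE TABLE t (x INTEGER)")
writer.execute("BEGIN IMMEDIATE")
writer.execute("INSERT INTO t VALUES (1)")

# a second writer can't get in; after the 0.1s busy timeout it fails
other = sqlite3.connect(path, timeout=0.1, isolation_level=None)
locked = False
try:
    other.execute("BEGIN IMMEDIATE")
except sqlite3.OperationalError:
    locked = True  # "database is locked"

writer.execute("COMMIT")
print(locked)  # -> True
```

A client/server database would instead queue or interleave the two writers.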
(*) (I don’t know where that database’s data is stored or if Swarm will try to move the container around the cluster; but all of the standard database images declare VOLUME in their Dockerfiles for their underlying storage and hopefully Swarm does something sensible here.)
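One way to keep the data in a predictable place on 1.12 is to pin the database service to a single node and give it a named volume. A rough sketch with `docker service create` (the network, node, and password names are my own examples, not from your setup):

```
# an overlay network the services share
docker network create --driver overlay mynet

# run postgres as a single-replica service, constrained to one node,
# with a named volume so its data stays on that node
docker service create --name db --network mynet --replicas 1 \
  --constraint 'node.hostname == worker1' \
  --mount type=volume,source=pgdata,target=/var/lib/postgresql/data \
  -e POSTGRES_PASSWORD=example \
  postgres:9.5
```

Your app service on the same overlay network can then reach it by the name `db`.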
I agree with you that a database in a container would be better than shared files, but the app is not mine and was designed around this SQLite file. For now I’m working around it by mounting the file as a volume and rsyncing it between the various workers. I was wondering if there is a best practice for this use case.
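Concretely, the workaround looks roughly like this on each worker (paths from my setup, “manager” is just a placeholder for whichever node writes the db):

```
# mount the db file into the container
docker run -d -v /tmp/mydb.db3:/tmp/mydb.db3 myapp

# cron entry pulling the latest copy a few times a day
0 */6 * * * rsync -az manager:/tmp/mydb.db3 /tmp/mydb.db3
```

It works, but it feels fragile, hence the question.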
Thank you very much for the answer.