TL;DR: I want to make it easier to set up functional development environments, and I am happy to implement a solution for it if one doesn’t already exist.
I want to decrease the time it takes for developers to set up their development environment using Docker. For that to work, I need a reliable way to synchronize local source code changes into a running Docker container. We use Macs, so we are on boot2docker, but it would be even better if the solution could also sync files to a remote host (e.g. Docker Cloud), so the developer doesn’t have to waste system resources running a local VM. It’s also worth noting that even on local machines, synchronizing files into the VM is rather finicky and clunky to set up right now.
Here’s how the ideal process for a new dev getting started would work from my perspective (a rough command-level sketch follows the list):
- Install Docker & Docker Compose
- Check out source code
- Configure the Docker client to talk to a remote Docker daemon set up by ops in Docker Cloud or AWS
- docker-compose up
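For concreteness, those four steps might look roughly like this on the command line. This is only a sketch: the repository URL and daemon hostname are placeholders, and the remote-daemon step assumes ops exposes the daemon over TLS.

```sh
# Rough sketch of the four steps above; repo URL and hostname are placeholders.
brew install docker docker-compose              # or the official installers

git clone git@example.com:org/app.git
cd app

# Point the local client at the remote daemon set up by ops (assumes TLS is configured).
export DOCKER_HOST=tcp://docker.example.com:2376
export DOCKER_TLS_VERIFY=1

docker-compose up
```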
After the last command, all of the relevant services would start, and the current directory would be mounted as a volume in the remote container, so the code updates automatically whenever it changes on the local filesystem. For ease of use, it’d be perfect if there were some Docker-provided way to watch files on the local filesystem and sync them into volumes inside Docker. I know there are several workarounds (SSHFS, Samba, etc.), but they are all clunky to set up on a per-developer basis and not exactly optimized for this particular use case.
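To illustrate the kind of workaround I mean (and why it feels clunky), here is a minimal watch-and-sync sketch built on fswatch and rsync over SSH. Everything in it is an assumption for the sake of the example: that fswatch and rsync are installed locally, that the remote Docker host is SSH-reachable as `docker-host`, and that the container bind-mounts `/srv/app` from that host.

```sh
# Minimal watch-and-sync sketch; all names below are placeholders.
REMOTE=docker-host:/srv/app

rsync -az --delete ./ "$REMOTE"          # initial full sync of the checkout

# fswatch -o prints one event-count line per batch of filesystem changes.
fswatch -o . | while read -r _; do
  rsync -az --delete ./ "$REMOTE"        # re-sync on every change
done
```

Something like this works, but every developer has to wire it up by hand, which is exactly the per-developer setup cost I’d like to eliminate.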
Does anything like what I’m describing already exist? If not, is it better to build this into Docker itself (as a plugin or similar), or as a separate wrapper tool? My end goal is to make the use case described above happen, and I am happy (eager, even) to implement it.