Generalized remote file sync

Hi everyone,

TL;DR: I want to make it easier to set up functional development environments, and I am happy to implement a solution for it if one doesn’t already exist.

I want to reduce the time it takes for developers to set up a development environment with Docker. For that to work, I need a reliable way to synchronize local source-code changes into a running Docker container. We use Macs, so we run boot2docker, but it would be even better if the solution could sync files to a remote server (e.g. Docker Cloud), so the developer doesn’t have to waste system resources on maintaining a local VM. It’s also worth noting that even on local machines, synchronization into the VM is finicky and clunky to set up right now.

Here’s how the ideal process for a new dev getting started would work from my perspective:

  1. Install Docker & Docker Compose
  2. Check out source code
  3. Configure docker client to talk to remote docker daemon set up by ops in Docker Cloud or AWS
  4. docker-compose up

After the last command, all of the relevant services would start, and the current directory would be mounted as a volume in the remote container, so the code updates automatically whenever changes happen on the local filesystem. For ease of use, it’d be perfect if there were some Docker-provided way to watch files on the local filesystem and sync them into volumes inside Docker. I know there are several workarounds (SSHFS, Samba, etc.), but they’re all clunky to set up on a per-dev basis and not exactly optimized for this particular use case.
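For what it’s worth, the local bind-mount half of this already works today with a plain `volumes` entry in `docker-compose.yml`; it’s the remote-sync half that has no built-in answer. A minimal sketch (the service name and container path here are assumptions, not from the original post):

```yaml
# docker-compose.yml sketch: mount the checkout into the container.
# Works against a local daemon; against a remote daemon the bind
# mount refers to the *remote* host's filesystem, which is the gap
# this thread is about.
version: "2"
services:
  web:
    build: .
    volumes:
      - .:/usr/src/app   # current directory -> container path
```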

Does anything like what I’m talking about exist? If not, is it better to build this into Docker itself (as a plugin or similar), or should it be a wrapper? My end goal is to make the use case described above happen, and I’m happy (eager, even) to implement it.


I think this has been considered as something the custom FS implementations that Docker for Mac and Docker for Windows (both new projects currently in beta) are introducing may eventually support, but I’m not sure it’s a priority. @mchiang might know more. In fact Kite, the original (way back, primordial-soup) version of Kitematic, was oriented toward exactly this: development plus file syncing on remote servers.

It’s a really, really hard problem, much trickier than it looks at first glance. If you just want your devs to be able to push some code up to a server when they make local modifications, consider rolling some small scripts around filesystem notifications plus rsync. But be prepared to deal with all sorts of fun performance, permissions, and persistence issues.
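To make that suggestion concrete, here’s a minimal sketch of such a script, assuming `fswatch` (e.g. `brew install fswatch` on a Mac) and `rsync` are installed; `SRC` and `DEST` are placeholders, where `DEST` would normally point at a remote rsync target:

```shell
#!/bin/sh
# Watch-and-sync sketch. SRC/DEST default to local temp dirs purely
# for illustration; in practice DEST would be something like
# dev@build-host:/srv/app/src/.
SRC="${SRC:-/tmp/devsync-src/}"
DEST="${DEST:-/tmp/devsync-dst/}"
mkdir -p "$SRC" "$DEST"

sync_once() {
  # -a preserves permissions and times, -z compresses over the wire,
  # --delete makes the destination an exact mirror of the source.
  rsync -az --delete "$SRC" "$DEST"
}

sync_once   # initial full sync

# Re-run the sync on every batch of filesystem events. fswatch -o
# emits one line per batch, so a burst of writes triggers one rsync.
if command -v fswatch >/dev/null 2>&1; then
  fswatch -o "$SRC" | while read -r _; do sync_once; done
fi
```

This is exactly the kind of small wrapper the post above describes; the performance/permissions caveats still apply (e.g. `--delete` will happily mirror a mistaken local deletion to the server).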

Hey Matt,

We should definitely talk. This aligns with what we are trying to do with Docker for Mac and Windows.

Feel free to contact me at

In the past, we’ve tried rsync, unison, NFS, and other solutions, and none of them worked that well.

@nathanleclaire, I agree with everything you said, especially that all the edge cases make this a hard problem. Thanks for pointing me to @mchiang; I’ll sync up with him to see if I can be of assistance.

@mchiang, I’ll follow up by email.

I’m very happy with docker-machine-based development environments running on OS X 10.11, using NFS to share code between the host and the Docker VM. You may have a look at
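For anyone wanting to try the same approach, the core of it is one export on the Mac plus one mount inside the VM. The path and IPs below are hypothetical examples (192.168.99.1/.100 are common docker-machine host-only defaults, but check yours with `docker-machine ip`):

```
# /etc/exports on the Mac (then: sudo nfsd restart)
/Users/me/code -alldirs -mapall=501:20 192.168.99.100

# inside the VM (docker-machine ssh default), mount at the same path
# so container bind mounts resolve identically:
#   sudo mount -t nfs -o noacl,async 192.168.99.1:/Users/me/code /Users/me/code
```

Mounting at the same path on both sides is the trick that lets `docker run -v $(pwd):/app ...` keep working unchanged.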

Regarding Docker for Mac (up to version 1.11.1-beta12):

The custom FS implementation (osxfs) currently is not an option, because it is slow; see “File access in mounted volumes extremely slow, CPU bound”. Using NFS with Moby instead of osxfs would, in my opinion, solve the problem. This is how tools like Vagrant have worked successfully for years. Alternatives like virtio-9p (VirtFS) are available too and may help. I’m curious what the solution will be in the future.

I tried using NFS in Docker for Mac by connecting to the Docker host (see “How can I SSH into the Beta’s MobyLinuxVM”), but it looks like the NFS kernel module is not available in the host, even though, according to the release notes for Beta 11, it should be. I did not dig deeper here.
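In case anyone wants to reproduce that check, these are the standard Linux places to look from a shell inside the Moby VM (nothing Docker-specific; whether `nfs` shows up depends on how the kernel was built):

```shell
# Is NFS support compiled into the kernel (or already loaded)?
grep nfs /proc/filesystems || echo "no NFS filesystem registered"

# Is there a loadable module tree at all? Minimal VM kernels often
# ship without one, in which case nothing can be modprobe'd.
ls "/lib/modules/$(uname -r)" 2>/dev/null || echo "no module tree"
```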