I'm using Docker 1.10.3 (a legacy system I have to work with at the moment). I export an image by its tag with
docker save -o myarchive.tar <repository>:<tag>
Then I copy that archive to a number of machines that are set up (supposedly) identically (same Docker version, etc.), and load the image with
docker load -i myarchive.tar
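In case it's relevant: to verify that the archive really carries the same layer content everywhere, I compute each layer's DiffID directly from the tarball. This is a rough sketch, assuming the post-1.10 save layout with a top-level manifest.json (`layer_diff_ids` is just an illustrative name, not a Docker API):

```python
import hashlib
import json
import tarfile

def layer_diff_ids(archive_path):
    """Return the sha256 DiffIDs of the layers in a `docker save` archive.

    Assumes the archive layout used since Docker 1.10: a top-level
    manifest.json listing layer tarballs (e.g. "<id>/layer.tar").
    A layer's DiffID is the sha256 of its *uncompressed* tarball.
    """
    with tarfile.open(archive_path) as tar:
        manifest = json.load(tar.extractfile("manifest.json"))
        diff_ids = []
        # manifest is a list with one entry per saved image
        for layer_path in manifest[0]["Layers"]:
            h = hashlib.sha256()
            layer = tar.extractfile(layer_path)
            # hash in chunks so large layers don't load fully into memory
            for chunk in iter(lambda: layer.read(1 << 20), b""):
                h.update(chunk)
            diff_ids.append("sha256:" + h.hexdigest())
    return diff_ids
```

If I understand the content-addressable storage correctly, these values should line up with what the daemon reports for the loaded image (on versions whose `docker inspect` output includes a RootFS section).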
After this, I assumed that if I run
docker pull <repository>:<tag>
it would find that it already has everything and not actually pull anything. On some machines that is indeed the case, but on others it matches a few layers and then re-pulls a whole bunch of the remaining ones.
Why does this happen? Is there a way to transfer images from one machine to another so that
docker pull always finds everything up to date? What are the requirements for that to happen?
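For what it's worth, this is how I compare the layer lists reported on two machines to see where they diverge (the lists come from something like `docker inspect` on each machine, on versions that expose RootFS layers; the helper name is my own):

```python
def first_layer_mismatch(layers_a, layers_b):
    """Return the index of the first differing layer between two ordered
    DiffID lists, or None if the lists are identical."""
    for i, (a, b) in enumerate(zip(layers_a, layers_b)):
        if a != b:
            return i
    if len(layers_a) != len(layers_b):
        # one list is a prefix of the other
        return min(len(layers_a), len(layers_b))
    return None
```

On the machines that re-pull, the divergence starts exactly at the layers that get downloaded again, which is what makes me suspect the layer identities, not the content, are the problem.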