What's the best approach?

Let’s say the site gets busy and needs another node. Should I have the Dockerfile build a base image (CentOS 6.5, nginx, PHP-FPM) first and then download the PHP script onto it? Or should I just upload an image with everything on it and pull one huge file for each instance?

There isn’t much difference between the two approaches in terms of total data downloaded.

The Docker image you build with a Dockerfile (and push/pull) is made of layers, starting with the base layer (in your case, centos:6.5); each instruction in the Dockerfile adds another layer on top.
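
For example, a hypothetical Dockerfile along these lines (the repo setup, file names, and paths are placeholders, not your actual config) produces one layer per instruction:

```dockerfile
# Base layer - shared by every image built FROM it
FROM centos:6.5

# Package layer (assumes EPEL or a similar repo carries nginx;
# this is a sketch, not a tested build)
RUN yum install -y epel-release && \
    yum install -y nginx php-fpm && \
    yum clean all

# Config layer (hypothetical file name)
COPY nginx.conf /etc/nginx/nginx.conf

# Application code gets its own layer, so a code change only
# rebuilds from this instruction onward; the package layer
# above is reused as-is
COPY app/ /usr/share/nginx/html/

EXPOSE 80

# A real image would also need php-fpm running (e.g. via a
# supervisor); this sketch only illustrates the layering
CMD ["nginx", "-g", "daemon off;"]
```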

Layers that are shared between images are only downloaded once, so if you build your images from a common base and distribute them through a Docker registry (or the Docker Hub), each node only has to fetch the layers it doesn’t already have.
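
In practice that looks like this on a node (registry and image names are made up):

```shell
# First pull fetches every layer, including the centos:6.5 base
docker pull registry.example.com/mysite:1.0

# Pulling an updated image later only downloads the layers that
# changed; unchanged base layers are skipped as already present
docker pull registry.example.com/mysite:1.1
```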

Also, by pulling a pre-built image from a registry, you know that each node is running the same version of the same code. If your Dockerfile pulls libraries from the internet at build time, rebuilding on each node may not produce identical nodes.
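
So the workflow would roughly be (a sketch, with placeholder names and tags): build and push once, then pull that exact tag on every node:

```shell
# On a build machine: build once, tag with a version, push
docker build -t registry.example.com/mysite:1.0 .
docker push registry.example.com/mysite:1.0

# On each node: pull and run the exact same tag, so every
# node runs an identical image
docker pull registry.example.com/mysite:1.0
docker run -d -p 80:80 registry.example.com/mysite:1.0
```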

To my mind, using a registry and pulling images to the nodes is the best way to manage deployment :smile: