I have the following setup:
A Docker image with a complete build environment (I call it the “build” Dockerfile), ~0.8GB in size. The container builds sources that live outside the container: I run docker run/docker start to build, mounting the source trees with --volume. The sources are large Git repositories containing ~1GB of source and binary files.
A couple of Docker images (“production” Dockerfiles), each containing a ~40MB executable. These containers are used in deployment.
Build artefacts, mostly .so files, totalling ~80MB.
Does it make sense to use a multistage Dockerfile in the following fashion (everything is done by a single "docker build … "):
- Run git clone inside of the build container
- Build the sources during the “docker build …”
- Copy the artefacts from the “build” stage to the production images.
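To make the question concrete, here is a minimal sketch of what I mean. The base images, repository URL, paths, and build commands are placeholders, not my actual setup:

```dockerfile
# Build stage: full toolchain (~0.8GB), clone and build inside the image
FROM ubuntu:22.04 AS build
RUN apt-get update && apt-get install -y git build-essential
# git clone inside the build stage instead of mounting sources with --volume
RUN git clone https://example.com/myrepo.git /src
WORKDIR /src
RUN make -j"$(nproc)"   # produces the ~80MB of .so artefacts

# Production stage: only the artefacts are copied over;
# the toolchain and the ~1GB source tree stay behind in the build stage
FROM ubuntu:22.04 AS production
COPY --from=build /src/build/lib/ /usr/local/lib/
COPY --from=build /src/build/bin/app /usr/local/bin/app
CMD ["/usr/local/bin/app"]
```

With several production images, I assume each would be a separate stage targeted via `docker build --target <stage>`.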
What are pros and cons here?
Does “multistage is a best practice” apply here?