I’ve always wondered how Docker works in this regard: should I use as many separate “RUN apt-get install” commands as possible, or as few RUN commands as possible, given that each RUN increases the number of layers?
RUN apt-get update
RUN apt-get install -y python-qt4
RUN apt-get install -y python-pyside
RUN apt-get install -y python-pip
RUN apt-get install -y python3-pip
RUN apt-get install -y python3-pyqt5
Is there any reason to prefer either of these approaches when setting up a Dockerfile, building an image, and pushing it to Docker Hub?
Multiple RUN apt-get install lines create many extra layers (not necessarily harmful, though there is a limit), prevent you from effectively cleaning up the intermediate *.deb files and package lists, and make builds take longer, since APT has a non-trivial startup time for each invocation.
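For illustration, the usual way to avoid both problems is to fold the update, the installs, and the cleanup into a single RUN, so the downloaded package lists never end up committed in a layer. A sketch using the packages from the question:

```dockerfile
# One layer: update, install, and clean up together, so the APT
# package lists are removed before the layer is committed.
RUN apt-get update \
 && apt-get install -y --no-install-recommends \
      python-qt4 \
      python-pyside \
      python-pip \
      python3-pip \
      python3-pyqt5 \
 && rm -rf /var/lib/apt/lists/*
```

Combining apt-get update with the install in the same RUN also avoids a stale layer cache serving an outdated package index to a later install.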
The only time I’d suggest separate RUN apt-get install lines is when you’re not yet sure what run-time dependencies your application has and you’re frequently editing the list of packages during development. In that case, if you add another RUN line at the end of the list, the standard docker build layer caching will skip all of the previous ones and save you some download/unpack time. But once you’ve got it working, I’d fold it all into a single command.
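A sketch of that development-time pattern (the appended package is hypothetical, just to show where a new line would go):

```dockerfile
RUN apt-get update
RUN apt-get install -y python-qt4
RUN apt-get install -y python-pyside
RUN apt-get install -y python-pip
RUN apt-get install -y python3-pip
RUN apt-get install -y python3-pyqt5
# Newly added while iterating: only this layer is rebuilt;
# all of the lines above come from the build cache.
RUN apt-get install -y python3-requests
```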