Is Docker a good solution for standardizing build environments?

Hey there,

I currently find myself in a new role as my company’s first DevOps engineer after years as a developer, and I’m looking into how Docker could improve our CI/CD pipeline as well as our developers’ daily workflow.

Our product contains three CPUs, a mix of ARM and PPC architectures, so we need to cross-compile our applications for each of them. Our current solution is a Linux virtual machine that contains all three toolchains. Developers run the VM on their local machines, and it also serves as the base for a build server that runs a build once per day.

This workflow works, but it is slow, and it is a bit annoying to have to either install development tools (editors) inside the VM or copy files between the VM and the Windows host. We also run into the usual disk-space issues with VMs on developers’ machines. And since we are a fairly small company with limited hardware, I’m concerned about how well our build servers will scale.

From what I’ve read, it sounds like Docker could help with some of these issues. However, I don’t see much documentation about using Docker to set up a build environment in which developers can then run variations of ‘make’. Instead, I see a lot of Dockerfiles that build and then run a single application, usually a server. I know it’s possible to get a shell using the “-it” flags to ‘docker run’, but is that idiomatic usage? Or is this a case where a VM is a better fit for the problem?
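
To make the question concrete, here is roughly what I had in mind. This is only a sketch: the Debian/Ubuntu package names for the cross compilers (‘gcc-arm-linux-gnueabihf’, ‘gcc-powerpc-linux-gnu’) and the ‘CROSS_COMPILE’ variable are placeholders for whatever our real toolchains and Makefiles use.

    # Hypothetical build-environment image: install the cross toolchains once.
    # Nothing is ever "run" from this image except the build itself.
    FROM ubuntu:22.04

    RUN apt-get update && apt-get install -y --no-install-recommends \
            make \
            gcc-arm-linux-gnueabihf \
            gcc-powerpc-linux-gnu \
        && rm -rf /var/lib/apt/lists/*

    # Source code is never copied into the image; it gets bind-mounted at run time.
    WORKDIR /src

Developers would then keep their editors on the Windows host and only use the container to compile, something like:

    # Build the image once and share it (registry, or docker save/load).
    docker build -t cross-build .

    # Mount the source tree from the host and run make inside the container.
    docker run --rm -it -v "$PWD":/src cross-build \
        make CROSS_COMPILE=arm-linux-gnueabihf-

The appeal over the VM, as I understand it, is that the image is reproducible from the Dockerfile, takes far less disk space than a full VM, and the same image could back both the developers’ machines and the build server.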