I just recently started learning about Docker, mostly out of a desire to improve my company's efficiency. I work for a web development company that builds custom web applications for clients. All of the sites operate off a common base and are then customized to fit each client's needs.
Right now, we have about six servers, each hosting around 50-60 sites. Each server runs Windows Server 2012 and has its own versions of PHP/MySQL. Getting them to update PHP and MySQL is like pulling teeth. As you can imagine, this can be very frustrating as a developer. It also doesn’t help that we don’t manage these servers; they are maintained by our parent company’s IT.
We would also like to switch to Linux to host our apps, but the IT group is unfamiliar with Linux and seemingly unwilling to learn.
My hope was that we could use Docker to deploy our apps in the environment we want. That way we could make sure all of our apps are running the same versions of PHP/MySQL, and we could start (yes, I said start) using tools like Composer. In our current setup, we have to go through a vetting process to get something like Composer installed, and even then there’s no guarantee they will have the bandwidth to do it.
So, TL;DR: is Docker the solution to our problem? Would it make sense to install Docker on all of the servers and deploy several apps/sites in the same container?
Thanks for your time.
PS I want to stress that I am a Docker noob, so be gentle
PPS My company is sending a couple of us to DockerCon2018, so maybe some of my questions will be answered there.
If your IT group is not willing to migrate to it, then don’t. There’s no business value at that point. Docker adds quite a bit of complexity to the whole setup, and it pushes more of the operations work closer to the developer groups.
With that in mind, the cost of running the operation may be higher than any benefit you’d get, especially since a DevOps resource is likely more expensive for you to sustain.
Keeping versions in sync and up to date gets more complicated, since you have to manage not only the middleware but also the OS layer the container is built on, if it is a custom-built image. Otherwise, you need a good, trusted source for the images you are using.
You could split each client’s site into three parts: the common (PHP) base; their customizations (installed applications, static page content); and any data that goes with it (the content in MySQL). You could then build one base image, build a per-site image on top of it, and deploy that with some persistent storage.
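As a rough sketch of that layering (the image names, paths, and PHP base tag here are all placeholders I’m inventing to illustrate the idea, not something from your setup):

```dockerfile
# Shared base image, built once and pushed to your registry
# (tag and extension list are assumptions):
FROM php:7.2-apache
RUN docker-php-ext-install pdo_mysql
COPY common/ /var/www/html/

# --- Then each client's repo would have its own tiny Dockerfile ---
# FROM mycompany/php-base:1.0
# COPY customizations/ /var/www/html/
```

The point is that a version bump (PHP, an extension, Composer) happens once in the base image, and each site picks it up when its image is rebuilt.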
You would need some workflow for rebuilding and redeploying the per-site images when their content changes, and if the underlying base image changes, you’d need to be able to rebuild everything. From your point of view, infrastructure and database versions would be managed per application rather than per server, and you’d be able to take an update for just one site without affecting anything else.
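For the “persistent storage” part, one way to wire up a single site is a Compose file along these lines (service names, image tags, and the password are placeholders, and in practice you’d manage credentials properly rather than inline):

```yaml
version: "3"
services:
  web:
    image: mycompany/client-a-site:1.0   # the per-site image built on the common base
    ports:
      - "8080:80"
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: changeme      # placeholder only
    volumes:
      - client-a-data:/var/lib/mysql     # named volume so data survives redeploys
volumes:
  client-a-data:
```

Rebuilding and redeploying one site then doesn’t touch any other site’s containers or data.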
That all having been said, it sounds like a very different deployment model than what you currently have, and so you’d have to identify a real problem with your current setup that adopting Docker could solve for you.
In my experience, having a solid handle on Linux and shell scripting fundamentals is a prerequisite for doing anything with Docker. That having been said, I know there are Windows-native containers these days and I see posts around here mentioning PowerShell, so if you have expertise with the Windows command-line tools that could work.