Hey guys. I’d like to contribute to this discussion, at least to raise my own concerns, and to point out that it’s not stubbornness holding me back.
I’ve been a fan of Docker from early on, and at my workplace it has made its way into QA testing and experimentation. It is already being considered for production use, but this is where the ‘stable’ argument comes in.
What is stable?
As much as I hate the argument, to many companies ‘stable’ seems to mean “something that’s been around for a long time, isn’t completely broken, or at least has an expected failure rate”. I could give examples here of broken SNMP implementations, Perl 4, etc., but the long and short of it is: we started with Docker 0.8, we’ll die with Docker 0.8.
More reasonable people who don’t live in denial might go for ‘vendor stable’. I think of this as “this set of software is the most tested with the rest”. I do not think it is any more likely to work, and it’s certainly not likely to be more bug-free (no matter how much effort goes into backporting — that magical curse that leaves the software even less tested than its bleeding-edge counterpart).
My problem is not with Docker, or with the fact that my company feels the need to fully regression-test each component on every release, but with Ubuntu having settled on v0.9.1. This has led some people to take the line that 0.9.1 is a version that will be around for a long time, that it is ‘supported’, and that it therefore must be more stable. This affects people like:
- myself, who wants to spend less time running the servers/deployment and more time using the forum (Discourse)
- docker hosting services, where an expected failure rate and predictability are more important than new features
The only way I see this changing is if a subsequent point release of Ubuntu upgrades the Docker version that it ships with.
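For what it’s worth, anyone who doesn’t want to wait on a point release can route around the archive version themselves. A rough sketch, assuming the 14.04 package name `docker.io` and that you’ve added Docker’s own apt repository — the pin priority here is illustrative, not a recommendation:

```shell
# See which docker version the Ubuntu archive is currently offering
apt-cache policy docker.io

# Hold the archive's package back so a package from another
# (e.g. Docker's own) repository wins when one is available.
# Priority 200 is below the default 500, so the archive only
# supplies docker.io when nothing else does.
sudo tee /etc/apt/preferences.d/docker <<'EOF'
Package: docker.io
Pin: release o=Ubuntu
Pin-Priority: 200
EOF

sudo apt-get update
```

That’s a workaround, though, not a fix — it doesn’t change what the distribution itself blesses as ‘supported’, which is the perception problem described above.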