So, I know the preferred practice is to run a single application/service/executable per container, and generally to keep a separation of concerns where possible.
I’m not sure where GitLab fits into this, because it bundles quite a bit of stuff under the hood. I’m also in a position where I’d like to extend it with SubGit (a utility that maintains a mirror between an SVN repo and a Git repo).
I can successfully install the SubGit prerequisites (`default-jre`), download the SubGit installer, and add it to my running gitlab container – but obviously all of this is lost across a `docker-compose down`/`up`. How should I be maintaining this integration?
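For concreteness, the direction I assume would make this survive a compose cycle is a small custom image built on top of gitlab-ce – something like this sketch (the SubGit installer step is a placeholder; I'd still have to fetch and unpack the actual archive):

```dockerfile
# Sketch: extend the official image so the JRE (and eventually SubGit)
# is baked in and survives docker-compose down/up.
FROM gitlab/gitlab-ce:latest

RUN apt-get update \
 && apt-get install -y --no-install-recommends default-jre \
 && rm -rf /var/lib/apt/lists/*

# Placeholder: the SubGit installer would be COPYed/unpacked here.
# COPY subgit-x.y.z.zip /tmp/
```

Then docker-compose would point at this image instead of `gitlab/gitlab-ce` directly – but I'm not sure if that's the idiomatic answer here.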
I tried creating a second container alongside gitlab, but I started to worry about sharing gitlab’s git repo data folder with another container, for fear the two might conflict. SubGit wants to run as the ‘git’ user in the gitlab container, it kicks off a daemon process to handle git pushes, and it polls the SVN repo for changes every 60 seconds.
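To illustrate what I mean by the second-container approach, here's roughly the compose wiring I was experimenting with (service, image, and volume names are made up for the example):

```yaml
# Sketch: a sidecar sharing GitLab's repo data – the setup I'm unsure about.
services:
  gitlab:
    image: gitlab/gitlab-ce:latest
    volumes:
      - gitlab-data:/var/opt/gitlab
  subgit:
    image: subgit-mirror:local     # hypothetical image with default-jre + SubGit
    volumes:
      - gitlab-data:/var/opt/gitlab   # same repo data mounted twice – my conflict worry
volumes:
  gitlab-data:
```

Both containers would see the same bare repos, which is exactly why I'm nervous about concurrent writes.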
I also considered running `docker commit` on my gitlab container once I have installed `default-jre` and set up gitlab – but does this mean I would miss out on future updates to the gitlab-ce image?
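For clarity, this is the `docker commit` flow I mean (container and image names are placeholders from my experiment; the commit itself is commented out since it's the step I'm questioning):

```shell
# Sketch of the docker commit approach – freeze the modified container
# into a new image, then point docker-compose at it.
container=gitlab             # placeholder: name of the running gitlab container
snapshot=gitlab-ce-subgit    # placeholder: tag for the frozen image

# The actual step (not run here):
#   docker commit "$container" "$snapshot"
# ...then set image: "$snapshot" in docker-compose.yml.
echo "would run: docker commit $container $snapshot"
```

My understanding is that the snapshot is pinned to whatever gitlab-ce version it was created from, which is what worries me about upgrades.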
And finally, I considered adding a shell script that checks whether SubGit is installed and installs it if not, and then finding a way to hook this into the gitlab container’s startup. This seems slow – every time I restart the gitlab-ce container it would have to redo the SubGit setup – although in practice that should rarely happen.
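The startup hook I have in mind would look roughly like this (a sketch only – the install steps are placeholders for fetching the real SubGit installer, and the command name to probe is parameterised):

```shell
#!/bin/sh
# Sketch: idempotent startup hook – only install if the command is missing.
ensure_installed() {
  # $1: command to check for ("subgit" in the real container)
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 present, skipping install"
  else
    echo "$1 missing, installing"
    # Placeholder install steps, e.g.:
    #   apt-get update && apt-get install -y default-jre
    #   ...download and unpack the SubGit installer here...
  fi
}

ensure_installed subgit
```

Since the check short-circuits when the binary already exists, repeated restarts would only pay the install cost once per fresh container filesystem.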
Anyway, I would really appreciate any insight into the best practice for approaching this problem. I’m still relatively new to Docker, so I might be missing a great option or approach entirely!