I’m relatively new to Docker and, after an initial period of trial and error and other frustrating steps, I’m now quite confident in building images with a Dockerfile and orchestrating two or more Dockerfiles with docker-compose.
I’m here to try to understand, at a high level, whether what I’m trying to achieve is actually doable or not:
I want to create a pipeline with Jenkins which retrieves my Java code from a Git repo; builds, tests and packages the application; then runs the docker-compose.yaml file that orchestrates the containerization of my app together with a custom MySQL Dockerfile.
What I’ve done so far is put together a reliable docker-compose.yaml file which builds images from Dockerfiles and then runs those containers together. Everything works as expected.
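For context, here’s a simplified sketch of the kind of compose file I’m using (service names, build paths, hostnames and the network name are illustrative, not my real config):

```yaml
# Illustrative docker-compose.yaml: one Spring Boot app + MySQL,
# attached to an external network that Traefik also joins.
version: "3.8"

services:
  app:
    build: ./app              # Dockerfile that copies in the packaged JAR
    depends_on:
      - db
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.app.rule=Host(`app.example.com`)"
    networks:
      - web
      - internal

  db:
    build: ./mysql            # custom MySQL Dockerfile
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder; use secrets in practice
    networks:
      - internal

networks:
  web:
    external: true            # pre-created network shared with Traefik
  internal:
```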
So far I have a cloud server with only Docker running on it; Docker has two “major” (pardon my terminology) containers always running: one with Traefik (which acts as a reverse proxy) and another with Jenkins. Every time I need to push a new application to the cloud (mainly a Spring Boot JAR application in one container and MySQL in another, both organized by a docker-compose file), I copy the JAR file over and run the properly configured docker-compose file.
Is it feasible? I mean, speaking at a high level, is it actually possible to tell Jenkins (or any other CI tool) to run docker-compose after it has packaged the application, and to run all those containers on the same Docker host (so Traefik can easily route the traffic)?
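Concretely, I imagine the pipeline looking something like this — just a sketch, where the repo URL, branch, Maven wrapper and stage names are placeholders, and I’m assuming the Jenkins container can reach the host’s Docker daemon (e.g. via a mounted /var/run/docker.sock):

```groovy
// Illustrative declarative Jenkinsfile; everything here is a placeholder.
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://example.com/my/repo.git', branch: 'main'
            }
        }
        stage('Build, Test & Package') {
            steps {
                sh './mvnw clean package'   // produces the application JAR
            }
        }
        stage('Deploy') {
            steps {
                // Rebuild the images and restart the stack on the same host,
                // so Traefik (on the shared network) picks up the new container.
                sh 'docker-compose up -d --build'
            }
        }
    }
}
```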
Hope it’s clear enough to open a discussion.
Many thanks to all of you.
I’ve specified the technologies I’m using just to make everything clearer, but my question is not strictly tied to this particular stack.