Multiple applications in a single container!


Can I run multiple applications (e.g. a Python web app, MySQL, etc.) in a single Docker container?


You can, but you have to build all of the machinery to keep those processes running yourself. Supervisord seems to be a popular solution.

I think it’s generally considered a best practice to run only one “thing” in a container. There are a couple of reasons for this. One is just that it’s simpler (two docker run commands or a docker-compose.yml file, rather than trying to manage supervisord). Another is that you’ll probably need to routinely update your application, and if the application and the database are in separate containers you don’t need to restart one part just because the other’s changed.
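As a sketch of the two-container approach, a minimal docker-compose.yml might look like the following (the service names, image tag, and password are placeholders, not from this thread):

```yaml
version: "3"
services:
  web:
    build: .              # your Python web app's Dockerfile
    ports:
      - "8000:8000"
    environment:
      DB_HOST: db         # reach the database by its service name
    depends_on:
      - db
  db:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder only
    volumes:
      - db-data:/var/lib/mysql       # persist the database's state
volumes:
  db-data:
```

With a layout like this you can rebuild and restart `web` without touching `db`, which is exactly the update scenario described above.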

If you’re looking at a multi-host setup (like Docker Swarm or Kubernetes) then keeping the database and application separate becomes more important. The database needs some special care; you can’t replicate it and it’s tricky to move it across hosts. If the application is stateless (up to its database connection) you can run multiple copies of it, and you don’t especially care where they run. Those are different enough that you really want them to be separate things so that you can manage them separately.

1 Like

Thanks David for your reply.

Agree with David; one container per service is :metal: and should always be the way you build your containers.


There might be reasons why this won't work, and as David said, supervisord is (for me) the answer.
I just put an example in another post:

(you might need to open that; the formatting is not great here, I can see :joy:)
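For reference, a minimal supervisord.conf along those lines might look like this (the program names and commands are illustrative, not the example from the linked post):

```ini
[supervisord]
nodaemon=true            ; keep supervisord in the foreground as PID 1

[program:webapp]
command=python /app/app.py
autorestart=true

[program:worker]
command=python /app/worker.py
autorestart=true
```

Supervisord then starts both programs and restarts them if they crash, which is the "mechanism to keep all of the processes running" mentioned earlier.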

There was an interesting discussion on StackOverflow on that topic:

I also agree with David. One running application/service per container. David lists some great reasons why to set it up that way. The only remaining question is probably how: how to have the database and the application that needs to connect to it in separate containers. The answer is networking.

You should create a new Docker network and, when starting up both containers, assign them to this network. That way, either container can talk to the other via its Docker container name instead of its IP address, which could change.

docker network create --driver bridge <network_name>

Then, when you run the container, you give it the network to run in.

docker run --network=<network_name> --name=<container_name> <image>:<tag>

Now, in each container started in this network, you can reference the other containers via the container name, and Docker will handle the communication between the containers.
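Concretely, the two steps above might look like this (the network, container, and image names are placeholders):

```shell
# Create a user-defined bridge network; containers on it can resolve
# each other by container name.
docker network create --driver bridge app-net

# Start the database and the application on that network.
docker run -d --network=app-net --name=db -e MYSQL_ROOT_PASSWORD=example mysql:8.0
docker run -d --network=app-net --name=web my-python-app:latest

# Inside the "web" container, the database is now reachable at host "db".
```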

Hope this helps.

What happens when an application spawns a different process? Will that process run inside the original container, or does a separate container need to be involved?

Whatever you execute in the context of your container will run "inside the container". You can start as many background processes as you desire, though be aware that you need to pay attention to handling signals for them yourself. Otherwise a container stop will not stop those processes properly and will kill them hard after the grace period.
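A minimal sketch of that signal handling, in Python (this wrapper and its function names are hypothetical, not from the thread): a parent process that forwards SIGTERM to its background children, so that `docker stop` shuts them down gracefully instead of killing them hard after the grace period.

```python
import signal
import subprocess
import sys

# Children spawned by this wrapper; they need the SIGTERM forwarded to them.
children = []

def start_child(cmd):
    """Launch a background process and track it for signal forwarding."""
    proc = subprocess.Popen(cmd)
    children.append(proc)
    return proc

def forward_signal(signum, frame):
    """Relay a received signal to every child that is still running."""
    for proc in children:
        if proc.poll() is None:
            proc.send_signal(signum)

# docker stop delivers SIGTERM to PID 1; install the forwarding handler.
signal.signal(signal.SIGTERM, forward_signal)
```

In a real container the wrapper would then start each service with start_child(...) and wait() on them before exiting.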

It is not uncommon to have entrypoint scripts that apply your environment variables to configuration files before the main service is started.
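Such an entrypoint script might look roughly like this (the paths, the `__DB_HOST__` placeholder, and the DB_HOST variable are illustrative assumptions, not from the thread):

```shell
#!/bin/sh
# Hypothetical entrypoint sketch: render environment variables into a
# config file before handing off to the main service.
: "${DB_HOST:=localhost}"
CONFIG_TEMPLATE="${CONFIG_TEMPLATE:-/app/config.tmpl}"
CONFIG_OUT="${CONFIG_OUT:-/app/config.ini}"

# Substitute the placeholder with the runtime value, if a template exists.
if [ -f "$CONFIG_TEMPLATE" ]; then
  sed "s|__DB_HOST__|${DB_HOST}|g" "$CONFIG_TEMPLATE" > "$CONFIG_OUT"
fi

# exec replaces the shell, so the service becomes PID 1 and receives signals.
exec "$@"
```

The `exec "$@"` at the end matters for the signal discussion above: without it, the shell stays as PID 1 and the real service never sees the SIGTERM from `docker stop`.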

Best practice recommends running only one concern per Docker container, as others in this thread have also stated. Separating the applications and services into multiple containers gives you better scalability, availability, recovery, and rolling updates. But there are cases where you might want to run multiple concerns in a single container. Docker does have documentation on Running multiple services in a container at this URL:
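As a sketch of what that documentation describes, a Dockerfile for a supervisord-managed multi-service container might look like this (the base image and file names are placeholders):

```dockerfile
FROM python:3.11-slim

# Install the process supervisor that will keep both services running.
RUN pip install --no-cache-dir supervisor

# Copy the application code and the supervisord configuration.
COPY . /app
COPY supervisord.conf /etc/supervisord.conf

# supervisord runs in the foreground as PID 1 and manages the services.
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
```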

1 Like

How can I run a Python script with MySQL or MariaDB in a Docker container on a Raspberry Pi 3 B (Stretch, armhf)?
docker pull mysql (not working)
docker pull mariadb (not working)
Any ideas???
I want to start from scratch, using a Dockerfile or docker-compose.yml.