Docker Community Forums

Share and learn in the Docker community.

How to improve this container?

Hello everyone,

I am quite new to Docker, so please bear with me on this one:

I have the following container up and running and it suits my purposes quite nicely. I am, however, concerned with what I’ve done, as I’ve read that installing applications and software inside a running container is considered bad practice. The mantra, as far as I understand it, goes along the lines of “one container for one application”. In my case I’ve taken an RStudio container (rocker/rstudio) from the hub and installed the other software I need (python, tensorflow) inside it. As far as I understand, this is not the proper way of going about it. Can someone with more experience explain how they would construct this container? I am especially interested in the possibility of using the tensorflow GPU CUDA Docker image.

So far I’ve tried putting everything in a docker-compose file, but this didn’t make each image available from compose. I’ve also tried wiring them together over a network, but that still didn’t produce the result I’m looking for. Just for clarification — the goal is to have all of the separate images working and communicating with each other.


The way you would do it is to create a Dockerfile, which is basically a recipe for a Docker image.

So, for example, you define in the Dockerfile that the base image is rocker/rstudio, like this (just a basic text file named: Dockerfile):

FROM rocker/rstudio:latest
RUN the-command-you-used-to-install-those-things

Then build your image:

docker build -t myimage .
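
A fuller version of that Dockerfile might look like this — a minimal sketch, assuming you installed python and tensorflow via apt and pip (the exact package names and versions are an assumption; adjust them to whatever commands you actually ran inside the container):

```dockerfile
# Sketch only: package names below are assumptions, not the OP's exact commands.
FROM rocker/rstudio:latest

# Install system-level python (assumed apt packages)
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Install python libraries (assumed pip packages)
RUN pip3 install tensorflow
```

The point is that every install step lives in the Dockerfile, so the image can always be rebuilt from scratch instead of carrying changes you made by hand inside a running container.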

Now you can run your new image instead of the standard rstudio one. So if you ran the container like this:

docker run --rm -p 8787:8787 -e PASSWORD=yourpasswordhere rocker/rstudio

you can now instead do:

docker run --rm -p 8787:8787 -e PASSWORD=yourpasswordhere myimage
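
On the CUDA question: one option, assuming the NVIDIA Container Toolkit is installed on the host, is to pass the GPUs through at run time with the `--gpus` flag (this is a sketch of the run command only; the image would still need a GPU-enabled tensorflow build inside it):

```
docker run --rm --gpus all -p 8787:8787 -e PASSWORD=yourpasswordhere myimage
```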

And if you ever need to make changes to the image, you just edit the Dockerfile, rebuild, and run again :)

Hi terpz, thanks for your reply and your interest in my question. I have indeed gone about it in the very same way you outline — please find below a schematic illustration of the container.

As you can see, it’s the rstudio base, then some libraries are installed, then python, some more libraries, then tensorflow. This container is then linked through a network to two other containers: the database in one and the client in the other. My question, then, is: is the left container properly configured, or should each layer be its own container? From your message I gather that what I’ve done is not so wrong after all?

I think it’s fine. I also think it would be too complex if you decided to split it up, if that’s even possible.

The “problem” is that some people tend to stack too much into a container, e.g. MySQL, Apache, PHP, and phpMyAdmin all in one container, which makes no sense and is terrible to maintain. Will it work? Sure, but it’s not the Docker way :)
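
For the multi-container part (rstudio + database + client talking to each other), a minimal docker-compose.yml sketch might look like this — the service names, the database and client images, and the network name are all assumptions standing in for your actual setup:

```yaml
# Sketch only: images and names below are placeholders/assumptions.
services:
  rstudio:
    build: .              # the Dockerfile from the earlier reply
    ports:
      - "8787:8787"
    environment:
      - PASSWORD=yourpasswordhere
    networks:
      - appnet
  database:
    image: postgres:15    # assumption: swap in your actual database image
    networks:
      - appnet
  client:
    image: your-client-image   # assumption: your client container's image
    networks:
      - appnet

networks:
  appnet:
```

Services on the same compose network can reach each other by service name, so from the rstudio container the database would be reachable at the hostname `database`.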
