I am quite new to Docker, so please bear with me on this one:
I have the following container up and running and it suits my purposes quite nicely. I am, however, concerned about what I’ve done, as I’ve read that installing additional applications and software inside a container is considered bad practice. The mantra, as far as I understand it, goes along the lines of “one container for one application”. In my case I took an RStudio container (rocker/rstudio) from Docker Hub and installed the other software I needed (Python, TensorFlow) inside it. As far as I understand, this is not the proper way to go about it. Could someone with more experience elaborate on how they would construct this setup? I am especially interested in the possibility of using the TensorFlow GPU (CUDA) Docker image.
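For reference, what I did interactively inside the running container is roughly equivalent to this Dockerfile (the version tag and exact package names are my best guess at reproducing it):

```dockerfile
# A sketch of my current setup: rocker/rstudio with Python and
# TensorFlow baked in. Tag and packages are assumptions.
FROM rocker/rstudio

# Install Python and pip via apt, then clean up the apt cache
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        python3 \
        python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Install TensorFlow (CPU-only here; no CUDA in this base image)
RUN pip3 install tensorflow
```

This works, but it means one image carries both RStudio and the Python/TensorFlow stack, which is what I suspect violates the “one container for one application” idea.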
So far I’ve tried putting everything in a docker-compose file, but this didn’t make each image’s software available to the others. I’ve also tried wiring the containers together over a network, but that still didn’t produce the result I’m looking for. Just for clarification: the result I’m after is to have all of the separate images running as containers and communicating with each other.
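My compose attempt looked roughly like this (service names, tags, and the password are placeholders I made up; the GPU block assumes the NVIDIA Container Toolkit is installed on the host):

```yaml
# docker-compose.yml — a sketch of what I tried, not a working setup
services:
  rstudio:
    image: rocker/rstudio
    ports:
      - "8787:8787"
    environment:
      - PASSWORD=changeme   # placeholder

  tensorflow:
    image: tensorflow/tensorflow:latest-gpu
    # GPU access via Compose device reservations;
    # requires nvidia-container-toolkit on the host
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```

As I understand it, services in the same compose project share a default network and can reach each other by service name (e.g. the RStudio container could reach `tensorflow` by that hostname), but that only gives me network connectivity between containers — it doesn’t make TensorFlow callable from R the way it is when everything is installed in one image. That’s the gap I don’t know how to bridge properly.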