Use case: multiple Docker containers of the same image running a command concurrently

Hello,

I defined a Docker Linux image that runs a complex command and then exits. It is important that it exits after each execution; I want it that way.

The system receives requests to execute this command through the Docker container. There can be multiple concurrent requests, and the command can take a few seconds to run, let’s say up to 10s. To handle the concurrency, I want a number of containers based on the same image that can run this command at the same time, let’s say 3.

So, I can have this scenario:

  • request 1 -> run on container 1
  • request 2 -> run on container 2
  • request 3 -> run on container 3
  • request 4 -> wait until one of the 3 containers finishes
  • request 1 is done
  • request 4 -> run on container 1
  • request 2 is done
  • request 3 is done
  • request 5 -> run on container 2
  • request 4 is done
  • request 5 is done.

Is there anything in the Docker family that can help with this use case, or should I simply handle it myself? It’s not that hard: I would define a fixed number of containers, and the app would keep track of which ones are running and dispatch a request as soon as a container is “free”.
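For reference, the self-managed approach can be reduced to a bounded semaphore that caps concurrency at the pool size. A minimal Python sketch, where `run_command` is a stand-in for the actual `docker run` invocation (names and the simulated sleep are illustrative, not from the original post):

```python
import threading
import time

POOL_SIZE = 3                        # number of containers allowed to run at once
pool = threading.BoundedSemaphore(POOL_SIZE)

active = 0                           # bookkeeping just to observe concurrency
peak = 0
lock = threading.Lock()

def run_command(request_id):
    """Stand-in for `docker run my-image ...`; replace the sleep with a subprocess call."""
    global active, peak
    with pool:                       # blocks until one of the 3 "containers" is free
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.1)              # simulate the command taking a few seconds
        with lock:
            active -= 1

# Five concurrent requests, as in the scenario above
threads = [threading.Thread(target=run_command, args=(i,)) for i in range(1, 6)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(peak)                          # never exceeds POOL_SIZE
```

Request 4 and 5 simply block on the semaphore until an earlier request releases it, which is exactly the "wait until one of the 3 containers finishes" step.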

Thank you!

A more elegant way would be to have a queue (be it RabbitMQ or anything like it).
Have your apps submit jobs to this queue, and have your worker containers listen to it, picking up the next job once the previous one is done, or waiting for the next request.
This way you won’t have to handle the containers yourself, and only 3 jobs will run at a time. Besides, if you want to change this limit to 4 (or 10, or 2), all you’ll need to do is scale the worker service (instead of changing your code).
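The pattern can be sketched with Python’s standard library standing in for RabbitMQ and the worker containers (the queue, worker count, and job timings are illustrative): requests go into a shared queue, a fixed number of workers drain it, and the concurrency limit is simply the number of workers.

```python
import queue
import threading
import time

NUM_WORKERS = 3                      # scale this instead of changing code
jobs = queue.Queue()
done = []                            # list.append is atomic in CPython

def worker():
    while True:
        request_id = jobs.get()      # blocks until a job is available
        if request_id is None:       # sentinel: shut this worker down
            jobs.task_done()
            return
        time.sleep(0.05)             # stand-in for running the command in a container
        done.append(request_id)
        jobs.task_done()

workers = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for w in workers:
    w.start()

for request_id in range(1, 6):       # requests 1..5 from the scenario above
    jobs.put(request_id)

jobs.join()                          # wait until all 5 requests are processed
for _ in workers:
    jobs.put(None)                   # one sentinel per worker
for w in workers:
    w.join()

print(sorted(done))                  # all 5 requests completed
```

With RabbitMQ, the same shape applies: each worker container consumes with a prefetch of 1, so at most `NUM_WORKERS` jobs run at once, and scaling the worker service changes the limit.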

Thank you for your message. I was thinking of using a queuing product as well (RabbitMQ sounds like a very good option) — you confirmed my thoughts.