Best practices to dockerize a large set of independent scripts

I use about a hundred Python and shell scripts to manage many aspects of my business. Some of them interact with Gmail, others with Amazon, eBay, and so on.
I'd like to dockerize them, but I don't know exactly what the best way to do it is.

A single image for all of them could save a lot of time and disk space on installing dependencies, but I would lose many of the benefits of using Docker.

An image for each script, instead, would be great for keeping total control, but I guess it would be a nightmare for regular maintenance (how long would it take to update the base image for all of them?).

Thanks for your help!


If it's just scripts that you run manually at random times, I think I would bundle them into one image, so you can run them with, for example: docker run --rm yourimage
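A minimal sketch of that single-image approach (the image name `mybiz-scripts`, the `scripts/` directory, and the script filename are made up for illustration):

```dockerfile
FROM python:3.12-slim

WORKDIR /app

# install the shared dependencies once, for all scripts
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# copy every script into the image
COPY scripts/ ./scripts/
```

Then you pick the script at run time, and `--rm` removes the container when it exits:

```shell
docker build -t mybiz-scripts .
docker run --rm mybiz-scripts python scripts/sync_ebay_orders.py
```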

But if they're scripts that run like daemons, I would split them up into separate images, maybe built from a custom base image if you have something that needs to be shared.
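One way to sketch that shared-base-image layout (all image and file names here are hypothetical):

```dockerfile
# base/Dockerfile — shared dependencies live here, built once as mybiz-base
FROM python:3.12-slim
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
```

```dockerfile
# gmail-watcher/Dockerfile — one daemon per image, inheriting the base
FROM mybiz-base:latest
COPY watch_gmail.py .
CMD ["python", "watch_gmail.py"]
```

Updating the base then means rebuilding `mybiz-base` once and rebuilding each child image on top of it, which is easy to script in a loop or a Makefile.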