We are developing an application where media files are created and later read to compose new artifacts from those files. Right now we store the files in S3, but we need to fetch many files to create each artifact, so processing speed depends heavily on the connection speed to S3. We actually need to fetch 30+ files per artifact creation, and the sheer number of GET requests slows things down.
We do have a local cache of the files: we check whether a file is in the local cache, and only download it if it isn't.
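For context, our cache-then-download logic looks roughly like this (a sketch, not our exact code; the actual S3 call is stubbed behind a `fetch_fn` callable, which in practice would be something like boto3's `s3.download_file(bucket, key, dest)`, and the cache directory and key-to-filename mapping are simplified placeholders):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def fetch_all(keys, cache_dir, fetch_fn, max_workers=8):
    """Return local paths for all keys, downloading cache misses in parallel.

    fetch_fn(key, dest_path) downloads a single object, e.g. a thin
    wrapper around boto3's s3.download_file(bucket, key, dest_path).
    """
    os.makedirs(cache_dir, exist_ok=True)
    # Simplified key -> filename mapping; real code should avoid collisions.
    paths = {k: os.path.join(cache_dir, k.replace("/", "_")) for k in keys}
    misses = [k for k, p in paths.items() if not os.path.exists(p)]
    # Issue the GETs concurrently so total latency is closer to the
    # slowest single request than to the sum of all 30+ of them.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        list(pool.map(lambda k: fetch_fn(k, paths[k]), misses))
    return [paths[k] for k in keys]
```

Parallelizing the misses already helps a lot with the per-request latency, but it doesn't solve the cold-cache problem on a fresh node, which is what the question below is about.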
But I want to see if there are any other options. Is there any way to share data between Docker Cloud nodes? I assume that `volumes_from` only works when the containers are on the same node, or does it work across nodes too?
Any ideas? Someone must have dealt with a similar deployment issue before.