Bulk static data

Our application is built on a library that ships a very large amount of static data. Our build process produces a base image, and various parts of our application build on top of it, so there is only one copy of the static data in the system. Still, it results in individual 4 GB images that are operationally difficult to handle. ({{docker save}} winds up being a key part of our workflow, and running {{docker save}}, {{gzip}}, {{scp}}, {{docker load}} on images this size is just painful.)
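For context, our transfer step looks roughly like the following (image name and host are placeholders); streaming the pipeline avoids materializing intermediate files on disk, but the sheer volume is still the bottleneck:

```shell
# Export, compress, copy, and load in one streamed pipeline.
# "app:latest" and "remote-host" stand in for our real names.
docker save app:latest | gzip | ssh remote-host 'gunzip | docker load'
```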

Is there a best practice for dealing with this? The two obvious choices are to keep a single large, self-contained image, or to split the static data out into a volume, so that you must use {{docker run -v}} to run the application. I can also imagine a cleverer path: build a side image whose volume is initialized with the static data, launch it as a data-volume container, and then use {{docker run --volumes-from}} to make that data available to the application containers. Any other suggestions?
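The data-volume-container idea I have in mind would look something like this sketch (all image, container, and path names are hypothetical):

```shell
# 1. Build a small side image that holds only the static data.
#    Its Dockerfile would be roughly:
#      FROM busybox
#      COPY static-data/ /data/
#      VOLUME /data
docker build -t myapp-data ./data-image

# 2. Create (not run) a container from it once; it only needs to
#    exist so its volume can be shared.
docker create --name static-data myapp-data true

# 3. Mount that container's volume into each application container.
docker run --volumes-from static-data myapp
```

The appeal is that the 4 GB of data lives in a volume populated once on each host, while the application images themselves stay small enough to {{docker save}} and ship comfortably.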