I am currently configuring my containers' logging so that it goes to Splunk.
At the moment there are basically two options:
1. json-file logging driver for all containers
a) logfiles for all containers are available in subdirectories of /var/lib/docker in JSON format, and Splunk can read them from the filesystem
b) the problem with this approach is that on the Splunk level I can only use a wildcard like /var/lib/docker/containers/*/*-json.log, so Splunk would read the logfiles for all containers, while I would like to send log data to Splunk only for chosen containers
c) in a dynamic infrastructure where containers move between physical servers (we use Swarm for cluster management) I cannot point Splunk at the logfile of a chosen container only, because if the container moves to another host, the host and log path will change
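For reference, the json-file log location for a single container can be resolved with docker inspect (here `myapp` is just a placeholder container name; this needs the docker CLI and a running daemon):

```shell
# Print the json-file log path for a container named "myapp"
# (placeholder name); the path lives under /var/lib/docker/containers
# and contains the container ID, which changes on every redeploy.
docker inspect --format '{{.LogPath}}' myapp
```

This illustrates the problem in c): the path embeds the container ID and the host it runs on, so it cannot be configured statically in Splunk.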
2. splunk logging driver:
a) uses the URL of Splunk's HTTP Event Collector (HEC), which is not available in the Splunk version we use, and the Splunk update will not happen soon (it is owned by another team)
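Once HEC is available, sending a chosen container's logs directly would look roughly like this (the URL and token below are placeholders, not real values):

```shell
# Run a chosen container with the splunk logging driver.
# splunk-url and splunk-token are required options of the driver;
# the values shown here are placeholders.
docker run -d \
  --log-driver=splunk \
  --log-opt splunk-url=https://splunk.example.com:8088 \
  --log-opt splunk-token=00000000-0000-0000-0000-000000000000 \
  my-image
```

Containers started without `--log-driver` would keep using the daemon's default driver (json-file), so the split between "chosen" and "rest" happens per container.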
So overall I think that once the Splunk team updates Splunk to a version with the Event Collector, I will use the splunk driver for chosen containers and json-file (the default) for the rest. But until the update happens, I wonder whether there is a way to separate the JSON logs for a subset of containers so that they are stored in a different location than /var/lib/docker/containers. Or do you see another solution?
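One possible interim workaround, sketched here as an assumption rather than a tested setup: since the json-file driver always writes under /var/lib/docker/containers, the chosen containers could instead use the syslog logging driver pointed at a syslog endpoint that Splunk already monitors, while all other containers keep the default json-file driver. The address below is a placeholder:

```shell
# Interim idea: route only the chosen containers via syslog to an
# endpoint Splunk watches; everything else stays on json-file.
# syslog.example.com is a placeholder for your syslog/Splunk forwarder.
docker run -d \
  --log-driver=syslog \
  --log-opt syslog-address=udp://syslog.example.com:514 \
  --log-opt tag="{{.Name}}" \
  my-image
```

The `tag` option stamps each message with the container name, which keeps the logs attributable even when Swarm reschedules the container onto another host.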