How to access a volume inside a Docker container from outside Docker?

Hi,

I want to run some Spark jobs that should all log to the same directory, so that the history server can pick them up and I can see my jobs. This works great in Docker, but I also want to launch some jobs from outside the “cluster” and have the log output from my non-Docker client (where my IDE is running) end up in this Docker directory as well.

How can I access a volume from a Docker container on my local machine?

Full docker-compose.yml here: https://github.com/marianokamp/neartime/blob/master/e2e/docker/docker-compose.yml

But here is the logfiles service (a) that I then use in other services (b).

(a)
logfiles:
    image: alpine
    volumes:
      - /tmp/spark-events

(b) 
worker:
    ..
    volumes_from:
      - logfiles
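
One common way to make the directory visible outside Docker (a sketch, assuming the host path /tmp/spark-events is acceptable) is to turn the anonymous volume into a host bind mount, which is writable from both sides:

```yaml
logfiles:
    image: alpine
    volumes:
      # host_path:container_path - same directory on both sides
      - /tmp/spark-events:/tmp/spark-events
```

With that, the non-Docker client can simply write to /tmp/spark-events on the host, and every service using volumes_from sees the same files.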

I googled the issue, but all I could find was how to expose a local directory on my host machine to the Docker containers, whereas I want to do it the other way round.
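
That said, an anonymous volume's data does live somewhere on the Docker host, and on Linux you can locate it (a sketch; the container name logfiles_1 is an assumption based on compose's default naming):

```shell
# Print the host-side source path of each mount in the container
docker inspect -f '{{ range .Mounts }}{{ .Source }}{{ "\n" }}{{ end }}' logfiles_1
```

Caveat: on macOS or Windows that path is inside the Docker VM, not on the laptop's own filesystem, so a bind mount is usually the more practical route there.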

As you can see I am new to Docker. Maybe I am looking at this the wrong way.

The alternative I see is creating a new container that mounts my local IDE's working directories and then running the dev code inside the container. That way I could use the same procedure as above to get access to the shared log directory.
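
That alternative could look something like this in compose (a sketch; the service name, image, and ./project path are placeholders, not from the original setup):

```yaml
devclient:
    image: alpine            # placeholder; would be an image with the dev toolchain
    volumes:
      - ./project:/project   # mount the IDE's working directory into the container
    volumes_from:
      - logfiles             # same shared log volume as the workers
```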

Would copying the log files to/from the container work?
I also thought shared folders were two-way.
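
Copying would work as a one-off snapshot, but note it is one-way per invocation (a sketch; worker_1 is an assumed container name):

```shell
# Copy the event logs out of a running container into the current directory
docker cp worker_1:/tmp/spark-events ./spark-events

# And back in, if needed (trailing /. copies the directory's contents)
docker cp ./spark-events/. worker_1:/tmp/spark-events
```

A bind mount stays in sync continuously in both directions; docker cp is a point-in-time copy you would have to repeat.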

This might also help: