Mount EFS in a task

Hi,

I hope someone can help me.

The question is: where is the EFS mounted on the workers? And how can I mount it in a D4AWS task (not a service)? I can only find references about creating services and using volumes with Cloudstor and the like, but I need the simplest possible solution.

I have two cases where I need this point:

I am running JupyterHub connected to Docker Swarm running on Mesos. Jupyter can start notebooks on the Swarm/Mesos cluster and everything works fine. The problem is that Docker Swarm's Mesos support is deprecated, so we need to upgrade the setup.

I'm trying to use D4AWS as the new backend for JupyterHub and I have a question about EFS. In my current setup I have an EFS file system mounted on every node at a specific path, /efs. When I start the notebooks I bind-mount /efs into the container, so when a user saves their files everything is stored on the EFS and nothing is lost. Basically:
docker run -v /efs/user:/home/user/work jupyter-notebook…

I don't want to build a new spawner; I just want to use the same DockerSpawner class. I have been able to run notebooks on the same server, on remote servers, and on the Docker cluster, so I guess it can be done on D4AWS.
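
The closest thing I've come up with is skipping the host mount entirely and letting Docker mount the EFS itself through the local volume driver's NFS support. Something like this (untested on D4AWS; fs-12345678.efs.us-east-1.amazonaws.com is a placeholder for the real EFS DNS name, and efs-vol is just a name I made up):

# Create a named volume backed by the EFS file system over NFS
# (EFS needs NFSv4.1, hence the nfsvers option)
docker volume create --driver local \
  --opt type=nfs \
  --opt o=addr=fs-12345678.efs.us-east-1.amazonaws.com,rw,nfsvers=4.1 \
  --opt device=:/ \
  efs-vol

# Then use the named volume instead of a host path
docker run -v efs-vol:/home/user/work jupyter-notebook…

Would DockerSpawner be able to use a named volume like that instead of a host path, and is that even the right approach on D4AWS?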

Another case: if I want to back up the whole EFS into an S3 bucket, I would like to run an AWS CLI container like
docker run -v /myefsmount:/efs infraestructureascode/aws-cli:latest aws s3 sync /efs s3://myefsbackup
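
If that named-volume trick above is valid, I guess the backup could use it too instead of a host path. Again untested; efs-vol is the volume sketched above, and the credentials would have to come from the node's instance role or from environment variables:

# Back up the EFS-backed volume to S3 (assumes S3 access via instance role)
docker run --rm -v efs-vol:/efs infraestructureascode/aws-cli:latest aws s3 sync /efs s3://myefsbackup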

That would give me an easy way to back up the whole EFS. I could back everything up, upgrade the cluster, and then restore the backup to the new cluster without risking the loss of any files on the EFS.
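
And the restore after the upgrade would presumably just be the reverse sync into a freshly created volume on the new cluster:

# Restore from S3 into the (re-created) EFS-backed volume
docker run --rm -v efs-vol:/efs infraestructureascode/aws-cli:latest aws s3 sync s3://myefsbackup /efs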

Could someone shed some light on this, please?

Regards,
Guimo