I'm running Apache Airflow in Docker using the official docker-compose setup, and I want to use the `DockerOperator` to launch additional containers via the host's Docker daemon. To do this, I:

- Mounted the Docker socket (`/var/run/docker.sock`) into the Airflow containers.
- Added the `airflow` user to the host's `docker` group (GID 120) via the Dockerfile.

Despite this setup, I'm running into a Permission Denied error when Airflow tries to access the Docker socket through the `DockerOperator`.
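For context, the socket mount is a bind volume in the compose file. A minimal sketch of the relevant fragment, assuming the official compose file's `x-airflow-common` anchor (key names besides `volumes` may differ in your version):

```yaml
# Sketch only: based on the official apache/airflow docker-compose.yaml
x-airflow-common:
  &airflow-common
  build: .
  volumes:
    # Bind-mount the host's Docker socket into every Airflow service
    - /var/run/docker.sock:/var/run/docker.sock
```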
Host Configuration

```
$ ls -l /var/run/docker.sock
srw-rw---- 1 root docker 0 Jul 14 13:49 /var/run/docker.sock

$ getent group docker
docker:x:120:ubuntu
```
Dockerfile Setup

```dockerfile
FROM apache/airflow:2.10.0-python3.11

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

USER root
RUN groupadd -g 120 docker && usermod -aG docker airflow
USER airflow
```
What's Happening

Inside the container:

```
$ id airflow
uid=50000(airflow) gid=0(root) groups=0(root)
```

Even though `getent group docker` inside the container lists `airflow` as a member, the supplementary group is not actually applied to the running process.
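That mismatch is exactly what produces the `PermissionError(13)` below: the kernel compares the socket's owner/group/other bits against the process's effective UID and supplementary groups, and `50000` with groups `(0)` matches none of the `rw` bits on `root:docker`. A rough Python sketch of that check, purely as an illustration (it is not DockerOperator's actual code, and it ignores root's `CAP_DAC_OVERRIDE` for simplicity):

```python
import os
import stat


def can_access_socket(path: str) -> bool:
    """Rough model of the kernel check that yields EACCES (errno 13).

    Compares the owner/group/other read+write bits of `path` against the
    current process's effective UID and supplementary groups.
    """
    st = os.stat(path)
    mode = st.st_mode
    # Owner class: effective UID matches the file's owner.
    if os.geteuid() == st.st_uid:
        return bool(mode & stat.S_IRUSR) and bool(mode & stat.S_IWUSR)
    # Group class: effective GID or any supplementary group matches.
    if st.st_gid == os.getegid() or st.st_gid in os.getgroups():
        return bool(mode & stat.S_IRGRP) and bool(mode & stat.S_IWGRP)
    # Other class: everyone else.
    return bool(mode & stat.S_IROTH) and bool(mode & stat.S_IWOTH)
```

For `srw-rw----` owned by `root:docker`, a process running as `uid=50000` with groups `(0)` falls through to the "other" class, whose bits are `---`, hence the permission error.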
This leads to the following error when running a task:

```
Failed to establish connection to Docker host unix://var/run/docker.sock:
Error while fetching server API version:
('Connection aborted.', PermissionError(13, 'Permission denied'))
```
Workaround That Works (But Is Unsafe)

If I run the following on the host:

```
chmod 666 /var/run/docker.sock
```

then everything works, but this is clearly insecure and not suitable for production.
The Question

How can I securely access the host's Docker socket from inside the Airflow container without running as root or granting overly permissive access (`chmod 666`)?

If anyone has a clean, secure pattern for enabling this, especially in a docker-compose Airflow setup, I'd really appreciate it!

Thanks in advance!