Failed to start sshd in container: "Error: Too many open files"

Hello

I'd like to know the cause of, and corrective action for, the following "Too many open files" issue.
Alternatively, please tell me how to investigate it further.

  1. Issue
    Under the following conditions, systemctl fails inside a Docker container:
    1) a certain number (20-25) or more containers are launched, and
    2) the sshd service is activated at startup in each container, and
    3) the container in question was launched beyond that threshold.
    The error message was:

$ docker exec -it hayashi.naru-work24 /bin/bash
[root@35a013c4b396 /]# systemctl start sshd.service
Error: Too many open files
Job for sshd.service canceled.
Containers up to that threshold start sshd without any problem.

  2. Environment
  • host os

# uname -a
Linux gpu07 3.13.0-96-generic #143-Ubuntu SMP Mon Aug 29 20:15:20 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux

  • guest os image
    nvidia/cuda:8.0-cudnn5-devel-centos7

  • docker version

docker -v
Docker version 1.12.6, build 78d1802
(The issue also occurs with version 1.12.1, build 23cf638.)
# docker info
Containers: 367
Running: 24
Paused: 0
Stopped: 343
Images: 1122
Server Version: 1.12.6
Storage Driver: aufs
Root Dir: /data/docker/aufs
Backing Filesystem: extfs
Dirs: 1827
Dirperm1 Supported: false
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: nvidia-docker local
Network: bridge null host overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Security Options: apparmor
Kernel Version: 3.13.0-96-generic
Operating System: Ubuntu 14.04 LTS
OSType: linux
Architecture: x86_64
CPUs: 48
Total Memory: 503.8 GiB
Name: gpu07
ID: RA7P:IZSJ:RLKJ:Y6J3:MO55:MMG6:PT4G:UZNB:PUQS:VVX3:VBMS:Z7OU
Docker Root Dir: /data/docker
Debug Mode (client): false
Debug Mode (server): false
Http Proxy: http://…
Https Proxy: http://…
Registry: https://index.docker.io/v1/
WARNING: No swap limit support
Insecure Registries:
127.0.0.0/8

  3. My investigation
    "Too many open files" maps to errno 24 (EMFILE),
    so I guess some system call is returning errno 24.
    I checked the following files and directories:
  • /proc/{pid}/limits
  • /proc/{pid}/fd

When the issue occurred, no process had a number of open files anywhere near its limit.
I can't find which process is returning errno 24.
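For reference, the two checks above (counting entries in /proc/{pid}/fd and reading /proc/{pid}/limits) can be scripted. This is a rough sketch, not part of the original report; the helper name is made up, and it should be run as root so every process's /proc entries are readable:

```shell
#!/bin/sh
# Compare each process's open-fd count with its soft "Max open files" limit,
# to spot any process approaching errno 24 (EMFILE).
report_fd_usage() {
  for d in /proc/[0-9]*; do
    pid=${d#/proc/}
    # Number of entries in /proc/<pid>/fd = currently open descriptors.
    nfd=$(ls "$d/fd" 2>/dev/null | wc -l)
    # Fourth field of the "Max open files" line is the soft limit.
    limit=$(awk '/Max open files/ {print $4}' "$d/limits" 2>/dev/null)
    [ -n "$limit" ] && printf '%8d %10s  pid=%s\n' "$nfd" "$limit" "$pid"
  done | sort -rn | head   # columns: open fds, soft limit, pid
}
report_fd_usage

# If no single process is near its limit, trace the failing call directly
# (inside the container), e.g.:
#   strace -f -e trace=desc systemctl start sshd.service 2>&1 | grep EMFILE
```

If this still shows no process near its limit, the exhausted resource may be a kernel-wide or per-user limit rather than a per-process one.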

We are using the nvidia-docker plugin. But it does not seem to be related.
Failed to start sshd in container: “Error: Too many open files" · Issue #313 · NVIDIA/nvidia-docker · GitHub

Sorry for my bad English.
Thanks.

Have you solved the above problem?

I have the same problem.

Hi, in the file /lib/systemd/system/docker.service I have this line:

LimitNOFILE=1048576

This is on Ubuntu 16.04 with Docker CE 17.09.

Can you check whether you have this option? If not, please test with it; note that you will need to restart the daemon afterwards.

If that doesn't work, you can also check /etc/sysctl.conf.
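A sketch of how this could be checked and applied (the drop-in path is an illustrative choice; editing the unit file directly, as above, also works):

```shell
# Check the current file-descriptor limit of the running Docker daemon.
awk '/Max open files/' /proc/"$(pidof dockerd)"/limits

# Raise it via a systemd drop-in, then restart the daemon.
sudo mkdir -p /etc/systemd/system/docker.service.d
sudo tee /etc/systemd/system/docker.service.d/limits.conf <<'EOF'
[Service]
LimitNOFILE=1048576
EOF
sudo systemctl daemon-reload
sudo systemctl restart docker

# Kernel-wide/per-user limits worth checking via sysctl: systemd inside each
# container creates inotify instances, and inotify_init() also fails with
# EMFILE ("Too many open files") when the per-user instance limit is reached,
# which would match the "fails only above N containers" symptom.
sysctl fs.inotify.max_user_instances fs.file-max
```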

Regards