Docker Community Forums

Share and learn in the Docker community.

"shim-log.json: no such file or directory" How to diagnose root cause?

I’m new to docker and have a system running Ubuntu 16.04 with the following docker version info:

Client:
 Version:           18.06.1-ce
 API version:       1.27 (downgraded from 1.38)
 Go version:        go1.10.4
 Git commit:        e68fc7a
 Built:             Fri Jan 25 14:33:54 2019
 OS/Arch:           linux/amd64
 Experimental:      false

Server:
 Engine:
  Version:          17.03.2-ce
  API version:      1.27 (minimum version 1.12)
  Go version:       go1.6.2
  Git commit:       f5ec1e2
  Built:            Thu Jul  5 23:07:48 2018
  OS/Arch:          linux/amd64
  Experimental:     false

After a few deployments of new docker images, I start seeing the following error:

root@docker-int01:/# docker start dbutils-clean_files
Error response from daemon: open /var/run/docker/libcontainerd/containerd/283069b198e046105903f32bd846da4af3617aa152c5ff5a8e5cbe3371dcf903/init/shim-log.json: no such file or directory
Error: failed to start containers: dbutils-clean_files

The syslog has the following:

Feb 14 16:55:27 docker-int01 dockerd[1208]: time="2019-02-14T16:55:27.52856161-07:00" level=error msg="containerd: start container" error="open /var/run/docker/libcontainerd/containerd/283069b198e046105903f32bd846da4af3617aa152c5ff5a8e5cbe3371dcf903/init/shim-log.json: no such file or directory" id=283069b198e046105903f32bd846da4af3617aa152c5ff5a8e5cbe3371dcf903
Feb 14 16:55:27 docker-int01 dockerd[1208]: time="2019-02-14T16:55:27.530140334-07:00" level=error msg="stream copy error: reading from a closed fifo\ngithub.com/docker/docker/vendor/github.com/tonistiigi/fifo.(*fifo).Read\n\t/build/docker.io-_kdaEJ/docker.io-17.03.2/.gopath/src/github.com/docker/docker/vendor/github.com/tonistiigi/fifo/fifo.go:142\nbufio.(*Reader).fill\n\t/usr/lib/go-1.6/src/bufio/bufio.go:97\nbufio.(*Reader).WriteTo\n\t/usr/lib/go-1.6/src/bufio/bufio.go:471\nio.copyBuffer\n\t/usr/lib/go-1.6/src/io/io.go:370\nio.Copy\n\t/usr/lib/go-1.6/src/io/io.go:350\ngithub.com/docker/docker/pkg/pools.Copy\n\t/build/docker.io-_kdaEJ/docker.io-17.03.2/.gopath/src/github.com/docker/docker/pkg/pools/pools.go:60\ngithub.com/docker/docker/container/stream.(*Config).CopyToPipe.func1.1\n\t/build/docker.io-_kdaEJ/docker.io-17.03.2/.gopath/src/github.com/docker/docker/container/stream/streams.go:119\nruntime.goexit\n\t/usr/lib/go-1.6/src/runtime/asm_amd64.s:1998"
Feb 14 16:55:27 docker-int01 dockerd[1208]: time="2019-02-14T16:55:27.531638394-07:00" level=error msg="stream copy error: reading from a closed fifo\ngithub.com/docker/docker/vendor/github.com/tonistiigi/fifo.(*fifo).Read\n\t/build/docker.io-_kdaEJ/docker.io-17.03.2/.gopath/src/github.com/docker/docker/vendor/github.com/tonistiigi/fifo/fifo.go:142\nbufio.(*Reader).fill\n\t/usr/lib/go-1.6/src/bufio/bufio.go:97\nbufio.(*Reader).WriteTo\n\t/usr/lib/go-1.6/src/bufio/bufio.go:471\nio.copyBuffer\n\t/usr/lib/go-1.6/src/io/io.go:370\nio.Copy\n\t/usr/lib/go-1.6/src/io/io.go:350\ngithub.com/docker/docker/pkg/pools.Copy\n\t/build/docker.io-_kdaEJ/docker.io-17.03.2/.gopath/src/github.com/docker/docker/pkg/pools/pools.go:60\ngithub.com/docker/docker/container/stream.(*Config).CopyToPipe.func1.1\n\t/build/docker.io-_kdaEJ/docker.io-17.03.2/.gopath/src/github.com/docker/docker/container/stream/streams.go:119\nruntime.goexit\n\t/usr/lib/go-1.6/src/runtime/asm_amd64.s:1998"
Feb 14 16:55:27 docker-int01 dockerd[1208]: time="2019-02-14T16:55:27.531921802-07:00" level=error msg="Create container failed with error: open /var/run/docker/libcontainerd/containerd/283069b198e046105903f32bd846da4af3617aa152c5ff5a8e5cbe3371dcf903/init/shim-log.json: no such file or directory"
Feb 14 16:55:27 docker-int01 kernel: [704836.980340] docker0: port 1(veth5386353) entered disabled state
Feb 14 16:55:27 docker-int01 kernel: [704836.983826] device veth5386353 left promiscuous mode
Feb 14 16:55:27 docker-int01 kernel: [704836.983829] docker0: port 1(veth5386353) entered disabled state
Feb 14 16:55:27 docker-int01 dockerd[1208]: time="2019-02-14T16:55:27.723485623-07:00" level=warning msg="Couldn't run auplink before unmount /var/lib/docker/aufs/mnt/4f9d6c8fdfaf3096776e922d3a4387faaf2da50ada6ce9177bbc17a233b1e886: exec: \"auplink\": executable file not found in $PATH"
Feb 14 16:55:27 docker-int01 dockerd[1208]: time="2019-02-14T16:55:27.752753891-07:00" level=error msg="Handler for POST /v1.27/containers/dbutils-clean_files/start returned error: open /var/run/docker/libcontainerd/containerd/283069b198e046105903f32bd846da4af3617aa152c5ff5a8e5cbe3371dcf903/init/shim-log.json: no such file or directory"
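The path from the daemon error can be checked directly; this is just a sketch that builds the path from the container ID in the error above and reports whether the file is actually there:

```shell
# Path components taken from the daemon error message above.
STATE_DIR=/var/run/docker/libcontainerd/containerd
CID=283069b198e046105903f32bd846da4af3617aa152c5ff5a8e5cbe3371dcf903
SHIM_LOG="$STATE_DIR/$CID/init/shim-log.json"

if [ -e "$SHIM_LOG" ]; then
    echo "shim-log.json exists"
else
    echo "shim-log.json missing: $SHIM_LOG"
fi
```

In my case the whole per-container state directory under $STATE_DIR is gone, not just the JSON file.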

I can’t seem to find any previous issues that help narrow down a fix.

Has anyone seen this before and found a fix or workaround?

Thanks!

I have the same problem… Did you find the root cause for this?

Hi,

Could you please restart the Docker daemon:

systemctl restart docker

Then let me know if you are still facing the same issue.

I just had this issue when trying to run any docker image. I'm not sure what my Ubuntu installation has done, but both the docker and docker.io packages were installed. I simply uninstalled both and then reinstalled docker.io.

You can check whether your server has both docker packages installed with the following command:

sudo apt list --installed | grep docker

This should list only one package (my output showed two installed).
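As a rough sketch, that check boils down to counting installed docker engine packages in the `apt list --installed` output; the sample lines below are illustrative (the versions are made up), not real output from my machine:

```shell
# Count installed packages named docker, docker.io, or docker-ce;
# a count above 1 indicates the conflicting-install situation.
count_docker_pkgs() {
    grep -cE '^(docker|docker\.io|docker-ce)/'
}

# Illustrative input in the `apt list --installed` format (versions made up):
sample='docker/xenial,now 1.2.3 amd64 [installed]
docker.io/xenial-updates,now 18.06.1-0ubuntu1.2~16.04.1 amd64 [installed]'

printf '%s\n' "$sample" | count_docker_pkgs
```

With two matching lines, the pipeline prints 2, which is the "two packages installed" case I hit.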

# Remove both packages
sudo apt-get remove docker docker.io

# Reinstall docker.io package
sudo apt-get install docker.io

This fixed the issue for me, and the apt list --installed output now lists only docker.io:

docker.io/xenial-updates,xenial-security,now 18.06.1-0ubuntu1.2~16.04.1 amd64 [installed]

Restarting docker always seems to fix it (until it happens again), but I still don’t know why it happens.

I checked my nodes and I definitely had both packages. I've done as suggested: removed both and reinstalled docker.io. I'll report back once I know whether this fixed the issue.

Same problem here, and I cannot afford to restart the Docker daemon… waiting for a solution.