Consistently out of disk space in Docker beta

Same here as well, with just one Ubuntu image. The qcow2 is 56 GB and keeps growing, and the logs are 38 GB.

-rw-r--r-- 1 emineroglu staff 56862834688 May 27 13:44 Docker.qcow2

-rw-r--r-- 1 emineroglu staff 0 May 23 07:52 wtmp
-rw-r--r-- 1 emineroglu staff 6016576 May 27 09:20 messages.0
-rw-r--r-- 1 emineroglu staff 4710879010 May 27 13:43 vsudd.log
-rw-r--r-- 1 emineroglu staff 17894331 May 27 13:43 proxy-vsockd.log
-rw-r--r-- 1 emineroglu staff 20035304835 May 27 13:43 dmesg
-rw-r--r-- 1 emineroglu staff 90835269 May 27 13:43 acpid.log
-rw------- 1 emineroglu staff 15745816615 May 27 13:43 docker.log
-rw-r--r-- 1 emineroglu staff 72449 May 27 13:45 messages

Same for me. It seems related to the latest 1.11.1-beta13, as I didn't have this problem during the few weeks I'd been testing the native Docker for Mac before it.
MacBook-Pro:com.docker.driver.amd64-linux nzelenkov$ ls -lh log/docker.log
-rw------- 1 nzelenkov staff 43G May 27 13:10 log/docker.log
MacBook-Pro:com.docker.driver.amd64-linux nzelenkov$ ls -lh Docker.qcow2
-rw-r--r-- 1 nzelenkov staff 73G May 27 13:21 Docker.qcow2

Same issue… both the log files and the .qcow2 file grow out of control.

Is there a way to roll back to the beta prior to beta13? I only began experiencing the issue after the last update.

+1. Started with the latest Beta 13. Would be great if we could roll back to 12 until the issue is fixed.

+1. Confirming what most other folks said, recent beta update seems to have caused this. Running on OS X 10.11.4. My system monitor indicates that docker is writing at about 43 MB/s pretty much the whole time it’s running.

@aleveille You have changed the scope of this ticket: the original issue was about the size of the image that contains the Docker layers being too small and not growing to the size of the physical disk. The issues you've directed here are about excessive logging.

Same here. Docker.qcow2 is now 132 GB and docker.log is 45 GB. Wasn’t having this problem prior to this update.

Oh darn, my bad. Thanks for letting me know. I had my eyes on michieldemey's answers and got confused, it seems.

Same issue. More than 100 GB taken up by the logs and Docker.qcow2. Relaunching does not help; the logs and the qcow2 grow by 100 MB per second. The problem appeared after the last beta update.

Same here:

I've got almost 400 GB of logs :hushed::

~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/log

-rw-r--r-- 1 azize staff 70M 26 mai 23:54 acpid.log
-rw-r--r-- 1 azize staff 5,6G 26 mai 23:55 dmesg
-rw------- 1 azize staff 103G 27 mai 00:08 docker.log
-rw-r--r-- 1 azize staff 75K 27 mai 00:08 messages
-rw-r--r-- 1 azize staff 804K 26 mai 14:15 messages.0
-rw-r--r-- 1 azize staff 187K 26 mai 14:30 proxy-vsockd.log
-rw-r--r-- 1 azize staff 244G 26 mai 14:30 vsudd.log
-rw-r--r-- 1 azize staff 0B 26 avr 14:14 wtmp

Information

OS X: version 10.11.5 (build: 15F34)
Docker.app: version v1.11.1-beta13
Running diagnostic tests:
[OK] Moby booted
[OK] driver.amd64-linux
[OK] vmnetd
[OK] osxfs
[OK] db
[OK] slirp
[OK] menubar
[OK] environment
[OK] Docker
[OK] VT-x

Same here.

The Version:

Works-MacBook-Pro-4:/ soup$ docker version
Client:
Version: 1.11.1
API version: 1.23
Go version: go1.5.4
Git commit: 5604cbe
Built: Wed Apr 27 00:34:20 2016
OS/Arch: darwin/amd64

Server:
Version: 1.11.1
API version: 1.23
Go version: go1.5.4
Git commit: 8b63c77
Built: Mon May 23 20:50:37 2016
OS/Arch: linux/amd64

The files:
Works-MacBook-Pro-4:/ soup$ ls -lah ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/log/
total 721702816
drwxr-xr-x 10 soup staff 340B May 26 11:11 .
drwxr-xr-x 11 soup staff 374B May 27 15:13 ..
-rw-r--r-- 1 soup staff 946M May 27 15:13 acpid.log
-rw-r--r-- 1 soup staff 120G May 27 15:13 dmesg
-rw------- 1 soup staff 181G May 27 15:16 docker.log
-rw-r--r-- 1 soup staff 184K May 27 15:22 messages
-rw-r--r-- 1 soup staff 1.1M May 26 14:11 messages.0
-rw-r--r-- 1 soup staff 1.4M May 27 15:13 proxy-vsockd.log
-rw-r--r-- 1 soup staff 42G May 27 15:13 vsudd.log
-rw-r--r-- 1 soup staff 0B May 11 13:39 wtmp

Works-MacBook-Pro-4:/ soup$ ls -lah ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/
total 130324392
drwxr-xr-x 11 soup staff 374B May 27 15:13 .
drwxr-xr-x 18 soup staff 612B May 27 15:13 ..
-rw-r--r-- 1 soup staff 62G May 27 15:17 Docker.qcow2
-rw-r--r-- 1 soup staff 64K May 27 15:13 console-ring
-rw-r--r-- 1 soup staff 3B May 27 15:13 hypervisor.pid
-rw-r--r-- 1 soup staff 0B May 11 13:39 lock
drwxr-xr-x 10 soup staff 340B May 26 11:11 log
-rw-r--r-- 1 soup staff 17B May 27 15:13 mac.0
-rw-r--r-- 1 soup staff 36B May 11 13:39 nic1.uuid
-rw-r--r-- 1 soup staff 3B May 27 15:13 pid
lrwxr-xr-x 1 soup staff 12B May 27 15:13 tty -> /dev/ttys000

If you came here about your log files filling the hard-disk, a problem specific to Docker for Mac beta 13, please refer to this thread instead:

1.11.1-beta13 here. Not seeing the log file problem, but still seeing the Docker.qcow2 problem.

The Docker.qcow2 file almost instantly grows to 60 GB when pulling a single image (with a fresh, new, completely blank system).
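In case it helps with comparisons, a couple of read-only commands can show how much of the qcow2 is actually allocated versus its virtual size. This is only a diagnostic sketch, not a fix; it assumes the default data path and that qemu-img is installed (e.g. via Homebrew):

# Inspect the disk image without modifying it (default Docker for Mac path assumed).
IMG=~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2
du -h "$IMG"          # blocks actually used on the host filesystem
qemu-img info "$IMG"  # reports the virtual size and the allocated "disk size"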

Ouch, I haven’t even created or deployed a container and this ran away with all of my free disk space within about a week. Steps to reproduce:

  1. Install Docker for Mac Beta Version 1.11.1-beta13 (build: 7975)
  2. Provide an administrative password to complete setup.
  3. Wonder why the fan is constantly running until the system runs out of space.

The major offenders appear to be the following files:

60G ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2
70G ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/log/dmesg
31G ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/log/docker.log
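For anyone wanting to check the same locations on their own machine, a quick size listing might look like the following sketch (default data path assumed; sizes are in kilobytes so the output sorts numerically):

# Largest items last; the default Docker for Mac beta data directory is assumed.
DATA=~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux
du -sk "$DATA"/* "$DATA"/log/* | sort -n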

I assumed the fan was related to corporate antivirus or my VMware virtual machines and didn't even think to check the Docker beta I'd just installed. The moment I ran out of disk space and VMware notified me my virtual machines were stopping, I rushed to track down the culprit.

Hi @bbeaudoin,

Docker Mac Beta Version 1.11.1-beta13.1 (build: 8193) fixes the log size issue.

HTH,
Alexandre

I’m not terribly thrilled about the qcow2 file size either given I don’t have any running containers. Does this version fix that as well?

Mine is stable at 1.1GB (empty, no containers) and then gradually increases as I add containers.

Thank you. I've blown away the qcow2 archive and restarted, and I've got a watch on the directory so I can see if it explodes again. Presently at 700 MB, much better than 60 GB.

Update: Holding steady at 1.1G (1154678784 bytes) just as expected :grinning:
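In case anyone else wants to keep a similar watch, a minimal polling loop along these lines works (default data path assumed; the one-minute interval is arbitrary):

#!/usr/bin/env bash
# Print the size of the Docker for Mac data directory once a minute
# so runaway growth shows up quickly (default path assumed).
DATA=~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux
while true; do
  date
  du -sh "$DATA" "$DATA"/Docker.qcow2 "$DATA"/log
  sleep 60
done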

Glad to see it’s not just me!

proxy-vsockd.log = 48.89 GB
Docker.qcow2 = 25.82 GB

This is without even using Docker in the week since I last deleted these files. What is going on?!

I was wondering if there were any updates on this. Right now my only fix to recover the free space is to wipe the Docker.qcow2 and let the app re-create it.
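For reference, the reset described above amounts to something like the sketch below. It is destructive: removing Docker.qcow2 discards all images, containers, and volumes, and the paths assume the default beta install:

# DESTRUCTIVE: deletes all images, containers and volumes.
osascript -e 'quit app "Docker"'   # stop Docker for Mac first
rm ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2
open -a Docker                     # relaunch; a fresh Docker.qcow2 is created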

Three questions I have:

1. Is there a way to specify a size for the qcow file larger than 64 GB? That's too small for my use cases, and I usually run out of space after a few weeks.
2. My previous pattern to recover free space was this script:

#!/usr/bin/env bash
# remove untagged images
docker stop $(docker ps -aq)
docker rm $(docker ps -aq)
docker rmi $(docker images --filter dangling=true -q)
# remove unused volumes
# remove stopped + exited containers; I skip Exit 0 as I have old scripts using data containers.
docker rm -v $(docker ps -a | grep "Exit [1-255]" | awk '{ print $1 }')
Is there a better way of doing this, or is deleting the image file and resetting the only valid solution at the moment?

3. Is there a way to SSH into the VM that's running the Linux kernel? Or is that abstracted away?
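On question 3: as far as I know the VM doesn't expose SSH, but one approach that has worked on these betas is attaching to its serial console through the tty symlink visible in the directory listing earlier in this thread (default path assumed):

# Attach to the Moby VM's serial console; detach with Ctrl-A then D.
screen ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/tty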