Greetings,
It’s recently come to my attention that Java/Node processes running inside containers are not releasing deleted log files.
That is, if you run lsof | grep -i 'deleted'
it shows log files still held open by processes inside the container.
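For anyone unfamiliar with that lsof output, here is a minimal sketch (file name is just illustrative) of what the "(deleted)" state means: an open file descriptor keeps the inode alive even after the directory entry is removed.

```python
import os

# Hold a file open, unlink it, then look at the descriptor via /proc —
# the same information lsof surfaces with its "(deleted)" marker.
# ("/tmp/held.log" is an illustrative name, not one of the actual logs.)
f = open("/tmp/held.log", "w")
f.write("some log data\n")
f.flush()
os.unlink("/tmp/held.log")  # removed from the directory, inode kept alive
link = os.readlink(f"/proc/self/fd/{f.fileno()}")
print(link)  # -> "/tmp/held.log (deleted)"
f.close()
```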
This is happening for both Java and Node.js processes, which use logback and winston respectively for logging and log rotation. Both libraries perform their own rotation; no external program is involved.
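To illustrate where such a leak could in principle come from, here is a rough sketch of rename-based self-rotation (this is NOT the actual logback or winston code, just an assumed simplified model): the writer must close and reopen its own handle when it rotates, otherwise the old descriptor keeps the rotated-away inode alive.

```python
import os

class RotatingWriter:
    """Toy rename-based log rotation (illustrative only)."""

    def __init__(self, path, max_bytes):
        self.path = path
        self.max_bytes = max_bytes
        self.f = open(path, "a")

    def write(self, line):
        self.f.write(line + "\n")
        self.f.flush()
        if self.f.tell() >= self.max_bytes:
            # Crucial step: close the old handle so the inode can be freed.
            # Skipping this close is exactly how a "(deleted)" entry lingers.
            self.f.close()
            os.replace(self.path, self.path + ".1")
            self.f = open(self.path, "a")

w = RotatingWriter("/tmp/demo.log", max_bytes=64)
for i in range(20):
    w.write(f"message {i}")
```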
This stood out because, upon inspection, the disk space actually in use did not match the reported disk usage.
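Assuming the mismatch was between per-file accounting (as with du) and filesystem-level usage (as with df), this follows directly from unlink semantics — a quick sketch (file name illustrative):

```python
import os

# An unlinked-but-open file no longer appears in any directory, so
# per-file tools stop counting it — but its blocks stay allocated
# until the last descriptor closes, so filesystem usage still shows it.
f = open("/tmp/space_demo.log", "w")
f.write("x" * 1024 * 1024)  # ~1 MiB of "log" data
f.flush()
os.unlink("/tmp/space_demo.log")
st = os.fstat(f.fileno())
print(st.st_nlink)  # 0 -> no directory entry left
print(st.st_size)   # 1048576 -> space still in use
f.close()           # only now is the space actually freed
```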
Since then, the same setup has been tested outside of a container, where the problem does not occur.
The log files are created in a directory inside the container, which is also mounted so they can be seen outside the container.
Restarting the container releases these files, though of course it shouldn’t come to that in the first place.
Googling for this only turns up questions about Docker’s own logs.
Can anyone shed some light on this? What could be causing these files not to be released?
Observed on:
Ubuntu 16/18/20
Docker 18/19/20
All installs use overlay2