Docker Community Forums


Cannot list files of dir with large number of files

Docker running on Windows. I have a directory mounted as a volume:

docker run --rm -it  --net="host" --user 1000:1000 --volume f:/vision:/vision 

f:/vision/sub/inside has around 95,000 files.
Running ls there causes this error: ls: reading directory '.': Input/output error

Trying to read all contents of the directory programmatically from PHP also fails.
However, individual files can be read and written, both from the PHP app and manually with vi /visionsub/dir/README.txt.
It's only listing all files of the directory that fails.

Total size of files: 56 MB, on an SSD. I bumped RAM from 2 to 4 GB and CPUs from 1 to 4. There is 5+ GB of free space on the host. Fast PC, Win 10 LTSC, Docker updated.
Edit: Dockerfile is FROM php:7.3-cli
Any ideas ?

Thanks!

ls normally reads the entire directory listing into memory so it can sort it, and only then displays anything. ls -f might work, as it just reads and writes entries as it goes. I think find also processes directories as it reads them in, so maybe find . -type f -ls will work too.
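A quick way to compare the two approaches on a throwaway directory (only 1,000 files here for speed; the original problem had ~95,000 — paths and counts are illustrative):

```shell
# Create a temporary directory filled with empty files.
tmp=$(mktemp -d)
for i in $(seq 1 1000); do : > "$tmp/file$i"; done

# Plain ls sorts the whole listing in memory before printing anything.
# -f disables sorting (and implies -a), streaming entries in directory order.
ls -f "$tmp" | wc -l     # 1000 files plus the . and .. entries

# find streams entries as it walks the tree, never holding the full list.
find "$tmp" -type f | wc -l

rm -rf "$tmp"
```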

Having that many files in one directory is very inefficient. Create a subdirectory tree (perhaps keyed on hashes of the filenames) so that each directory holds only a few hundred entries (files or dirs). Even a 2-layer tree with 500 entries per level can hold 250,000 files (500 × 500).
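One way to sketch that hashing scheme, assuming md5sum is available (the bucket_path helper and the 2-character bucket sizes are hypothetical, not anything from the original post):

```shell
# Hypothetical helper: map a filename to a 2-level bucket path derived from
# its md5 hash, spreading files across up to 256 x 256 subdirectories.
bucket_path() {
    name=$1
    # First four hex chars of the hash pick the two directory levels.
    hash=$(printf '%s' "$name" | md5sum | cut -c1-4)
    d1=$(printf '%s' "$hash" | cut -c1-2)
    d2=$(printf '%s' "$hash" | cut -c3-4)
    printf '%s/%s/%s\n' "$d1" "$d2" "$name"
}

# The mapping is deterministic, so lookups never need a directory scan:
bucket_path "README.txt"
```

Storing a file then becomes `mkdir -p` on the two bucket levels followed by a move, and reading it back just recomputes the same path from the name.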

Unfortunately, changing the directory structure isn't my decision. ls -f has the same result.