Cannot list files of dir with large number of files

Docker running on Windows. I have a dir mounted as a volume:

docker run --rm -it  --net="host" --user 1000:1000 --volume f:/vision:/vision 

f:/vision/sub/inside has around 95000 files.
Running ls there causes this error: ls: reading directory '.': Input/output error

Trying to read the full contents of the dir programmatically from PHP also fails.
However, individual files can be read and written, both from the PHP app and manually with vi /vision/sub/dir/README.txt.
It's only listing all files of the dir that fails.
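For reference, this is roughly how I'd test a streaming read, in case the failure only hits calls that buffer the whole listing (like scandir()). This is just a sketch: the path assumes f:/vision is mounted at /vision as in the run command above, and if the I/O error comes from the file-sharing layer itself it will probably fail the same way.

```php
<?php
// Path assumed from the volume mount in the original post.
$dir = '/vision/sub/inside';

// opendir()/readdir() stream entries one at a time instead of
// building the full array up front the way scandir() does, so this
// separates "too many entries in memory" from "the listing syscall
// itself errors out".
$handle = opendir($dir);
if ($handle === false) {
    die("Could not open $dir\n");
}

$count = 0;
while (($entry = readdir($handle)) !== false) {
    if ($entry === '.' || $entry === '..') {
        continue;
    }
    $count++;
}
closedir($handle);

echo "Read $count entries\n";
```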

Total size of files: 56 MB, on an SSD. Bumped RAM from 2 to 4 GB and CPUs from 1 to 4. 5+ GB free space on the host. Fast PC, Win 10 LTSC, Docker updated.
Edit: the Dockerfile is FROM php:7.3-cli
Any ideas?

Thanks!

ls normally reads the entire directory listing into memory so it can sort it, and only then displays the entries. ls -f might work, since it just reads and writes without sorting. I think find also processes directories as it reads them, so find . -type f -ls may work.

Having that many files in one dir is very inefficient. Create a subdir tree (perhaps keyed on hashes) so that each dir is limited to a few hundred entries (files or dirs). Even a 2-layer tree with 500 entries per level can hold 250,000 files.
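Something like this sketch, for example. The shardedPath() helper and the example filename are made up for illustration; it uses the first two levels of an md5 hash, so with two hex chars per level (256 buckets each) even 490k files average out to a handful per leaf dir:

```php
<?php
// Derive a sharded path from the file name so that no single
// directory accumulates tens of thousands of entries.
function shardedPath(string $baseDir, string $fileName): string
{
    $hash   = md5($fileName);       // deterministic and cheap
    $level1 = substr($hash, 0, 2);  // e.g. "a3"
    $level2 = substr($hash, 2, 2);  // e.g. "f7"
    return "$baseDir/$level1/$level2/$fileName";
}

$path = shardedPath('/vision/sub/inside', 'frame_000123.jpg');
@mkdir(dirname($path), 0775, true);          // create shard dirs on demand
file_put_contents($path, 'example payload'); // write lands in the sharded dir
```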

Unfortunately, changing the dir structure isn't my decision. ls -f has the same result.

Could running Docker with WSL 2 help? Windows LTSC doesn't support WSL 2, so I can only test after reinstalling Windows.

I've replicated this issue under both Windows and macOS. In my case there are about 490k files.

I agree it might be inefficient to have so many files in a folder (although running ls directly in a macOS terminal returns results in a few seconds, so the inefficiency is not horrible), but on the other hand this error is clearly a bug in Docker Desktop.

In my case I can work around it, since I own the code that writes the files, but it will cost me several days for something that should just work. Running the same code in a Docker container in a K8s cluster also works as expected.