Docker Community Forums

Share and learn in the Docker community.

Issues after upgrading Docker Desktop

I'm getting errors when starting my containers; it seems like files can't be accessed in multiple containers.
Using Linux containers on Windows.
Using docker-compose; the relevant fragments of my compose file:
```yaml
context: .
dockerfile: conf/docker/Dockerfile.frontend
- .env
command: yarn watch-js
- .:/usr/src/app
# ensure node_modules aren't shared with host system
- /usr/src/app/node_modules
- network1
```
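For context, the fragments above look like pieces of a single Compose service definition. A reconstruction might look like the following (the service name `js` is an assumption; everything else comes from the fragments):

```yaml
services:
  js:                # service name is an assumption
    build:
      context: .
      dockerfile: conf/docker/Dockerfile.frontend
    env_file:
      - .env
    command: yarn watch-js
    volumes:
      - .:/usr/src/app
      # ensure node_modules aren't shared with host system
      - /usr/src/app/node_modules
    networks:
      - network1
```

The anonymous volume at `/usr/src/app/node_modules` masks that subdirectory of the bind mount, so the container keeps its own Linux-built modules instead of seeing the host's.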

```
api_1 | time="2020-03-14T14:20:29Z" level=debug msg="Loading cert into cert pool: ssl.crt"
api_1 | time="2020-03-14T14:20:29Z" level=warning msg="Error configuring submit connection" error="Failed to load the keyfile when configuring the Submit client: open ssl.crt: operation not permitted"

js_1 | fatal: failed to read object e6b8c916fa38ac66e0fa5a0cb41b65b047adba91: Operation not permitted
js_1 | /usr/src/app/node_modules/webpack-cli/bin/cli.js:93
js_1 |     throw err;
js_1 |     ^
js_1 |
js_1 | Error: Command failed: git describe --tags --always
js_1 | fatal: failed to read object e6b8c916fa38ac66e0fa5a0cb41b65b047adba91: Operation not permitted
js_1 |
js_1 |     at checkExecSyncError (child_process.js:629:11)
js_1 |     at execSync (child_process.js:666:13)
```
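Per the stack trace, the build invokes `git describe --tags --always` via `execSync`, and it is the underlying read of a `.git` object that fails. The same command can be reproduced in a throwaway repository (outside the broken mount, where it succeeds):

```shell
# Create a throwaway repo and run the same command the webpack build runs.
# "git describe --tags --always" falls back to the abbreviated commit hash
# when no tags exist, so it should print a short hex string.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=you@example.com -c user.name=tmp commit -q --allow-empty -m init
git describe --tags --always
```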


I am having a similar issue. I think it may be due to the maximum path length: short paths seem to work, but long ones, like those in the .git folder, seem to have an issue.

I confirmed that downgrading restores the functionality, so it's definitely related to the upgrade. Per the release notes, they changed something with caching and long file names:

  • Fixed cache invalidation and event injection in shared volumes with host paths longer than 260 characters.
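If you want to check whether a shared volume is affected by the 260-character threshold mentioned in that release note, a quick way (run from the root of the bind-mounted directory, e.g. in Git Bash or WSL; the threshold value is taken from the note above) is:

```shell
# Print the length and path of every file whose absolute path exceeds
# 260 characters; .git object paths are common offenders in deep checkouts.
find "$PWD" -type f | awk 'length($0) > 260 { print length($0), $0 }'
```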

In my case, I’m observing an odd behavior in my PHP containers:
I cannot include() any PHP files unless they are writable in Windows.

For example, I might see this error when testing something in my local LAMP stack:

**Warning**: require(/home/flimflam/www/php/links.php): failed to open stream: Operation not permitted in **/home/flimflam/www/php/fakeframe.php** on line **58**

… If I check out the file 'links.php' in Perforce (on the Windows machine running Docker Desktop), which marks it writable, then 'links.php' is properly included, and the next non-writable file fails to be included instead.

I will next explore downgrading to confirm it resolves the issue.

EDIT: Confirmed that merely downgrading resolves the problem.

Correct, I returned to the previous Docker Desktop, which changes Docker Engine back to 19.03.5, and everything is working again.
EDIT: I did not check Docker Engine separately, so it could be a problem there or with Docker Desktop itself.

I haven't been able to find a fix or workaround. How do we get support from Docker?

Not sure what you mean; all my files seem to be writable.

My node_modules folder doesn't exist in Windows; it's created in the container.

Files seem to not be synchronized between the host and containers in Docker for Windows. Switching back helps.

I have the same problem after the upgrade:

```
git diff --quiet HEAD
fatal: failed to read object 18dfe6c8059328fb640f1a4e440de49cc7bc970d: Operation not permitted
```

After downgrading there is no error; everything works fine.

Same issue here, also since the upgrade.

Same for me. Has anyone found a workaround?

```
fatal: failed to read object a59d90adb347658319feb1619fdfb41d4da3d74f: Operation not permitted
```

The only workaround so far is to downgrade. I don't know how to raise this as an issue with the Docker team, as I don't think there is an official support channel.

It's already in the issue tracker, and the Docker team is fixing it.

Same problem here; finally found this thread, and I am waiting for the fix…

I have a different issue:

```
docker rm -f $(docker ps -aq)
unknown shorthand flag: 'a' in -aq)
See 'docker rm --help'.
```

Reverting resolved it. Is this just me, or a real issue?
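Note that the trailing `)` in the error message suggests the shell never expanded `$(docker ps -aq)`: cmd.exe does not perform command substitution and passes the text through literally, so docker parses `-aq)` as a flag. In a POSIX shell the substitution happens before docker runs, which can be seen with a stand-in command (the container IDs below are hypothetical):

```shell
# Stand-in for "docker ps -aq": emit two fake container IDs, one per line.
ids=$(printf '%s\n' abc123 def456)
# In a POSIX shell the IDs arrive as separate arguments to the outer command:
echo docker rm -f $ids   # → docker rm -f abc123 def456
```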

Of course, YMMV and perhaps you’re seeing a different problem altogether.

On my system, the files on the host are located under a Perforce client mapping for source control. When a file has been submitted to source control, Perforce marks it read-only. As most of the PHP code for my website is not checked out for edits, most of the files are read-only in the Windows file system.
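The Perforce behavior described above can be mimicked without Perforce, which makes it easy to test whether the read-only bit alone triggers the failure on a bind-mounted file (the file is a hypothetical stand-in; `chmod` plays the role of Perforce's submit/edit marking):

```shell
# Mimic Perforce's marking: a submitted file is read-only; "p4 edit" restores
# the write bit. Toggling this on a bind-mounted file isolates the variable.
f=$(mktemp)
echo '<?php /* stub */' > "$f"
chmod a-w "$f"      # as after submit: no write permission on the host
ls -l "$f"          # write bits are now cleared
chmod u+w "$f"      # as after "p4 edit": writable again
rm -f "$f"
```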

I map the perforce client directory, which contains my website as well as its PHP code, into my PHP container, unchanged. Therefore, the files are mostly read-only.

After upgrading Docker Desktop, I noticed that when testing my LAMP stack locally, I'd get errors in PHP require/include operations, as listed above.

The weird thing is that checking them out for edit in Perforce on the host, which marks the files as writable there, "fixes" the include/require problems in the PHP running in the container, without rebuilding or restarting the container.

My conclusion is that, since the ONLY thing that changed w.r.t. the files is the read/write access on the host file system, there must be some issue with how file access to the mapped directories works in the new version.

Hi everyone, just chiming in to say that intvsteve’s solution worked for me. Thanks!

For anyone else who is experiencing this issue, this appears to be the GH ticket for it:
