This post is about an issue I am having with webpack, Vue.js, and the stable version of Docker for Windows.
It seems that large compiled files (more than 1 MB) are split into chunks when sent over to the virtual machine, and most of the time they end up corrupted (parts of other files seem to get injected between file chunks).
I came across a few reported bugs affecting both Windows and macOS; similar problems are apparently solved by changing the Apache configuration to
EnableSendfile Off and/or
EnableMMAP Off inside the Dockerfile:
RUN sed -i -e 's/EnableSendfile On/EnableSendfile Off/g' /etc/apache2/httpd.conf
RUN sed -i -e 's/EnableMMAP On/EnableMMAP Off/g' /etc/apache2/httpd.conf
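Before baking a sed edit like that into a Dockerfile, it is worth checking that the pattern actually matches something; here is a self-contained dry run (the sample file name is my own, standing in for the real httpd.conf):

```shell
# Dry-run the substitution on a sample file before putting it in a Dockerfile
printf 'EnableSendfile On\nEnableMMAP On\n' > httpd.conf.sample
sed -i -e 's/EnableSendfile On/EnableSendfile Off/g' httpd.conf.sample
sed -i -e 's/EnableMMAP On/EnableMMAP Off/g' httpd.conf.sample
cat httpd.conf.sample
```

If the directives never appear in the image's conf files in the first place, the sed lines silently do nothing, which may be one reason the change has no effect.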
This did not fix anything for me…
As you can see, without any code change, these are the errors that come up after each hard refresh, every time at a different line, because of wrongly merged file chunks:
Does anyone in the community have an idea why this keeps happening?
Came in search of a solution to this as well. Did you ever find one? I'm having the exact same issue you are, but for large images, PDFs, etc…
Until now, it was a minor annoyance - visual glitches and whatnot. But like you, it's started affecting compiled JS code, and it is breaking our dev environment. Would love a solution!
Hi, I've never been able to fix 100% of the file corruptions, but there is a little hack that may help you: I ended up writing a build.html file that would only load the large JS files. I would open this file in a new tab and hard-refresh until no syntax error showed in the console; a normal refresh (not a hard refresh) on the actual page you're working on would then load fine.
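For what it's worth, that build.html can be as small as a single script tag pointing at the problematic bundle; here is a sketch of generating one (the bundle path is a placeholder for whatever webpack emits in your setup):

```shell
# Generate a minimal build.html that only loads the large bundle
# (/dist/app.bundle.js is a placeholder path, adjust to your webpack output)
cat > build.html <<'EOF'
<!doctype html>
<script src="/dist/app.bundle.js"></script>
EOF
cat build.html
```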
Trying to hunt down a solution for this now… yeah, neither EnableMMAP Off nor EnableSendfile Off fixes it for me. It is, however, the same portion of the same files that gets corrupted for me every time… I confirmed this with a hex compare of the downloaded file (which, also strangely, was always the same byte size…)
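If anyone else wants to check whether their corruption is deterministic too, cmp pinpoints the first differing byte and cmp -l lists them all; here is a self-contained sketch, with two sample files standing in for the original and the corrupted download:

```shell
# Two sample files standing in for the original file and the corrupted download
printf 'aaaa bbbb cccc\n' > original.bin
printf 'aaaa bXbb cccc\n' > downloaded.bin
cmp original.bin downloaded.bin || true      # reports the first differing byte
cmp -l original.bin downloaded.bin || true   # lists offset, old byte, new byte (octal)
```

Run it twice on real captures of the same URL: if the listed offsets match each time, the corruption is deterministic, as it is in my case.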
On the left, the original file; on the right, the downloaded one. Only 0.000001% of the file changes, but that breaks it.
Correct me if I am wrong, but that still doesn't reduce the size of the compiled JS files, right?
I see that this topic is unanswered, as well as the related topic on SO.
Does anybody know a solution to this problem? It's very annoying and only affects large files. Adding the EnableSendfile and EnableMMAP options to Apache doesn't change anything.
I had a similar issue downloading static files from a Docker container whose volume was mapped to a folder mounted on its Linux host, i.e. a samba network drive mounted on the Linux host, with the Docker container running inside that host.
Downloading a text log file of around 1.5 MB (using wget from a different machine, or a browser) yields around 20 runs of corrupted bytes: some bytes missing, some extra bytes copied from elsewhere in the same file, and some plain binary garbage. The file size remains the same. Repeating the download gives different corrupted bytes each time.
wget from the Linux host itself returns a file identical to the original.
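Checksums make that comparison unambiguous; here is a self-contained sketch, where two local copies stand in for the file on the samba share and the wget download:

```shell
# Simulate an intact and a corrupted transfer, then compare checksums
printf 'log line 1\nlog line 2\n' > source.log
cp source.log fetched.log
sha256sum source.log fetched.log    # identical hashes: transfer was intact
printf 'X' >> fetched.log           # simulate a corrupted transfer
sha256sum source.log fetched.log    # hashes now differ: corruption detected
```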
I'm using the php:7.0-apache image, which seems to have no EnableSendfile On/Off setting in any conf file.
Adding EnableSendfile On in a conf file solved the issue: files now download binary-identical to the original on the samba network drive.
I don't know why this solved the issue. Hope this information helps.
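In case it helps anyone reproduce this, the change amounts to two Dockerfile lines; the conf file name is my own choice (not something the php:7.0-apache image ships), conf-enabled/ is the Debian apache2 layout, and apache2ctl configtest just validates the result:

```shell
RUN echo 'EnableSendfile On' > /etc/apache2/conf-enabled/sendfile.conf
RUN apache2ctl configtest
```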