Unable to find existing allocation for shared memory segment to unmap

My computer recently updated to macOS Sonoma 14.3.1, and since then I have not been able to visualize my OpenFOAM cases using the “paraFoam” command. I started on Docker Desktop 4.27.0 and thought it might be an issue with that version, as discussed here: Docker for Macos Sonoma 14.3.1 · Issue #7183 · docker/for-mac · GitHub. To try to resolve the issue, I reverted to Docker Desktop 4.26.1, but I am still getting the same error, pasted below:

5edb65479128: gFHRcore_simplifiedMeshv4>> paraFoam
Created temporary 'gFHRcore_simplifiedMeshv4.OpenFOAM'
QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-openfoam'
assertion failed [rem_idx != -1]: Unable to find existing allocation for shared memory segment to unmap
(VMAllocationTracker.cpp:745 remove_shared_mem)
Trace/breakpoint trap

I have not been able to find many resources online about this issue and would really appreciate some guidance on solving it.

Were you able to fix it? I am facing the exact same issue now.

No resolution here. What version of macOS and Docker Desktop are you running?

I’m running Sonoma 14.3.1 as well. I have tried several versions of Docker Desktop (4.21, 4.26, 4.27), and all of them hit the same issue. Not sure if it is related to my M3 chip.

I am facing the same issue.
Everything stopped working after upgrading to Sonoma 14.3 a few weeks ago.
I also tried downgrading Docker Desktop, which did not help.
Colleagues with Intel Macs do not have this problem.

Same issue here, macOS 14.3.1, M1 chipset, Docker Desktop 4.27.2

Same issue here, also on 14.3.1 and M1 chipset :frowning:

Was anyone able to solve this? I am facing the same issue on macOS 14.3.1 with an M2 chip when trying to run GTKWave with X11 forwarding through XQuartz in an amd64 Ubuntu image. It worked perfectly fine before the software upgrade.
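
For anyone unfamiliar with that setup, an X11-forwarded run of an amd64 image through XQuartz typically looks roughly like the commands below; the image name and display address are generic examples, not the exact command from this post:

xhost +localhost                         # allow local X11 clients (XQuartz must allow network clients)
docker run --rm -it --platform linux/amd64 \
  -e DISPLAY=host.docker.internal:0 \
  ubuntu:22.04 bash                      # install and launch gtkwave inside the container

As the posts above show, the same assertion appears with very different images, so the exact image does not seem to matter.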

I have a workaround that works for me: create an empty dummy file titled “anything.foam”, download ParaView locally, and open the .foam case directly in ParaView.
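
A minimal sketch of that workaround, assuming the case directory is accessible on the Mac and ParaView has been installed locally (the path and file name below are just examples):

cd /path/to/your/case            # the OpenFOAM case directory on the host
touch anything.foam              # empty marker file so ParaView recognizes the case
open -a ParaView anything.foam   # open the case in the locally installed ParaView

ParaView’s built-in OpenFOAM reader then loads the case on the host, so nothing graphical has to run inside the container.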

Where can I find the file, and where can I get ParaView?

I’m facing the same issue. Does anyone know how to solve it?

Hello, I also have the same issue here:

Apple M3 Max, 36 GB RAM
macOS Sonoma 14.4.1
Docker Desktop 4.29.0

docker run -it --name ifx -h ifx --privileged \
  -p 9088:9088 -p 9089:9089 -p 27017:27017 -p 27018:27018 -p 27883:27883 \
  -e LICENSE=accept icr.io/informix/informix-developer-database:latest

I get this output:

2024-04-12 15:15:27 [2024-04-12T13:15:27Z] >>>        WL cmd: java  -jar '/opt/ibm/informix'/bin/jsonListener.jar  -config /opt/ibm/informix/etc/wl_rest.properties  -config /opt/ibm/informix/etc/wl_mongo.properties  -config /opt/ibm/informix/etc/wl_mqtt.properties  -logfile /opt/ibm/informix/etc/json_listener_logging.log -loglevel info -start & 
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>    [COMPLETED]
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>    Execute init-startup scripts
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>    [COMPLETED]
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>    Informix is not online - Exit
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>     PIDS = 1241
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>    Clean up Java PID: 1241
2024-04-12 15:15:28 [2024-04-12T13:15:28Z] >>>     PIDS = 1241
2024-04-12 15:15:42 assertion failed [rem_idx != -1]: Unable to find existing allocation for shared memory segment to unmap
2024-04-12 15:15:42 (VMAllocationTracker.cpp:745 remove_shared_mem)
2024-04-12 15:15:42 [2024-04-12T13:15:42Z] >>>    INFORMIX_CLEAN param1: 
2024-04-12 15:15:42 argv[1]: (0x7fffffdcfe13) /opt/ibm/scripts/informix_entry.sh
2024-04-12 15:15:42 argv[2]: (0x7fffffdcfe36) /opt/ibm/scripts/informix_stop.sh
2024-04-12 15:15:42 Shutdown Signal Received 10:
2024-04-12 15:15:42  

Hey guys, I finally resolved the issue by increasing the virtual disk limit by about one bar, and that fixed the problem for me.
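
(For anyone looking for that setting: it lives in Docker Desktop under Settings → Resources, where recent versions show it as the “Virtual disk limit” slider; the exact label may vary between versions.)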

Increasing the virtual disk limit by one bar (from 64 GB to 128 GB) did not fix the issue for me. Was anyone else able to find any other fixes for this?