External connection from Worker Nodes

Hi!
I have a Docker Swarm cluster with one manager and two worker nodes.
They are all in the same subnet.
I have a service with 2 replicas. The application in the container is trying to connect to an external DB cluster.
But it can only connect from the container on the 2nd worker node, and it can't from the 1st.
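Roughly how such a service gets deployed (a sketch; the service and image names here are placeholders, not the real ones):

[root@SwarmManager ~]# docker service create --name app --replicas 2 my-app-image:latest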

External DB cluster IP: 10.1.1.125

[root@SwarmNode2 ~]# docker exec 2b86571b5a10 traceroute 10.1.1.125
traceroute to 10.1.1.125 (10.1.1.125), 30 hops max, 46 byte packets
 1  172.18.0.1 (172.18.0.1)  0.011 ms  0.020 ms  0.009 ms
 2  10.1.1.125 (10.1.1.125)  0.887 ms !A  0.622 ms !A  0.542 ms !A

and

[root@SwarmNode1 ~]# docker exec -it 41ea1fbcc636  traceroute 10.1.1.125
traceroute to 10.1.1.125 (10.1.1.125), 30 hops max, 46 byte packets
 1  172.18.0.1 (172.18.0.1)  0.012 ms  0.008 ms  0.014 ms
 2  *  *  *
 3  *  *  *
 4  *  *  *

What am I missing?
The nodes have the same network settings and OS.
Thanks!

I don’t have the exact solution, but I am commenting in the hope that it will be helpful.

Does the connection from the host have the same result?

[root@SwarmNode2 ~]# traceroute 10.1.1.125
[root@SwarmNode1 ~]# traceroute 10.1.1.125

I think the traffic is being filtered by an access list or a firewall, since the trace shows ‘!A’, which in traceroute means the probe received an ICMP ‘communication administratively prohibited’ reply:

2  10.1.1.125 (10.1.1.125)  0.887 ms !A
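One way to narrow it down is to compare the firewall state on the two nodes; a quick sketch (the exact rules and chain contents will vary with the setup):

[root@SwarmNode1 ~]# systemctl status firewalld
[root@SwarmNode1 ~]# iptables -S FORWARD | grep -i REJECT
[root@SwarmNode1 ~]# iptables -t nat -S POSTROUTING | grep -i MASQUERADE

The first command shows whether firewalld is running, the second looks for REJECT rules that would answer probes with ‘!A’, and the third shows whether Docker’s MASQUERADE (NAT) rules for container traffic are in place. Running the same three commands on both nodes and comparing the output should show where they differ.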

Problem found!
The firewalld daemon was down, lol. After starting the service, everything works correctly.
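For anyone who runs into the same thing, the fix amounts to starting (and enabling) firewalld on the affected node. If containers still can't reach the outside afterwards, restarting Docker so it re-creates its iptables/NAT rules may also help; that last step is an assumption for this setup, not something confirmed above:

[root@SwarmNode1 ~]# systemctl start firewalld
[root@SwarmNode1 ~]# systemctl enable firewalld
[root@SwarmNode1 ~]# systemctl restart docker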
