Swarm port conflict with worker not being reported

I created a swarm with 3 nodes: host01 (manager), host02 (worker), host03 (worker). I started a standalone nginx container on host02, publishing port 80 on the host. I’m able to access the nginx welcome page by running these 3 commands on host02:

  1. curl localhost:80

  2. curl

  3. curl
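For context, a minimal sketch of how the standalone container was started (the container name and image tag are assumptions; only "publishing port 80" is stated above):

```shell
# On host02: run a standalone nginx container, binding host port 80
# to container port 80 (image and --name are illustrative assumptions)
docker run -d --name standalone-nginx -p 80:80 nginx
```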

Next I created a swarm service called nginx with 6 replicas, publishing the service on port 80 through the default swarm routing mesh. In this case the service should be accessible on port 80 of every node in the swarm. Since port 80 on host02 was already in use, I expected either a warning or for the containers to be scheduled only on nodes where port 80 was free.
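A rough sketch of the service creation described above, assuming default ingress (routing mesh) publishing:

```shell
# On the manager (host01): create the service with 6 replicas,
# publishing port 80 via the ingress routing mesh (the default mode)
docker service create \
  --name nginx \
  --replicas 6 \
  --publish published=80,target=80 \
  nginx
```

With the default `mode=ingress`, the published port is handled by the swarm on every node, which is why no per-node host-port bind (and hence no bind conflict) occurs.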

However, no warning was reported, and swarm containers were scheduled on host02 (where port 80 was already published).

Here is some more information about each node:
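The listeners on port 80 on each node can be inspected with something like the following (this is how I compared which process owns the port on each host):

```shell
# On each node: list TCP listeners on port 80 and the owning process
# (shows dockerd vs docker-proxy per node)
sudo ss -ltnp | grep ':80 '
```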

Now when I run the same 3 commands given above on host02, only the curl localhost:80 request is routed to the standalone nginx container; the other two curl requests are routed to the swarm containers.

My questions:

1. Why does swarm not report a port conflict on host02?

2. Why is curl localhost:80 routed to the standalone nginx container, while the other 2 requests are routed to swarm containers?

3. Why is port 80 on host01 and host03 published by the process dockerd, whereas on host02 it is published by the process docker-proxy?

4. If docker-proxy is responsible for routing traffic (from host IP:port to container IP:port) for standalone containers, how is the traffic on host02 being routed to the swarm containers?

OS/Version: I’m using the Katacoda lab environment - OS: Ubuntu 16.04
Docker Version: Docker version 18.09.7, build 2d0083d

Thank You