Docker Swarm only serving requests from the node they are sent to

Using Amazon EC2, I have created a three-node Docker Swarm.

I then created a service and scaled it to 8 instances.
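For reference, a minimal sketch of how such a service might be created and scaled (the service name `dev` and image `aw2:dev` are taken from the task list below; the published port is an assumption):

```shell
# Create the service with a published port. Publishing in the default
# "ingress" mode is what attaches the service to the swarm routing mesh,
# so any node can accept and forward requests for the service.
docker service create \
  --name dev \
  --publish published=80,target=80 \
  aw2:dev

# Scale the service up to 8 replicas; the scheduler spreads the tasks
# across the available nodes.
docker service scale dev=8
```

Note that if the port were published with `mode=host` instead of the default ingress mode, each node would serve only its own local tasks, which matches the behavior described below.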

So now there are:
3 instances on the master node
3 instances on worker node 1
2 instances on worker node 2

Abbreviated list:

ID NAME IMAGE NODE DESIRED STATE CURRENT STATE
etpmx dev.1 aw2:dev ip-1219 Running Running 18 hours ago
dliho dev.2 aw2:dev ip-143 Running Running 17 hours ago
8rq1v dev.3 aw2:dev ip-17 Running Running 17 hours ago
d8dyi dev.4 aw2:dev ip-17 Running Running 17 hours ago
8cc1k dev.5 aw2:dev ip-143 Running Running 17 hours ago
cwiuv dev.6 aw2:dev ip-1219 Running Running 17 hours ago
2rhln dev.7 aw2:dev ip-1219 Running Running 17 hours ago
epxqt dev.8 aw2:dev ip-17 Running Running 17 hours ago

Whether I send requests manually or use an external load-testing tool, the requests are distributed in a round-robin fashion only across the instances on the node the requests are sent to.

For example, sending multiple requests to the master node's IP address, the requests are served only by the three instances on the master node.

Sending requests to worker node 1's IP address, the requests are all serviced by the three instances of the service on that node.

Same with node 2.
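The manual test described above can be sketched as follows (the node IP is a placeholder and the port is an assumption; with the ingress routing mesh working, responses would be expected to rotate across tasks on all nodes, not just the tasks local to the targeted node):

```shell
# Placeholder public IP of one swarm node (documentation address).
NODE_IP=203.0.113.10

# Send several requests to a single node and observe which task
# (container) answers each one.
for i in $(seq 1 9); do
  curl -s "http://${NODE_IP}:80/"
  echo
done
```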

Am I misunderstanding something? I thought the purpose of the swarm was to distribute requests across all nodes.