New to swarm - strange behaviors need explanation

I’m experimenting with Swarm on my Ubuntu 16.04 laptop (the host), where I set up two nodes: one docker-host on DigitalOcean, and one boot2docker VM (vbox1) running in VirtualBox on the same laptop.

After starting the Swarm manager on the host and running the swarm join command on the two nodes, ‘docker ps’ on the laptop and in an SSH session on the DigitalOcean host shows the same two containers:

bfzhou@erdbeer:~$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
373c2f1b05d3 swarm "/swarm manage token:" 3 hours ago Up 20 minutes 0.0.0.0:32768->2375/tcp silly_noether
48848154c8b3 swarm "/swarm join --addr=1" 3 hours ago Up 20 minutes 2375/tcp focused_ramanujan

But on the VirtualBox node, docker ps shows only the swarm agent container itself:
docker@vbox1:~$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
5fdd5432cf77 swarm "/swarm join --addr=1" 3 hours ago Up 7 seconds 2375/tcp loving_euclid

Why the difference? And why does the swarm manager container show up on the DigitalOcean node at all? I didn’t start a Swarm manager there.
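For reference, here is a rough reconstruction of the commands I used to set this up (TOKEN and NODE_IP are placeholders for my actual cluster token and node addresses):

```shell
# On the laptop (host): create a cluster token and start the manager,
# publishing its port 2375 on a random host port (32768 in my case)
docker run --rm swarm create                        # prints TOKEN
docker run -d -P swarm manage token://TOKEN

# On each node (the DigitalOcean droplet and vbox1): join the cluster,
# advertising the node's own Docker daemon address
docker run -d swarm join --addr=NODE_IP:2375 token://TOKEN
```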

On the host itself, if I point the docker client at the Swarm manager by exporting DOCKER_HOST to 127.0.0.1:32768, the docker command stops working and complains:

bfzhou@erdbeer:~$ docker ps
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
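For completeness, this is how I pointed the client at the manager (the port 32768 matches the 0.0.0.0:32768->2375/tcp mapping shown in the docker ps output above):

```shell
# Point this shell's docker client at the Swarm manager endpoint
export DOCKER_HOST=tcp://127.0.0.1:32768

# From here on, every docker command in this shell should go to the
# manager instead of the local daemon
docker ps
```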

Setting DOCKER_HOST seems to cause both the swarm manager and the agent container to exit on the host. But why the latter? I didn’t start an agent on the host; I started it on the DigitalOcean node.

What did I miss in the setup above?

Also, docker ps in another terminal session on the host doesn’t show the running containers. Why is this?

Thanks much for any insight!