Tasks on the worker node stopped running for no apparent reason

Hi all

I have a Docker Swarm cluster with load balancing, and until a few days ago it worked correctly. But since I restarted the manager node and the worker node a day ago, all of the replicated tasks run on the manager and none on the worker.

On the worker, docker ps -a shows the tasks with status "exited", but they never come back up. They only come back up on the manager.
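
For reference, this is roughly how I check where the tasks are placed (the service name `web` is just a placeholder for my real one):

```
# On the manager: list the nodes and see which node each task is running on.
# "web" is a placeholder service name.
docker node ls
docker service ps web

# On the worker: list the stopped task containers that show up as "exited".
docker ps -a --filter "status=exited"
```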

What could be causing this? I need help urgently.

[Screenshot from 2019-05-14 20-39-04]

[Screenshot from 2019-05-14 20-39-18]

[Screenshot from 2019-05-14 20-41-25]

[Screenshot from 2019-05-14 20-41-31]

Thank you

Hey,
so did you check that the worker node is still part of the swarm?
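
For example, on the manager (replace `<node-name>` with your worker's hostname):

```
# Confirm the worker is still listed and shows Ready / Active.
docker node ls

# Check its reported state and availability directly.
docker node inspect <node-name> --format '{{ .Status.State }} / {{ .Spec.Availability }}'
```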

Yes.
Running docker node ls shows that raspblanrok-slave is active in the swarm.

I noticed that when the worker boots after a restart, the Docker service is only in the "loaded" state and I have to start it manually.
I have run sudo systemctl enable docker and the service is now active, but the tasks still do not come back to the worker node; instead they stay on the manager, as shown in the screenshots above.
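
Concretely, this is more or less what I ran on the worker:

```
# Make sure the Docker daemon starts on boot and is running now.
sudo systemctl enable docker
sudo systemctl start docker

# Should report "active (running)" once the daemon is up.
systemctl status docker
```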