I have a Docker service for a non-web app running, scaled to 5 containers on my host. I want to balance the incoming load across those containers. The service doesn't expose an API endpoint or work through REST calls; it's a server listening on ports 2321 and 2322, and it receives requests from other, similar services via shell commands. How should I load balance this service? I tried HAProxy, but my configuration only handled incoming requests on port 80 over HTTP. Is there a way?
Load balancing in Docker Swarm is done by the ingress routing mesh, which operates at L4: it routes each incoming connection to one of the service's containers, on whichever node it is running, in round-robin fashion. Kubernetes has a different facility for this (Services), but the effect is similar.
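For illustration, a minimal Swarm stack file for the setup described in the question might look like the sketch below (the image name is a placeholder). Ports published this way go through the ingress routing mesh, so TCP connections to 2321/2322 on the host are spread across the replicas without any HTTP involvement:

```yaml
# docker-stack.yml — deploy with: docker stack deploy -c docker-stack.yml myapp
# Requires swarm mode (docker swarm init), not plain docker-compose.
services:
  app:
    image: my-tcp-app        # placeholder: your service's image
    ports:
      - "2321:2321"          # published via the ingress routing mesh
      - "2322:2322"
    deploy:
      replicas: 5            # connections are balanced across these replicas
```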
I am currently using only docker-compose, on a single host. My containerized application listens on ports 2321 and 2322 and accepts requests via shell commands. How should I spread the load across 10 containers?
Docker Compose by itself does not have that capability.
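One workaround on a single host is to put HAProxy in front of the scaled containers: HAProxy can balance raw TCP, not only HTTP, via `mode tcp`. A rough sketch, assuming the Compose service is named `app` (a placeholder) and using Docker's embedded DNS at 127.0.0.11 to discover the scaled replicas:

```
# haproxy.cfg — TCP load balancing across docker-compose replicas
defaults
    mode tcp                      # layer-4 balancing, no HTTP parsing
    timeout connect 5s
    timeout client  1m
    timeout server  1m

resolvers docker
    nameserver dns1 127.0.0.11:53 # Docker's embedded DNS

frontend ft_2321
    bind *:2321
    default_backend bk_2321

backend bk_2321
    balance roundrobin
    # Up to 10 backends resolved from the "app" service name (placeholder)
    server-template app 10 app:2321 check resolvers docker init-addr libc,none
```

You would run HAProxy as another service in the same Compose file (publishing 2321/2322 on the host), repeat the frontend/backend pair for port 2322, and scale the app with `docker compose up --scale app=10`.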