Making a service connect and stay up through the docker-compose network

I get that conceptually, but I don’t see a clear path. I have found several ways to do this; the issue is that I don’t know which one to pick.

Methods to keep the host running really seem to fall into two categories:

  1. Call a job that will not stop on the Ubuntu container: the container just keeps running because a persistent program was started. Example A: write a useless log:

     CMD tail -f /dev/null

  2. Service in a service: the actual service could have some kind of HTTP listener installed on it, with ports exposed so that the http_listener on the LegacyApp connects to the nodeweb app. To do this I would need to do one of the following:

     Example A: install Node/Express on the Ubuntu container, configure it to listen on a port, then pass commands using child_process or something similar.

     Example B: create a microservice that boots inside the Ubuntu service from the C++ side (e.g. cpprestsdk) and configure an http_listener on an IP/TCP port.

None of these seems very clean or logical, and I am sure others have a similar setup. I was thinking Docker could automatically make one service connect to the other when it starts, using the docker-compose network (i.e. somehow declare this relationship in docker-compose):

  1. nodeweb starts and listens on its port, reachable internally at https://nodeweb:6000
  2. LegApp starts, compiles, and connects to nodeweb as a client waiting for commands, exposing http://LegApp:4000 for the private connection and automatically attaching to the process started by the service's CMD /bin/bash
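The relationship in the two steps above is mostly what docker-compose already provides: every service in one compose file joins a default network where it is reachable by its service name, and depends_on controls start order. A minimal sketch, assuming hypothetical build contexts and lowercase service names (which compose DNS favours); note that depends_on only orders container start, it does not wait for the app inside to be ready:

```yaml
# minimal sketch; build paths, ports, and service names are assumptions
services:
  nodeweb:
    build: ./nodeweb          # hypothetical build context
    ports:
      - "6000:6000"           # only needed if reachable from outside the compose network
  legapp:
    build: ./legapp           # hypothetical build context
    depends_on:
      - nodeweb               # start order only, not "ready" order
    expose:
      - "4000"                # reachable as http://legapp:4000 inside the network
    # keeps the container up if the compiled app exits or runs in the background
    command: ["tail", "-f", "/dev/null"]
```

Inside this network the legacy container can reach the web app at http://nodeweb:6000 without publishing that port to the host; a retry loop or a compose healthcheck covers the gap between "container started" and "listener actually accepting connections".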