I want to create a network of containers in which one central container can ssh into all the others. I know it's not advised to ssh from one container to another, and that volumes can be used for data sharing, but in my production network there is a script which sshes into some systems and does things. So I want to simulate the same environment with containers, so that I can test new features before adding them to the production network.
So what's the question?
Sure, it can be done by starting sshd in the containers. Are these purely ssh-only containers, or will the containers run some other service?
Purely ssh-only containers. @terpz Can you please help me do it? I ran two containers with ssh running on both. I am able to ssh from the host to both containers, but I am not able to ssh from one container to another. I have tried many different ways but failed.
The Dockerfile I am using is:
FROM ubuntu
RUN apt-get update && \
    apt-get install -y netcat ssh iputils-ping
In the container I am running the following commands:
root@d0b0e44f7517:/# mkdir /var/run/sshd
root@d0b0e44f7517:/# chmod 0755 /var/run/sshd
root@d0b0e44f7517:/# useradd --create-home --shell /bin/bash --groups sudo u2
root@d0b0e44f7517:/# passwd u2
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
I made two containers, both the same except one has user u1 and the other has user u2 as shown above. After this, I tried to ssh from the host to a container using the command ssh -X u2@localhost -p 32773 (32773 is the host port mapped to the container's port 22). So ssh works from host to container, but I am not able to ssh from one container to another container.
I also have this problem. In our team we have a few servers and one central server which sshes into all the other servers and monitors them. So whenever I want to make a change, I have to test it on the production network, which is dangerous. I want to make a network of containers where one of them sshes into all the others and monitors them or makes changes using Ansible. For that, ssh is mandatory. So @terpz, waiting for your answer.
I will answer Monday, I'm on a trip until then.
But there are many ssh-only containers on Docker Hub, or are your requirements different?
I only want to ssh from one container to another, and both containers should use an Ubuntu base image. @tandeldipak posted a Dockerfile which builds an Ubuntu image with ssh and basic networking tools; that can be used to make the containers, and those containers should be able to ssh into one another. @terpz enjoy your trip, we will discuss on Monday.
Every container is isolated, so you need to join them somehow. The old way was to use links, but nowadays the way to do it is to create a shared network between the containers.
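As a sketch without Compose, assuming the Dockerfile below is in the current directory (the image tag `sshtest`, network name `sshnet`, and container names are arbitrary choices, not from this thread):

```shell
# Build the image from the Dockerfile (tag name is an example)
docker build -t sshtest .

# Create a user-defined bridge network; containers attached to it
# can reach each other by container name via Docker's embedded DNS
docker network create sshnet

# Start two containers on that shared network
docker run -d --name container1 --network sshnet sshtest
docker run -d --name container2 --network sshnet sshtest

# From inside container1, the name "container2" now resolves
docker exec container1 ping -c 1 container2
```

This is what docker-compose automates: it creates the network and attaches the services to it for you.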
With this Dockerfile (openssl is installed explicitly since the useradd line uses it to hash the password):
FROM ubuntu
RUN apt-get update && \
    apt-get install -y netcat ssh iputils-ping openssl && \
    mkdir /var/run/sshd && \
    chmod 0755 /var/run/sshd && \
    useradd -p $(openssl passwd -1 u2password) --create-home --shell /bin/bash --groups sudo u2
CMD ["/usr/sbin/sshd", "-D"]
And this docker-compose.yml:
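A minimal version, with the service names container1/container2 and the sshtest network inferred from the compose output below, could look like this:

```yaml
version: "2"
services:
  container1:
    build: .
    networks:
      - sshtest
  container2:
    build: .
    networks:
      - sshtest
networks:
  sshtest:
```

Compose prefixes the network and container names with the project (directory) name, which is why the output below shows sshtest_sshtest and sshtest_container1_1.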
You can run this command if you have docker-compose installed (otherwise see https://docs.docker.com/compose/install/):
› docker-compose.exe up -d
Creating network "sshtest_sshtest" with the default driver
Creating sshtest_container1_1 …
Creating sshtest_container1_1 … done
Creating sshtest_container2_1 … done
Then if you check "docker ps" you will see that there are now 2 containers.
Connect to one of them and you should be able to ssh to container1/container2 with the login u2 / u2password.
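For example, using the container names from the compose output above (the service name doubles as the hostname on the shared network):

```shell
# Open a shell in the first container
docker exec -it sshtest_container1_1 bash

# From inside it, ssh to the other service by its compose service name,
# entering u2password when prompted
ssh u2@container2
```

The same works in the other direction (ssh u2@container1 from the second container), which is what your monitoring script would rely on.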