How to sync data of two containers running on different hosts

Hi, everyone.
I have two servers, one for testing and the other for production. There are two containers running on both servers: one is MySQL and the other is Moodle. Now that the developers have finished their work on the testing server, I want to move all the data to the production server containers. Please help.

Container details:

bitnami/moodle 3.11

mariadb

What does “all the data” mean? Where is this data now? (image? container? volume? something else?)

Thanks for your email. I want to move volumes.

I was confused for a second about “thanks for your email”; I guess you mean an email notification you received :slight_smile:

Are we talking about named volumes or bind mount volumes? Please share how the containers are created, so we can see which one it actually is.

This is how I created the moodle container on both the testing and production servers:

docker run -d --name moodle -p 80:8080 -p 443:8443 -v moodle-data:/bitnami/moodle --network moodle-network -e MOODLE_DATABASE_HOST=moodledb -e MOODLE_DATABASE_USER=moodle -e MOODLE_DATABASE_PASSWORD=xxxxxxxx -e MOODLE_DATABASE_NAME=moodle bitnami/moodle:latest

The developer completed the installation of plugins and themes on the testing server. Now I want to avoid doing that work again on the production server, and instead just sync or copy the volumes, or use any Docker technique you can suggest. Please help me with this.

The safest way is to use a disposable utility container that creates a tar archive of the volume on the source host, and another disposable utility container that extracts the tar archive into a volume on the target host.

Make sure to stop the service before you archive the data to prevent inconsistency.
Creating the archive and storing it on the host system can be done like this:

docker run -ti --rm  -v moodle-data:/data -v /data/backup:/backup alpine tar czvf /backup/data.tar.gz -C /data .

As a result, the tar file should be on your host in the /data/backup folder. You can adapt the approach to restore the data on the target host: if the volume doesn’t exist on the target host, create it, then restore the files into the volume with a utility container. Once this is done, start the container with the volume attached and you should be good.
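A rough sketch of what the restore side could look like, assuming the archive gets copied from the testing host to the production host first (the hostname production-host and the paths are placeholders; the container and volume names are taken from your docker run command above, so adjust everything to your setup):

# on the testing host: copy the archive to the production host (placeholder hostname/path)
scp /data/backup/data.tar.gz production-host:/data/backup/

# on the production host: stop the container so nothing writes to the volume during the restore
docker stop moodle

# create the named volume if it doesn't exist yet (a no-op if it already exists)
docker volume create moodle-data

# disposable utility container that extracts the archive into the volume
docker run -ti --rm -v moodle-data:/data -v /data/backup:/backup alpine tar xzvf /backup/data.tar.gz -C /data

# start the container again with the restored data
docker start moodle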

Further thoughts:
I must admit I find it concerning that there seems to be no team member who knows this kind of stuff. Right now it looks like the plan is to go into production with a technology that no team member has mastered. Frankly, this is a huge risk that should be mitigated by training the team members beforehand or by making a new hire.


As I am new to Docker, could you please explain your example in more detail and how it fits into my environment? I have the moodle:latest image. I tried your example, but /data/backup only contains a blank .gz file with no data in it. Please also send me the restore command for the destination server.

Any update on this, please?

Have you used the exact command as I wrote it in the code block, including the trailing . character?

Furthermore, you will have to adapt the approach to your needs, as I have no idea about your full setup/situation, and the details shared so far don’t allow me to be very specific.

The approach I described creates a utility container that:
a) maps the named volume moodle-data into the container folder /data
b) maps the local path /data/backup into the container folder /backup
c) uses the alpine image as the utility container
d) uses the tar command to create a tar.gz file /backup/data.tar.gz (which is in the path from b), uses the /data folder as the context (which is the path from a), and archives every file and folder in that context (which is the . at the end)

Please acquaint yourself with how to create and extract archives with tar.
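For instance, two tar invocations that may help when checking what the backup command above actually produced (the archive path is the one used above; /restore/target is just a placeholder):

# list the contents of the archive without extracting anything
tar tzvf /data/backup/data.tar.gz

# extract the archive into a target folder
tar xzvf /data/backup/data.tar.gz -C /restore/target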

Please accept my apologies for bothering you with questions. Docker Compose is used to run this container. Is the command below still usable for the backup?

docker run -ti --rm  -v moodle-data:/data -v /data/backup:/backup alpine tar czvf /backup/data.tar.gz -C /data .

Doesn’t matter.
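One thing I would double-check, though: unless the volume is declared as external in the compose file, compose usually prefixes named volumes with the project name, so the actual volume might be called something like myproject_moodle-data instead of moodle-data (myproject is just a placeholder for your project/folder name). You can see the real name with:

docker volume ls

and then use that name in the backup command.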

There is no need to apologize. I am just hesitant with my responses, as I find it extremely concerning when someone tries to go into production with a technology that no team member has the required skills for. Just making it “work” isn’t good enough, as the team needs the skills to solve operational issues.

May I suggest this free self-paced Docker training? Introduction to Containers
It provides a solid foundation in the concepts and how things are done in Docker. Make sure not to skip the hands-on exercises.
