Docker Postgres - unable to backup database

Steps taken:

  1. $ docker run --name pgsql-0 -e POSTGRES_PASSWORD=password -d -p 5432:5432 postgres:alpine

  2. $ docker exec -it pgsql-0 psql -U postgres

Everything worked perfectly, I created my database and my tables. No problem, until I decided to backup my database.

I tried using pgAdmin to backup my database. (Right click on the database and there’s a backup option).

That was where I got stuck. I ran into errors such as ‘no such file or directory’, ‘permission denied’, etc.

In fact, the path below didn’t even exist when I tried searching for it, or maybe I didn’t know how. P.S. I used PuTTY to access the Linux Oracle VM (VirtualBox).

Sorry, I’m very new to all this. It’s my first time using Docker, my first time exploring a Linux OS, and I’m not very computer literate. Please help :frowning:

When you run Postgres inside a Docker container, it stores its data in something called a volume. The path in your screenshot is that of a volume. Volumes are meant to be accessed only from containers; in your case, you can only access it from the container pgsql-0. You need to be logged in as root if you want to access it from the Linux VM directly.

Here’s an alternative method for backing up your database. Use the following command:

docker exec pgsql-0 pg_dump -U postgres <database_name> > backup.sql

This will run the pg_dump command “inside” the container called pgsql-0, and store the results in a file called backup.sql. You can restore the backup using the command:

docker exec -i <postgres_container_name> psql -U postgres -d <database_name> < backup.sql
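If a plain-text .sql file gets unwieldy, pg_dump also offers a compressed custom format. A sketch with the same placeholder names as above; on the restore side, `pg_restore` takes the place of `psql`:

```shell
# Custom-format dump: compressed, and restorable selectively with pg_restore
docker exec pgsql-0 pg_dump -U postgres -Fc <database_name> > backup.dump

# Restore it with pg_restore instead of psql (the target database must already exist)
docker exec -i <postgres_container_name> pg_restore -U postgres -d <database_name> < backup.dump
```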

Thank you so much for your help!
I wish to access it from my Linux VM directly. How should I do it?

I cannot find this directory or the binary path, etc.
It’s probably not that simple. Hehe.
I should just tell you my objective. In the end, I want to copy the sql file to another machine. Sort of like, transferring my database. Any better way to do it?

The better way is to back up and restore, as I showed you above. If you run the command below on your Linux VM, you’ll get a backup.sql file in the same directory, which you can simply copy over and import into another PostgreSQL installation.

docker exec pgsql-0 pg_dump -U postgres <database_name> > backup.sql

I’ve tried that before. That was where I got all the permission issues.

$ docker exec pgsql-0 pg_dump -U postgres claudbase > backup.sql
bash: backup.sql: Permission denied

Are you able to solve this?

Sounds like you issued the command in a directory where your logged-in user doesn’t have write permissions. So the docker exec command worked, but the final step of storing its output in the .sql file didn’t. Try this:

cd
docker exec pgsql-0 pg_dump -U postgres claudbase > backup.sql

The cd command with no arguments changes to the home directory of the currently logged-in user, where you (almost) certainly have write permissions. After that, the backup.sql file should be created successfully.
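To double-check afterwards, from that same home directory (same file name as above):

```shell
ls -l backup.sql      # the file should exist with a non-zero size
head -n 5 backup.sql  # a pg_dump file begins with comment lines such as "-- PostgreSQL database dump"
```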


Oh my goodness. I think it worked!
How do I make sure that it’s there?
Where is this file saved? In the end, I need to transfer it to another machine.

It is saved in your Linux VM, in the “home” directory of the logged-in user. To find out which directory that is, type pwd and press Enter (it stands for “print working directory”). To verify that the backup.sql file exists, type ls -l and press Enter. This will show you a list of files in the current directory.

Apologies if you already know all this: you mentioned that you are new to Linux, so I have included a little more detail than I would otherwise.

Once you have verified that the file is present, you have many options to transfer it. Since you already use PuTTY, the simplest option is PuTTY SCP.
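For example, from a Windows Command Prompt (the user name, VM address, and destination folder below are placeholders; `pscp.exe` is installed alongside PuTTY):

```shell
pscp user1@192.168.56.101:/home/user1/backup.sql C:\backups\backup.sql
```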

Please do not apologize. I am very grateful for your help. Yes, I am new to Linux. :frowning_face:
I did my homework a little so I roughly know what you said.
Also, I got this,

Here’s the part where I got stuck for days. I can’t find the correct path, whether it’s for the binary, data, etc. At most, I managed to find the path of my volume, which is …


Your situation is a little more complicated than it would have been if you were using just Linux, instead of Linux and Docker. Docker containers are a way of running applications such that each application thinks it is running on a machine by itself. That way, two applications cannot interfere with each other’s data or settings. To make this possible, docker performs all kinds of tricks. As a result of these tricks, accessing container files becomes, as I said, a little more complicated.

The path that you saw is controlled by docker itself, and is protected by permission settings. You are not supposed to manipulate that by hand; docker makes it available to the container via a process called mounting. If you really, really want to see that directory, use the command:

sudo ls -l /var/lib/docker

The sudo command will run the ls command with elevated permissions, which are required to see that directory.

But as you saw with the backup method above, there are other ways to get data in and out of a docker container. Using those is preferred.
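One such way is `docker cp`, which copies a file between a container and the host without touching the volume directly. A sketch, assuming the dump is first written to a file inside the container:

```shell
# Write the dump to a temporary file inside the container...
docker exec pgsql-0 sh -c 'pg_dump -U postgres claudbase > /tmp/backup.sql'

# ...then copy it from the container to the current directory on the host
docker cp pgsql-0:/tmp/backup.sql ./backup.sql
```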

Finally, when creating the container itself, you can specify the directory you want the container to use. Assuming you have a directory called /home/user1/mydata, the command to create the container looks something like this:

docker run -d \
    --name pgsql-1 \
    -e POSTGRES_PASSWORD=password \
    --publish 5432:5432 \
    --mount type=bind,target=/var/lib/postgresql/data,source=/home/user1/mydata \
    postgres:alpine
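After creating it, you can confirm the bind mount took effect with `docker inspect` (pgsql-1 as named above):

```shell
docker inspect --format '{{ json .Mounts }}' pgsql-1
```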

Correct me if I’m wrong.
When I entered,

$ docker run --name pgsql-0 -e POSTGRES_PASSWORD=password -d -p 5432:5432 postgres:alpine

Since I did not specify my directory, Docker automatically created a volume for me (where it’s solely managed by Docker). Thus, I cannot do anything to it.

So, I need to bind/mount it elsewhere?

Does it mean that for my current situation, there’s no way for me to copy that backup.sql file to another machine? I need to create a new postgres container with an accessible directory specified?

Sorry if I am not making any sense. I’m not even sure if I know what I’m talking about. Haha.

Since I did not specify my directory, Docker automatically created a volume for me (where it’s solely managed by Docker). Thus, I cannot do anything to it.

This is correct.

Now, your intention is to transfer your data to another machine. For that, you do not need access to that volume. The docker exec command we tried earlier makes postgres create a copy of the data in your database. This copy is stored in the file called backup.sql, which is available to you in the directory where you issued the command.

If you gained access to the volume, you would not see a .sql file there. Instead, you would see a number of files that the Postgres server maintains for its own purposes. Copying those files elsewhere is risky; instead, you should export your data to a .sql file, as we have done.

My last suggestion, the one involving bind/mount, would allow you to tell docker that it should use a directory that you specify, instead of creating a volume. In hindsight, I should not have suggested that at all. The better way to transfer postgres data is export/import.

Sounds like I am almost there!

I am trying to look for the file.
I am actually using Windows 10, running docker on Linux VM.

docker exec pgsql-0 pg_dump -U postgres claudbase > backup.sql

So when I entered the lines above, I should be at my home directory.
Where would that be on my windows? Sorry, this is probably a ridiculous question. xD

This might help.

If you used PuTTY to log in to the Linux VM, and then issued the command, then the file is inside your Linux VM. We will need to get it out of that into your Windows.

From your last screenshot, it looks like you have installed the Docker Toolbox. If you gave the command from that command line, then the file is already in your Windows. Type pwd to find which directory it is in.

If the file is inside your Linux VM, use PuTTY SCP to copy it to your Windows, at the location of your choice.


Thank you!
Thank you!
Thank you!

I found the sql file and I was able to import it too!
Actually, I didn’t need PuTTY at all. There wasn’t any need to access my Linux VM. But I’ve learnt a heck of a lot. Totally worth it.

Thank you so much! You were extremely helpful (and insanely patient).
Thanks again! shakes your hand vigorously

My pleasure. All the best.


I encountered this error. I’m trying to import the database (backup.sql) to another server using pgAdmin.

How do I solve this?

I think it has something to do with the version. Probably not compatible.
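One way to check that theory is to compare the Postgres versions on both sides. Assuming the original container is still running (names from earlier in the thread), these commands print the server and tool versions, which you can compare against the target server:

```shell
# Version of the Postgres server that produced the dump
docker exec pgsql-0 psql -U postgres -tc 'SELECT version();'

# Version of the pg_dump tool inside the container
docker exec pgsql-0 pg_dump --version
```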