Permission denied trying to SCP a file from a Jenkins Docker container to Apache on the host VM

I’m at a loss trying to set permissions so I can transfer a file from a Docker container to the Docker host. Any ideas?

I’m trying to configure a simple Jenkinsfile pipeline that takes a new commit from GitHub and then deploys an updated index.html from the Jenkins-cloned repo to Apache’s /public_html folder.

Setup is as follows:

  • Docker container running the jenkins/jenkins:lts image, with Jenkins run by the default jenkins user in the jenkins group
  • Docker hosted on a Lubuntu VM, which also has Apache installed, running as the default root user with the www-data group
  • I added my host user to www-data via: sudo usermod -a -G www-data hostuser
  • I set public_html to rwxrwxr-x via: sudo chmod 775 /var/www/tutorialJenkins_test/public_html
  • I set up SSH between the container and the host (the container stores the private key, generated inside the container as the jenkins user, and the host stores the public key in its ~/.ssh/authorized_keys file)
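For reference, the key setup went roughly like this. This is a sketch against a scratch directory so it runs anywhere; in the real setup the key pair lives in the jenkins user’s ~/.ssh inside the container, and the public half gets appended to /home/hostuser/.ssh/authorized_keys on the host:

```shell
# Generate a key pair as the jenkins user (sketched here in a temp
# directory; in the real setup this runs inside the container and the
# key lands in the jenkins user's ~/.ssh):
keydir=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -f "$keydir/id_ed25519" -q

# The public half is what gets appended to the host user's
# ~/.ssh/authorized_keys (which should be mode 600):
cat "$keydir/id_ed25519.pub"
```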

As a result:

  1. hostuser on the Lubuntu host can cp a file to public_html
  2. The jenkins user in the Docker container can scp a file to e.g. /home/hostuser/
  3. Although the Jenkins build is successful, I get “permission denied” when trying to get the jenkins user to scp to the public_html folder.

I assume setting chmod 777 on public_html is less than ideal. I tried adding a new user named jenkins to Apache’s www-data group, but since it’s a different user, UID, GID, and OS instance, this (as expected) did not work.
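One less drastic option than 777 (a sketch of the idea, not something I ran) would be group write plus the setgid bit on public_html. Demonstrated below on a scratch directory; on the real folder you’d also need sudo and a chgrp to www-data:

```shell
# Demonstration on a scratch directory (the real target would be
# /var/www/tutorialJenkins_test/public_html, with "sudo chgrp www-data"
# run first so the group is www-data):
site=$(mktemp -d)/public_html
mkdir "$site"

# 2775 = rwxrwsr-x: owner and group can write, and the setgid bit
# makes files created inside inherit the directory's group rather
# than the creating user's primary group
chmod 2775 "$site"

stat -c '%a %n' "$site"
```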

This is my Jenkinsfile:

    pipeline {
        agent any
        stages {
            stage("build") {
                steps {
                    script {
                        try {
                            // SSH copy index.html from the Jenkins-cloned repo in the Docker container to the Docker host:
                            sh 'scp index.html hostuser@[host_ip]:/var/www/tutorialJenkins_test/public_html'
                        } catch (err) {
                            echo "File copy to localhost failed: ${err}"
                            // handle exception..
                        } finally {
                            echo "Build stage complete.."
                        }
                    }
                }
            }
        }
    }
Your problem is really a Jenkins pipeline problem, not a Docker problem.
If your agent were anything other than a Docker container, you would still have the same issue.

It seems that the private key is missing or is not being used for the ssh/scp connection.
I will still give you a hint: you need this plugin: SSH Agent.
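With that plugin installed and the private key stored as a Jenkins credential, the scp step would be wrapped roughly like this (a sketch: "jenkins-host-key" is a placeholder credential ID, not something from your setup):

```groovy
// "jenkins-host-key" is a placeholder ID for the private key stored
// in Jenkins' credential store; the sshagent block injects it for
// the duration of the enclosed steps.
steps {
    sshagent(credentials: ['jenkins-host-key']) {
        sh 'scp index.html hostuser@[host_ip]:/var/www/tutorialJenkins_test/public_html'
    }
}
```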

Mate, sorry for raising a non-Docker issue in the Docker forum. You are correct.

Anyway, thanks for your input, as you helped me realize that my public_html was owned by root but needed to be owned by my user (the website owner).

I ran: chown -R [hostuser]:[hostuser group] [domain folder], after which Jenkins was able to access [domain folder]/public_html and drop the newly cloned index.html file into it (as I’d already configured SSH between the jenkins user and hostuser).


Judging by the pipeline, and it being just a permission issue, it sounds like you don’t use remote agents at all, or have them preconfigured. The SSH Agent plugin takes care of injecting the private key into the steps within the sshagent block, without requiring the private key to permanently remain on the remote agent.

Glad you found what causes the problem.