Is there a way to copy the locally installed package into a Docker image so that when the Azure Function is deployed the package is copied across?

I have a private package I created that is hosted in a private Bitbucket repository. This repository is not accessible when I deploy to Azure.

The repository is only accessible when connected via VPN, and the URL is not publicly available.

Thanks for any and all advice.

I created a Docker image and tried to use the COPY instruction to copy the package into the image, but it keeps failing with the following error:

ERROR: failed to solve: failed to compute cache key: failed to calculate checksum of ref 4h34y7v4co0wxlbe157escvw2::v2546lxmycfvpvme2gdd57pew: "/azuretest/python_packages/lib/site-packages/testazureclient": not found Process exited with code 1
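A COPY along these lines triggers that error when the source path lies outside the build context (the source path here is the one from the error message; the destination is only illustrative):

COPY /azuretest/python_packages/lib/site-packages/testazureclient /home/site/wwwroot/.python_packages/lib/site-packages/testazureclient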

You cannot access anything outside the build context during image builds.

If you have a structure like this:

/home/me/myprojects/project
                       |- Dockerfile
                       |- other-files-that-make-up-your-code

And you run these commands:

cd /home/me/myprojects/project
docker build -t myimage:latest .

All files from . (= this folder and all of its content) will be copied into the build context (unless they are specifically ignored by an entry in a .dockerignore file) and will be available as sources for COPY instructions. Thus, in the example, everything in /home/me/myprojects/project can be accessed, but nothing outside that path.
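Given that, one way to make the private package available during the build is to build a wheel of it into the build context first, then COPY and install that wheel. A sketch, assuming the package is called testazureclient and its repository is reachable over SSH while you are on the VPN (adjust the repository URL and base image to your setup):

# While connected to the VPN, build a wheel of the private package
# into a folder inside the build context:
cd /home/me/myprojects/project
pip wheel git+ssh://git@bitbucket.org/yourteam/testazureclient.git -w ./wheels

Then in the Dockerfile:

# Example base image for a Python Azure Function; use whichever tag you target
FROM mcr.microsoft.com/azure-functions/python:4-python3.11

# The wheels folder now lives inside the build context, so COPY can see it
COPY wheels/ /tmp/wheels/
RUN pip install /tmp/wheels/*.whl

# Copy the rest of the function app
COPY . /home/site/wwwroot

Because the package is installed from a file baked into the image, the Bitbucket repository no longer needs to be reachable when the image is built in CI or when it runs in Azure.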