Question about Python Virtual Environment in Docker

Hello guys, I have some questions about how Python virtual environments (python -m venv) work in Docker containers.

Context

  1. On my local openSUSE machine, I have an existing virtual environment .venv under the workspace directory.
  2. I use VSCode’s Dev Containers to open the workspace in a TensorFlow container, and the workspace directory is mounted into the container.
  3. When I activate .venv from inside the container, Python does not use the packages in .venv; it seems to still be using the system-wide packages. I checked this by running pip3 list --local while .venv was activated, and the output was exactly the same as when .venv is not activated (see the diagnostic sketch after this list).
  4. When I create a new virtual environment under the workspace from inside the container, it works: Python uses the packages in that new virtual environment.
  5. The Python version is 3.10 on the local machine and 3.8 in the container.
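To see which environment is actually active, here is a small diagnostic sketch (assuming it is run inside the container with .venv activated):

```python
import sys

# Which interpreter is running, and which environment does it think it is in?
print(sys.executable)    # path of the interpreter that was launched
print(sys.prefix)        # root of the active environment
print(sys.base_prefix)   # root of the underlying base installation

# In a working venv, sys.prefix and sys.base_prefix differ. If they are
# equal, the venv was not picked up and Python resolves packages from the
# system-wide site-packages.
print(sys.prefix == sys.base_prefix)
```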

Question

So why is Python in the container still using the system-wide packages when the virtual environment (created on the local machine) is activated? Is this related to how Python is configured in the TensorFlow container?

Why do you expect Python inside the dev container to depend on the Python installation/configuration of your host? A container is an isolated process (which can have child processes) running in an isolated environment.

Most likely this is the default behavior inherited from the image. It could also be the dev container's configuration.
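One plausible mechanism (this is general venv behavior, not something specific to the TensorFlow image): a venv is tied to the interpreter that created it. Its pyvenv.cfg records the location and version of the host's Python, and .venv/bin/python is normally a symlink to that interpreter, so inside the container activation can silently fall through to the system Python. A sketch to check this from inside the container:

```python
import os
from pathlib import Path

# pyvenv.cfg records the interpreter the venv was created from, e.g.
#   home = /usr/bin
#   version = 3.10.x
# Those values describe the openSUSE host, not the container.
print(Path(".venv/pyvenv.cfg").read_text())

# bin/python is typically a symlink to that interpreter; if its target
# does not exist inside the container, `python` falls back to whatever
# is next on PATH, i.e. the system interpreter.
print(os.readlink(".venv/bin/python"))
print(os.path.exists(".venv/bin/python"))  # False if the link is broken
```

Even if the symlink resolves, the venv's packages live under .venv/lib/python3.10/site-packages, which a 3.8 interpreter will not search. That alone would explain why a fresh venv created inside the container (point 4) works while the host-created one does not.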

If the dev container allows configuring volumes, you could try to bind-mount the host folder responsible for the desired configuration into the container folder where the container expects it.
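For example, with VS Code Dev Containers a bind mount can be declared via the mounts property of devcontainer.json. A minimal sketch, assuming the stock tensorflow/tensorflow image and a container-side path of /workspace (both are placeholders to adjust):

```jsonc
// .devcontainer/devcontainer.json
{
  "image": "tensorflow/tensorflow:latest",
  "mounts": [
    // bind a host folder into the path where the container expects it
    "source=${localWorkspaceFolder}/.venv,target=/workspace/.venv,type=bind"
  ]
}
```

Note, though, that a venv created by the host's Python 3.10 records the host interpreter and keeps its packages under lib/python3.10, so the container's Python 3.8 still will not use it even when mounted; recreating the venv inside the container, as in point 4, is the more reliable fix.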
