Install pip packages in the image, on the host, or both?

As far as I know, the common practice in Python development with Docker is to have:

# Install pip requirements
COPY requirements.txt .
RUN python -m pip install -r requirements.txt

in my Dockerfile. This way the packages are installed while building the image. But VSCode doesn't recognize them and reports a warning:


So I always just run pip install -r requirements.txt again on the host. It works as expected... except it weakens the benefit of Docker. After all, one of the reasons to use Docker is not having to worry about how to build all the dependencies on the host, right? For a package with binary dependencies, e.g. numpy, I now need to know how to build it on both Alpine (the Docker image) and macOS (the host). And it takes up twice the disk space too.

Is there a way to install the dependencies once, while still getting autocompletion from VSCode (or another editor/IDE)?

You might want to take a look at Remote - Containers - Visual Studio Marketplace

It connects the VS Code UI on your desktop with the context of the container. You basically work inside the container, but render the UI on your desktop.
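As a rough sketch, the extension is configured with a .devcontainer/devcontainer.json file in your project. Assuming your existing Dockerfile sits in the project root, something like the following should reuse it (the "python-dev" name is arbitrary, and "extensions" is the older top-level form of the property; newer schema versions nest it under "customizations.vscode"):

```json
// .devcontainer/devcontainer.json -- tells Remote - Containers how to
// build and connect to the development container (JSONC, comments allowed)
{
    "name": "python-dev",

    // Build the container from the same Dockerfile you already use,
    // so pip packages are installed exactly once, in the image.
    "build": {
        "dockerfile": "../Dockerfile"
    },

    // Install the Python extension *inside* the container, so
    // IntelliSense resolves the packages installed in the image.
    "extensions": ["ms-python.python"]
}
```

With this in place, running "Remote-Containers: Reopen in Container" from the command palette builds the image and reopens VS Code inside it, so the imports resolve without a second pip install on the host.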