This is a basic question that I didn’t see covered in the docs or in forum searches. I have a development machine with all kinds of machine-learning tooling installed: Anaconda, Python, etc. I want to containerize all of my development using Docker. Would it be good practice to uninstall everything of importance on the machine, since everything is going to be running in containers? I guess my main question is whether my docker build can or will use anything from my local host machine when building the image. Sorry if this is a simple question — I’m new to this.
@amohap2 you don’t have to clean up your computer to get ready for Docker. Containers include everything they need when they are downloaded and run, so whatever is installed on your host machine doesn’t affect them. And if/when you build your own images, you decide exactly what gets copied from the host into the image. Most of the time you add commands to a Dockerfile (which determines how the image is built) that download and install the necessary tools from package repos or directly from Internet sources.
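As an illustration (not your exact setup — the base image and the `requirements.txt` file are assumptions), a minimal Dockerfile for a Python/ML environment might look like this:

```dockerfile
# Minimal sketch of an ML development image.
FROM python:3.11-slim

WORKDIR /app

# The only host files that enter the image are the ones you COPY in
COPY requirements.txt .

# Tools are downloaded from the Internet during the build,
# not taken from whatever happens to be installed on the host
RUN pip install --no-cache-dir -r requirements.txt

CMD ["python"]
```

Nothing from your host’s Anaconda or system Python ends up in this image; the build starts from the `FROM` base image and only adds what the Dockerfile explicitly says.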
So fear not and worry not: there’s no need for a huge cleanup operation before starting. Even after you have built many images of your own, you can push them to Docker Hub or other external storage, do a clean install of the machine, then pull them back and continue like nothing happened.
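That backup/restore workflow can be sketched with the standard Docker CLI — the image name `myuser/ml-dev` is a placeholder for your own repository:

```shell
# Tag a local image and push it to a registry (e.g. Docker Hub)
docker tag ml-dev myuser/ml-dev:latest
docker push myuser/ml-dev:latest

# ...later, after a clean reinstall of the machine...
docker pull myuser/ml-dev:latest
docker run -it myuser/ml-dev:latest
```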
That’s the beauty of Docker.
```
./dpdeploy.sh destroy --all
```

This ensures that no containers are left running. If you still see any, kill them with `docker kill`.

Then go to Initialize DataPlane and re-run the original DataPlane deployment commands, starting with `./dpdeploy.sh init --all`.
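The check-and-kill step between `destroy` and `init` can be sketched like this (a plain Docker CLI sequence, assuming nothing else on the host should be running in containers at this point):

```shell
# After `./dpdeploy.sh destroy --all`, verify nothing is still up
docker ps

# If anything is left, force-kill all running containers.
# The guard avoids calling `docker kill` with no arguments.
running=$(docker ps -q)
if [ -n "$running" ]; then
  docker kill $running
fi
```

Once `docker ps` shows an empty list, it is safe to run `./dpdeploy.sh init --all` and redeploy.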