I have a question: can I use nvidia-docker on my Windows machine?
I need to “download” a model using RunwayML.
Hey @mrtajniak, Brannon from RunwayML here.
Nvidia-docker, which we use to run models for Local GPU, doesn’t support Windows, so Local GPU is a Linux-only feature. Here is a link to get started if you have access to a Linux machine or a way to set one up.
Can I use nvidia-docker if I have the “Windows Subsystem for Linux” feature enabled?
Any update on this? Would it work via WSL2? NVIDIA itself says it should: https://docs.nvidia.com/cuda/wsl-user-guide/index.html#installing-nvidia-docker
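For anyone trying the WSL2 route, here is a minimal sketch (not RunwayML-specific) to check whether Docker can actually see your GPU once you have followed that guide. It assumes Docker and the NVIDIA container toolkit are already installed inside WSL2; the CUDA image tag below is just an example and can be swapped for any image that ships `nvidia-smi`.

```python
# Minimal check: can Docker (under WSL2) expose the GPU to a container?
# Runs `docker run --rm --gpus all <cuda image> nvidia-smi` and reports
# whether it exited successfully. Assumes Docker + NVIDIA container
# toolkit are set up per the guide above; the image tag is an example.
import subprocess

def gpu_visible_in_docker() -> bool:
    """Return True if nvidia-smi succeeds inside a GPU-enabled container."""
    cmd = [
        "docker", "run", "--rm", "--gpus", "all",
        "nvidia/cuda:12.2.0-base-ubuntu22.04",  # example image, adjust as needed
        "nvidia-smi",
    ]
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=300)
    except (FileNotFoundError, subprocess.TimeoutExpired):
        # docker not on PATH, or the pull/run took too long
        return False
    return result.returncode == 0

if __name__ == "__main__":
    print("GPU visible to Docker:", gpu_visible_in_docker())
```

If this prints `True`, nvidia-docker-style GPU containers are working in your WSL2 setup; whether RunwayML’s Local GPU feature officially supports that configuration is a separate question for the Runway team.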