Hi, I'm a Pop!_OS user. I'm not sure I'm in the right place, but I'm having trouble running something.
I chose Pop!_OS over regular Ubuntu because I hoped the video drivers for my GPU would work better for gaming, programming, and science. I have an AMD GPU.
I am trying to run Ollama in a Docker container so that it uses the GPU, and it absolutely won't work. By contrast, I am able to run GPT4All with its Vulkan drivers and text generation is fast.
However, running Ollama in Docker is useful for various programming and experimental applications, so I'd like to get it working. I have no idea what I'm doing wrong, because there are so many different guides on what to do. I installed the amdgpu .deb package from the AMD website (which added a repo), and then installed the latest ROCm and amdgpu drivers. I'm also running an LLM ROCm Docker image from AMD, although I'm not sure whether that's helping or even needed, and I don't really understand what it does.
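In case it helps diagnose things, here's the sanity check I'd run on the host first, assuming the `rocminfo` and `rocm-smi` tools from the ROCm packages are installed, to confirm ROCm itself can see the iGPU before involving Docker at all:

```shell
# Check that the ROCm runtime enumerates the GPU on the host.
# The Radeon 780M should show up as an agent named gfx1103.
rocminfo | grep -i gfx

# Confirm the GPU device nodes exist that Docker will need to
# pass through, and that the current user can access them
# (usually via video/render group membership).
ls -l /dev/kfd /dev/dri
groups | grep -E 'video|render'
```

If `rocminfo` doesn't list a gfx agent on the bare host, the container has no chance of seeing the GPU either.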
When I run Docker with ollama/ollama:rocm, it indicates it doesn't recognize my graphics card.
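For reference, this is the shape of the command I've been trying, taken from the Ollama Docker docs for AMD GPUs (the `HSA_OVERRIDE_GFX_VERSION` line is a workaround I've seen suggested for iGPUs like the 780M/gfx1103 that aren't on ROCm's officially supported list; I'm not certain it's the right value for my chip):

```shell
# Ollama's documented ROCm invocation: pass through the ROCm
# device nodes so the container can see the GPU.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# Suggested workaround for unsupported RDNA3 iGPUs (unverified):
# add  -e HSA_OVERRIDE_GFX_VERSION=11.0.2  to the command above
# to make the ROCm runtime treat the 780M as a supported part.
```

Even with the device flags, it still falls back to CPU for me.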
I’m not sure where to get help.
Logs:
(Error, can’t post logs since apparently they have links in the logs. What an annoying rule for new users trying to post logs.)
I have an AMD® Ryzen 7 8840U w/ Radeon 780M Graphics × 16, with AMD® Radeon integrated graphics.
I could add an external GPU at some point, but that's expensive and a hassle; I'd rather not if I can get this to work. The speed in GPT4All with the Vulkan driver is acceptable. Ollama is clearly generating on the CPU, judging by the slow output: roughly 1/10th the speed of Vulkan generation in GPT4All.
That is not an issue, and that description is not about Docker. I have never tried AMD GPUs with Docker. You can find a guide for Nvidia here: Enable GPU support | Docker Docs