In this GitHub repo, they describe how to make Ollama compatible with my video card.
When using the ollama/ollama:rocm Docker image, I don't know how to execute these commands inside the Docker container, or how to test whether they worked.
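For context, commands can be run inside an already-running container with `docker exec`. A minimal sketch, assuming the container was started with `--name ollama` (adjust the name to whatever `docker ps` shows for your container):

```shell
# Open an interactive shell inside the running container
docker exec -it ollama bash

# Or run a single command without opening a shell,
# e.g. list the models Ollama has pulled:
docker exec ollama ollama list
```

Note that anything changed this way is lost when the container is removed; settings that should persist belong in the `docker run` command (or compose file) instead.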
@rimelek @rimelek2 — rimelek or rimelek2, do you know? I wanted to watch your videos, but YouTube thinks I am a bot. Do you have any videos that are not on YouTube? I'm not sure how to force a command into a Docker image.
Nice that you found even my test account, but I don’t respond sooner just because I’m mentioned. Otherwise everyone would mention me hoping for a quicker response. I react when I have time and when I have anything to say about the topic. Like I did here:
As I wrote there, the description you found is not for Docker. AMD has documentation about using GPUs with Docker, which I shared in the other topic.
I am the person who posted this, but I lost the account email and password when I accidentally reset my computer.
There is a GitHub page showing how to get the AMD GPU working in Ollama: "Ollama could run the iGPU 780M of AMD Ryzen CPU at Linux base on ROCm. There only has a little extra settings than Radeon dGPU like RX7000 series."
I am trying to follow this guide, but I am running Ollama in Docker using ollama/ollama:rocm.
I am very new to Docker and don't have a strong coding background, but I'm still trying to figure this out.
I saw on this forum that there are ways to run commands within Docker itself, but it doesn't seem like Docker apps have terminals. I have a whale application that lets me see the images that are running in containers.
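One way to check on a container from outside, without needing a terminal inside it, is to read its logs. A sketch, assuming the container is named `ollama` (substitute the name or ID shown by `docker ps`):

```shell
# Show everything the container has printed so far
docker logs ollama

# Filter for GPU-related lines to see whether a GPU was detected
docker logs ollama 2>&1 | grep -i gpu
```

The `2>&1` is there because Ollama writes its log lines to stderr, which `grep` would otherwise not see.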
When I run ollama/ollama:rocm I get all sorts of errors. It runs, but it uses the CPU and is too slow. Ollama is used in a container as the backend for other Docker images on GitHub that I am trying to experiment with.
I know rimelek has videos on YouTube about adding commands to Docker. I could try watching those (YouTube did ban me at first), but my understanding of Docker is so low that even if I watched all of them, I might not understand how to approach this problem to get the best outcome.
Rather than trial and error, I thought it would be better to ask people more knowledgeable than me. There may not even be a solution. I am not that smart with computers and am still learning.
Sorry for finding your test account. I didn't know which was which. Sometimes I have more than one account if I forget my password and have to make a new one. I like computers but am slow with this stuff.
I am using Pop!_OS and kept getting errors when trying to install amdgpu-dkms.
I am at the part where I run the terminal command, and I still get errors:
docker run --device /dev/kfd --device /dev/dri --security-opt seccomp=unconfined ollama/ollama:rocm
time=2024-10-12T20:20:07.645Z level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"
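If this is the Ryzen 780M iGPU, guides like the one you linked typically rely on overriding the GFX version so ROCm treats the iGPU as a supported discrete GPU. With Docker, that override has to be passed into the container as an environment variable rather than run as a command inside it. A sketch, assuming the 780M can be spoofed as gfx1100 via `HSA_OVERRIDE_GFX_VERSION=11.0.0` (the value your guide recommends may differ):

```shell
# Run Ollama with ROCm, passing the GPU devices and the GFX override
# -v persists downloaded models; -p exposes the API to other containers
docker run -d \
  --device /dev/kfd --device /dev/dri \
  --security-opt seccomp=unconfined \
  -e HSA_OVERRIDE_GFX_VERSION=11.0.0 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```

Afterwards, `docker logs ollama` should show whether a compatible GPU was discovered instead of the error above.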
ROCm 6.2 was released, but this may not help.
I am just frustrated because GPT4All has the option to select the Vulkan AMD GPU, and the output is a decent speed; it's not super fast, but it's not slow.
Ollama in Docker using the CPU is slow, probably 3-4 tokens/sec.