
AMD Support


How Ollama interfaces with AMD GPUs

ROCm is a library for interacting with AMD GPUs, used mainly in machine learning and large language model workloads. Ollama requires this library to be GPU accelerated on AMD hardware.
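As a quick sanity check (a sketch assuming the ROCm tools are installed on the host and an AMD GPU is present), rocminfo lists the devices that ROCm can see; if your GPU does not appear here, Ollama will not be able to use it either:

# List ROCm-visible devices; your AMD GPU should show up with its marketing name
rocminfo | grep -i "marketing name"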

Installation

Flatpak

Flatpak doesn't provide the ROCm libraries by default, which is why a ROCm extension was created specifically for Alpaca. To install it, look for the extension in your software store on Alpaca's page, or use one of these commands:

# System installation
flatpak install com.jeffser.Alpaca.Plugins.AMD

# User installation
flatpak install --user com.jeffser.Alpaca.Plugins.AMD

If you are unsure whether you installed Alpaca as a system or user installation, you can check with the following command:

flatpak list --columns=app,installation | grep com.jeffser.Alpaca
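
After installing the extension, you can confirm that it is present with a similar command (assuming the plugin ID shown above):

# The AMD plugin should appear in the list if the extension installed correctly
flatpak list | grep com.jeffser.Alpaca.Plugins.AMD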

Snap

Snap support for ROCm should work by default without any problems. If that's not the case, please open an issue.

Arch Linux AUR Package (Not Official)

When installing Alpaca on Arch, Ollama can be installed like any other package. Arch provides multiple versions of Ollama in the AUR; just install the one that fits your system best, in this case ollama-rocm-git.
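
As a sketch of one way to do that (assuming you use the yay AUR helper; any AUR workflow works the same way):

# Build and install the ROCm-enabled Ollama package from the AUR
yay -S ollama-rocm-git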

NixPkgs (Not Official)

Please read the installation instructions; there you can learn how to select between ollama-cuda and ollama-rocm.
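
For example, a minimal sketch assuming a flakes-enabled Nix setup and the ollama-rocm attribute mentioned above:

# Install the ROCm-enabled Ollama build from nixpkgs
nix profile install nixpkgs#ollama-rocm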