If you’re running multiple machines or VMs with Ollama and want to avoid redownloading large language models on each one, it’s incredibly handy to store all models on a shared NFS mount. This not only saves space but also speeds up setup and syncs model access across systems.
First, set up a dataset and NFS share in TrueNAS
Create a dataset for the Ollama models and an NFS share as outlined in the TrueNAS documentation.
SSH into TrueNAS (or use the shell in the web interface), create a directory called models inside that dataset, and make sure ownership is set to user 999 and group 996:
chown -R 999:996 models
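For example, if the dataset ends up mounted at /mnt/Rusty/ollama-models on TrueNAS (the path used later in this post; adjust it for your own pool and dataset names), the whole thing from the TrueNAS shell looks like:
cd /mnt/Rusty/ollama-models
mkdir models
chown -R 999:996 models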
Here’s how to mount an NFS share on Debian 12 and redirect Ollama to use it for model storage.
Step-by-Step: Move Ollama Model Storage to NFS
1. Install NFS support (if not already installed):
sudo apt update
sudo apt install nfs-common
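Optionally, confirm the TrueNAS box is actually exporting the share before you try to mount it (showmount ships with nfs-common; the IP is the example server used throughout this post):
showmount -e 172.16.16.4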
2. Create the local mount point:
sudo mkdir -p /mnt/ollama_models
3. Mount the NFS share:
Replace the IP and path with your actual NFS export.
sudo mount -t nfs 172.16.16.4:/mnt/Rusty/ollama-models /mnt/ollama_models
Check it mounted:
df -h | grep ollama_models
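It's also worth checking that the ownership you set on the TrueNAS side is what the client sees. With numeric output it should show UID 999 and GID 996 (or the matching names if they exist locally):
ls -ldn /mnt/ollama_models/models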
4. Make the mount persistent:
Edit /etc/fstab:
sudo vim /etc/fstab
Add this line:
172.16.16.4:/mnt/Rusty/ollama-models /mnt/ollama_models nfs defaults,_netdev 0 0
Then mount everything:
sudo mount -a
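You can confirm the fstab entry is picked up with findmnt:
findmnt /mnt/ollama_models
If the NFS server isn't guaranteed to be reachable at boot, an optional tweak is adding x-systemd.automount to the options so the share mounts on first access instead of blocking boot:
172.16.16.4:/mnt/Rusty/ollama-models /mnt/ollama_models nfs defaults,_netdev,x-systemd.automount 0 0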
5. Redirect Ollama to Use the Shared Model Path
By default, Ollama stores downloaded models in:
/usr/share/ollama/.ollama/models
If you already have model files downloaded, copy them to the NFS share first (see the example below).
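A quick way to do that, assuming the default install path and the mount point from above (cp -a preserves ownership and permissions):
sudo cp -a /usr/share/ollama/.ollama/models/. /mnt/ollama_models/models/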
To remove the existing models folder and create a symlink:
sudo systemctl stop ollama
sudo rm -rf /usr/share/ollama/.ollama/models
sudo ln -s /mnt/ollama_models/models /usr/share/ollama/.ollama/models
sudo systemctl start ollama
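Alternatively, Ollama supports the OLLAMA_MODELS environment variable, so instead of a symlink you can point the systemd service straight at the share (the ollama service user still needs access to that path). A minimal sketch, assuming the same mount point as above:
sudo systemctl edit ollama
Add this to the override that opens:
[Service]
Environment="OLLAMA_MODELS=/mnt/ollama_models/models"
Then restart the service:
sudo systemctl restart ollama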
Verify It’s Working
Run:
ollama list
You should see all your models listed, even though they’re now living on the NFS share.
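If the list comes back empty, double-check that the symlink resolves and that the share is readable:
ls -l /usr/share/ollama/.ollama/models
ls /mnt/ollama_models/models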
Docker: Using NFS for Ollama Model Volume
If you’re running Ollama in a Docker container:
Docker Run
docker run -d \
--name ollama \
-v /mnt/ollama_models/models:/root/.ollama/models \
-p 11434:11434 \
ollama/ollama
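To confirm the container actually sees the shared models, run the same check from inside it:
docker exec -it ollama ollama list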
Docker Compose
version: '3.8'
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - /mnt/ollama_models/models:/root/.ollama/models
    restart: unless-stopped
This mounts the models from the NFS share into the container.
Make sure your NFS share is mounted on the host first, using the same steps from above.
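Alternatively, if you'd rather not manage the mount on the Docker host at all, Docker's local volume driver can mount the NFS export itself. A sketch using the same server and paths as above; the NFS options may need tuning for your setup:
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_models:/root/.ollama/models
    restart: unless-stopped

volumes:
  ollama_models:
    driver: local
    driver_opts:
      type: nfs
      o: "addr=172.16.16.4,rw,nfsvers=4"
      device: ":/mnt/Rusty/ollama-models/models"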