Centralized Ollama AI Models Using TrueNAS [YouTube Release]

Additional Resources:

Forum post

Learn how to optimize your Ollama AI deployments by storing your models on shared NFS storage using TrueNAS. In this step-by-step video, we walk through configuring NFS on TrueNAS and setting up Ollama to use a centralized model directory. This approach is perfect for running Ollama across multiple machines or containers, helping you avoid redundant downloads and keep everything in sync—ideal for both homelab enthusiasts and business environments.
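As an illustration of the approach described above, here is a minimal sketch of mounting a TrueNAS NFS share and pointing Ollama at it. The server IP, dataset path, and mount point are assumptions for your own environment; `OLLAMA_MODELS` is the environment variable Ollama reads to locate its model directory, and `/root/.ollama` is where the official Docker image stores models.

```shell
# Mount the TrueNAS NFS share (hypothetical server IP and dataset path)
sudo mkdir -p /mnt/ollama-models
sudo mount -t nfs 192.168.1.100:/mnt/tank/ollama /mnt/ollama-models

# Bare-metal install: point the Ollama service at the shared directory,
# then restart so it picks up the change
sudo systemctl edit ollama   # add: Environment="OLLAMA_MODELS=/mnt/ollama-models"
sudo systemctl restart ollama

# Docker workflow: bind-mount the share over the container's model path
docker run -d --name ollama \
  -v /mnt/ollama-models:/root/.ollama \
  -p 11434:11434 ollama/ollama
```

To make the mount survive reboots, add a matching NFS entry to `/etc/fstab` on each machine that shares the model directory.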


Connect With Us

Lawrence Systems Shirts and Swag

►👕 Lawrence Systems

AFFILIATES & REFERRAL LINKS

Amazon Affiliate Store
:shopping_cart: Lawrence Systems's Amazon Page

UniFi Affiliate Link
:shopping_cart: Ubiquiti Store

All Of Our Affiliates help us out and can get you discounts!
:shopping_cart: Partners We Love – Lawrence Systems

Gear we use on Kit
:shopping_cart: Kit

Use OfferCode LTSERVICES to get 10% off your order at
:shopping_cart: Tech Supply Direct - Premium Refurbished Servers & Workstations at Unbeatable Prices

Digital Ocean Offer Code
:shopping_cart: DigitalOcean | Cloud Infrastructure for Developers

HostiFi UniFi Cloud Hosting Service
:shopping_cart: HostiFi - Launch UniFi, UISP and Omada in the Cloud

Protect your privacy with a VPN from Private Internet Access
:shopping_cart: https://www.privateinternetaccess.com/pages/buy-vpn/LRNSYS

Patreon
:moneybag: https://www.patreon.com/lawrencesystems

Chapters
00:00 Centralized Ollama AI Models Using TrueNAS
01:10 Using in Docker
01:41 TrueNAS NFS setup
04:14 Configuring The NFS Share Mount For The Ollama Server
06:40 Using The Ollama Models From Multiple Machines At The Same Time
