Additional Resources:
Forum post
- Shop Micro Center’s Monitor Madness and Current Top Deals: https://micro.center/wuld
- Micro Center’s Networking Solutions: https://micro.center/ftp9
- Micro Center’s Priority Care+: https://micro.center/e8ze
- Sign-Up for Early Access to Micro Center Santa Clara: https://micro.center/t95m
Learn how to optimize your Ollama AI deployments by storing your models on shared NFS storage using TrueNAS. In this step-by-step video, we walk through configuring NFS on TrueNAS and setting up Ollama to use a centralized model directory. This approach is perfect for running Ollama across multiple machines or containers, helping you avoid redundant downloads and keep everything in sync—ideal for both homelab enthusiasts and business environments.
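As a rough sketch of the approach described above (the TrueNAS hostname, export path, and mount point below are illustrative examples, not taken from the video):

```shell
# Mount the TrueNAS NFS share that holds the shared Ollama models.
# "truenas.local" and "/mnt/tank/ollama-models" are example names --
# substitute your own server and dataset export path.
sudo mkdir -p /mnt/ollama-models
sudo mount -t nfs truenas.local:/mnt/tank/ollama-models /mnt/ollama-models

# Point Ollama at the shared model directory using the OLLAMA_MODELS
# environment variable, then start the server. Every machine mounting
# the same share sees the same models -- no redundant downloads.
export OLLAMA_MODELS=/mnt/ollama-models
ollama serve
```

To make the mount and the environment variable persistent, you would typically add the share to /etc/fstab and set OLLAMA_MODELS in the ollama systemd unit rather than exporting it in a shell.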
Connect With Us
- Hire Us for a project: Hire Us – Lawrence Systems
- Tom’s Twitter
https://twitter.com/TomLawrenceTech
- Our Website https://www.lawrencesystems.com/
- Our Forums https://forums.lawrencesystems.com/
- Instagram https://www.instagram.com/lawrencesystems/
- Facebook Lawrence Systems | Southgate MI
- GitHub lawrencesystems (Lawrence Systems) · GitHub
- Discord Lawrence Systems
Lawrence Systems Shirts and Swag
AFFILIATES & REFERRAL LINKS
Amazon Affiliate Store
Lawrence Systems's Amazon Page
UniFi Affiliate Link
Ubiquiti Store
All Of Our Affiliates help us out and can get you discounts!
Partners We Love – Lawrence Systems
Gear we use on Kit
Kit
Use offer code LTSERVICES to get 10% off your order at
Tech Supply Direct - Premium Refurbished Servers & Workstations at Unbeatable Prices
Digital Ocean Offer Code
DigitalOcean | Cloud Infrastructure for Developers
HostiFi UniFi Cloud Hosting Service
HostiFi - Launch UniFi, UISP and Omada in the Cloud
Protect your privacy with a VPN from Private Internet Access
https://www.privateinternetaccess.com/pages/buy-vpn/LRNSYS
Patreon
https://www.patreon.com/lawrencesystems
Chapters
00:00 Centralized Ollama AI Models Using TrueNAS
01:10 Using Ollama in Docker
01:41 TrueNAS NFS setup
04:14 Configuring NFS Share Mount For the Ollama Server
06:40 Using the Ollama Models At The Same Time
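For the Docker usage covered in the chapters above, a minimal sketch might look like this (the host mount path is an assumption carried over from the NFS example, not taken from the video):

```shell
# Run Ollama in Docker with the NFS-backed model directory bind-mounted
# into the container's default model location (/root/.ollama), exposing
# the standard Ollama API port 11434 on the host.
docker run -d --name ollama \
  -v /mnt/ollama-models:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama
```

Because the models live on the shared NFS mount rather than inside the container, the same directory can back Ollama on bare metal and in multiple containers at once.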