Getting Up and Running with Ollama [YouTube Release]

Additional Resources:

Ollama is a free and open-source tool that simplifies running large language models locally.

In this video, Matt takes us through the process of setting up Ollama in a Docker container on a Debian 12 host and directly on a fresh Windows 11 installation.
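The official Ollama Docker image (linked below) documents a one-line setup along these lines; the CPU-only form is shown here, and `--gpus=all` can be added on hosts with the NVIDIA Container Toolkit:

```bash
# Start Ollama in a container, persisting models in a named volume
# and exposing its API on the default port 11434
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Then pull and chat with a model inside the container
# (the model name is just an example)
docker exec -it ollama ollama run llama3
```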

Matt also demonstrates interacting with models via the command line, cURL, a basic Python script, and Fabric — a tool we use daily at Lawrence Systems.
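As a rough sketch of what those interactions look like (model names and prompts here are illustrative, not necessarily the ones used in the video):

```bash
# Command line: pull (if needed) and chat interactively with a model
ollama run llama3

# cURL: Ollama serves a REST API on port 11434;
# "stream": false returns one JSON object instead of a token stream
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

And a minimal, standard-library-only Python sketch against the same API, assuming Ollama is listening on localhost:11434 and the model has already been pulled:

```python
import json
import urllib.request

# Request body for Ollama's /api/generate endpoint
payload = {
    "model": "llama3",                # example model name
    "prompt": "Why is the sky blue?",
    "stream": False,                  # single JSON response, not a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# POST the prompt and print the generated text
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])
```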
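Fabric turns common prompts into reusable "patterns". Once it has been pointed at a local Ollama model (Fabric's setup wizard handles this), invoking a pattern looks something like the following; the pattern name is just an example:

```bash
# Pipe any text through a Fabric pattern backed by a local Ollama model
echo "Paste a long article here" | fabric --pattern summarize
```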

Ollama:
https://ollama.com

Official Ollama Docker Image:
https://hub.docker.com/r/ollama/ollama

Fabric:
https://github.com/danielmiessler/fabric

Connect With Us

Lawrence Systems Shirts and Swag

►👕 Lawrence Systems

AFFILIATES & REFERRAL LINKS

Amazon Affiliate Store
:shopping_cart: Lawrence Systems' Amazon Page

UniFi Affiliate Link
:shopping_cart: Ubiquiti Store

All of our affiliates help us out and can get you discounts!
:shopping_cart: Partners We Love – Lawrence Systems

Gear we use on Kit
:shopping_cart: Kit

Use offer code LTSERVICES to get 10% off your order at
:shopping_cart: Tech Supply Direct - Premium Refurbished Servers & Workstations at Unbeatable Prices

Digital Ocean Offer Code
:shopping_cart: DigitalOcean | Cloud Infrastructure for Developers

HostiFi UniFi Cloud Hosting Service
:shopping_cart: HostiFi - Launch UniFi, UISP and Omada in the Cloud

Protect your privacy with a VPN from Private Internet Access
:shopping_cart: https://www.privateinternetaccess.com/pages/buy-vpn/LRNSYS

Patreon
:moneybag: https://www.patreon.com/lawrencesystems