<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>Centralized Ollama AI Models Using TrueNAS</title>
        <link>https://stream.echo6.co/videos/watch/5c4aa53f-f5f3-427f-8a7c-4f2e48b8954d</link>
        <description>https://lawrence.video/truenas
Forum post: https://forums.lawrencesystems.com/t/centralized-storage-for-ollama-ai-models-using-nfs/24547
--
We're at the end of March Madness and it's time to score big on Monitor Madness deals at Micro Center! Head to your local Micro Center or microcenter.com! Priority Care+ is live at Micro Center. Priority Care+ gives you peace of mind for all your devices, no matter where you purchased them. Sign up for Priority Care+ and get unlimited access to Micro Center’s tech experts and free diagnostics!
Shop Micro Center’s Monitor Madness and Current Top Deals: https://micro.center/wuld
Micro Center's Networking Solutions: https://micro.center/ftp9
Micro Center’s Priority Care+: https://micro.center/e8ze
Sign Up for Early Access to Micro Center Santa Clara: https://micro.center/t95m
--
Learn how to optimize your Ollama AI deployments by storing your models on shared NFS storage using TrueNAS. In this step-by-step video, we walk through configuring NFS on TrueNAS and setting up Ollama to use a centralized model directory. This approach is perfect for running Ollama across multiple machines or containers, helping you avoid redundant downloads and keep everything in sync. It is ideal for both homelab enthusiasts and business environments.
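The basic setup covered in the video can be sketched as follows. This is a minimal sketch, not the exact commands from the video: the server IP, export path, and mount point are assumptions for illustration. Ollama reads its model directory from the `OLLAMA_MODELS` environment variable.

```shell
# Mount the TrueNAS NFS export on the Ollama host.
# 192.168.1.100 and /mnt/tank/ollama-models are example values.
sudo mkdir -p /mnt/ollama-models
sudo mount -t nfs 192.168.1.100:/mnt/tank/ollama-models /mnt/ollama-models

# Point Ollama at the shared model directory and start it.
export OLLAMA_MODELS=/mnt/ollama-models
ollama serve

# Docker variant: bind-mount the NFS path over the container's
# default model directory (/root/.ollama/models in the ollama/ollama image).
docker run -d --name ollama \
  -v /mnt/ollama-models:/root/.ollama/models \
  -p 11434:11434 ollama/ollama
```

Because every host mounts the same export, a model pulled once is immediately available to all of them.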
Storage Design: https://lawrence.video/storagedesign
Connect With Us
Hire Us for a project: https://lawrencesystems.com/hire-us/
Tom's Twitter 🐦 https://twitter.com/TomLawrenceTech
Our Website: https://www.lawrencesystems.com/
Our Forums: https://forums.lawrencesystems.com/
Instagram: https://www.instagram.com/lawrencesystems/
Facebook: https://www.facebook.com/Lawrencesystems/
GitHub: https://github.com/lawrencesystems/
Lawrence Systems Shirts and Swag ►👕 https://lawrence.video/swag/
AFFILIATES &amp; REFERRAL LINKS
Amazon Affiliate Store 🛒 https://www.amazon.com/shop/lawrencesystemspcpickup
UniFi Affiliate Link 🛒 https://lawrence.video/unifi-affiliate
All of our affiliates help us out and can get you discounts! 🛒 https://lawrencesystems.com/partners-we-love/
Gear we use on Kit 🛒 https://kit.co/lawrencesystems
Use offer code LTSERVICES to get 10% off your order at 🛒 https://www.techsupplydirect.com?aff=2
Digital Ocean Offer Code 🛒 https://m.do.co/c/85de8d181725
HostiFi UniFi Cloud Hosting Service 🛒 https://hostifi.net/?via=lawrencesystems
Protect your privacy with a VPN from Private Internet Access 🛒 https://www.privateinternetaccess.com/pages/buy-vpn/LRNSYS
Patreon 💰 https://www.patreon.com/lawrencesystems
Chapters
00:00 Centralized Ollama AI Models Using TrueNAS
01:10 Using in Docker
01:41 TrueNAS NFS Setup
04:14 Configuring the NFS Share Mount for the Ollama Server
06:40 Using Ollama Models at the Same Time</description>
        <lastBuildDate>Wed, 15 Apr 2026 15:36:36 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>PeerTube - https://stream.echo6.co</generator>
        <image>
            <title>Centralized Ollama AI Models Using TrueNAS</title>
            <url>https://stream.echo6.co/client/assets/images/icons/icon-512x512.png</url>
            <link>https://stream.echo6.co/videos/watch/5c4aa53f-f5f3-427f-8a7c-4f2e48b8954d</link>
        </image>
        <copyright>All rights reserved, unless otherwise specified in the terms specified at https://stream.echo6.co/about and potential licenses granted by each content's rightholder.</copyright>
        <atom:link href="https://stream.echo6.co/feeds/video-comments.xml?videoId=5c4aa53f-f5f3-427f-8a7c-4f2e48b8954d" rel="self" type="application/rss+xml"/>
    </channel>
</rss>