Roasts & Ruminations {blog}

Month: March 2026

    Running Local LLMs

    Overview: I run local LLMs for several reasons. Whatever the reason, getting started with running AI models locally is not complicated. It can be complex, but that is a choice. There are a couple of projects out there that make it pretty straightforward to get started. This will provide a basic chat setup…