Discover new software and hardware to get the best out of your network, control smart devices, and secure your data on cloud services. Self-Hosted is a chat show between Chris and Alex, two long-time "self-hosters" who share their lessons learned and take you along for the journey as they learn new ones. A Jupiter Broadcasting podcast showcasing free and open source technologies you can host yourself.

119: Why So Many Llamas?

March 20, 2024 47:30 39.9 MB

Alex rolls back a major server upgrade, and we have fun playing with local large language models.

Special Guest: Wes Payne.


Links:

  • ⚡ Grab Sats with Strike Around the World — Strike is a lightning-powered app that lets you quickly and cheaply grab sats in over 36 countries.
  • 🎉 Boost with Fountain FM — Fountain 1.0 has a new UI, upgrades, and super simple Strike integration for easy Boosts.
  • Training AI to Play Pokemon with Reinforcement Learning
  • Open WebUI — Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI for various LLM runners; supported runners include Ollama and OpenAI-compatible APIs.
  • Ollama — Get up and running with large language models, locally.
  • Alex's Config
  • tlm — Local CLI Copilot, powered by CodeLLaMa.
  • LM Studio - Discover, download, and run local LLMs — Run LLMs on your laptop, entirely offline; use models through the in-app Chat UI or an OpenAI-compatible local server; download compatible model files from Hugging Face repositories; and discover new & noteworthy LLMs on the app's home page.
  • Hugging Face — The Home of Machine Learning
  • 🍔 Lunch at SCaLE 🍇 — Let's put an official time down on the calendar to get together. The Yardhouse has always been a solid go-to, so sit down and break bread with the Unplugged crew during the lunch break on Saturday!
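
The Ollama and Open WebUI links above pair naturally: Ollama runs the models, and Open WebUI fronts it with a chat interface. As a rough sketch of what a self-hosted setup like the one discussed in the episode might look like, here is a minimal Docker Compose file. The service names, volume paths, and host port here are illustrative assumptions, not Alex's actual config (linked above); consult each project's documentation for current image tags and options.

```yaml
# Hypothetical minimal stack: Ollama as the LLM runner,
# Open WebUI as the chat front-end. Adjust ports/volumes to taste.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-data:/root/.ollama   # persists downloaded model weights
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point Open WebUI at the Ollama container over the compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                 # chat UI at http://localhost:3000
    depends_on:
      - ollama
volumes:
  ollama-data:
```

With the stack up, models can be pulled from inside the Ollama container (e.g. `docker compose exec ollama ollama pull llama2`) and then selected from the Open WebUI model picker.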