Ollama – v0.17.8-rc4

🚨 Ollama `v0.17.8-rc4` is out — and it’s packing a cleanup! 🧹

The latest release candidate drops support for experimental aliases, meaning if you’ve been relying on model or endpoint aliases (like `ollama run my-alias`), you’ll want to double-check your setup — alias-based workflows will break until you migrate them to real model names.
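If you need to migrate, one plausible approach is to recreate the old alias name as a real model copy with `ollama cp`. This is a hedged sketch, not an official migration path: `my-alias` and `llama3` are placeholder names, and the commands are shown as a dry run (printed, not executed) so you can review them first.

```shell
#!/bin/sh
# Hypothetical migration sketch: replace a removed experimental alias with a
# real model copy. "llama3" and "my-alias" are placeholder names — substitute
# the model and alias from your own setup.

run() { echo "+ $*"; }            # dry-run helper: print the command instead of executing it

run ollama list                   # check which models are actually installed
run ollama cp llama3 my-alias     # copy the real model under the old alias name
run ollama run my-alias "hello"   # verify the copied name now resolves
```

Drop the `run` prefix (or redefine `run` to execute its arguments) once you’ve confirmed the names match your setup.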

🔍 What’s new (or rather, gone):

  • ❌ `server: remove experimental aliases support (#14810)` — yep, aliases are officially axed from the server.
  • 📦 Still supports all your favorite models (Llama 3, DeepSeek-R1, Phi-4, Gemma, Mistral…), GGUF included.
  • 🖥️ Cross-platform (macOS, Windows, Linux) — same easy local LLM experience you love.

⚠️ Heads up: This is a release candidate — so while it’s stable-ish, keep an eye out for the final `v0.17.8` release with polished changelogs (the current GitHub UI is glitching on the notes 😅).

💡 Pro tip: Run `git log v0.17.8-rc3..v0.17.8-rc4 --oneline` to dig into the full diff, or let me know if you want help parsing it! 🛠️

Happy local LLM tinkering, folks! 🤖✨

🔗 View Release