Ollama – v0.17.8-rc4
🚨 Ollama `v0.17.8-rc4` is out, and it's packing a cleanup! 🧹
The latest release candidate drops support for experimental aliases, meaning if you've been relying on model or endpoint aliases (like `ollama run my-alias`), you'll want to double-check your setup: existing alias-based workflows will break unless you migrate them.
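If your alias was just a friendly name pointing at an existing model, one possible workaround (a sketch, not an official migration path; `llama3` and `my-alias` below are placeholders for your own names) is to create a real model entry under that name with `ollama cp`:

```sh
# Hypothetical migration: replace an experimental alias with a copied model tag.
ollama pull llama3          # make sure the underlying model is present
ollama cp llama3 my-alias   # create a real model named "my-alias"
ollama run my-alias         # works without the removed aliases feature
```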
What's new (or rather, gone):
- `server: remove experimental aliases support (#14810)`: yep, aliases are officially axed from the server.
- 📦 Still supports all your favorite models (Llama 3, DeepSeek-R1, Phi-4, Gemma, Mistral…), GGUF imports included (quick example after this list).
- 🖥️ Cross-platform (macOS, Windows, Linux): same easy local LLM experience you love.
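Here's a minimal sketch of that workflow; the file path and model name (`./my-model.gguf`, `my-gguf-model`) are placeholders, not anything specific to this release:

```sh
# Run a supported model by name (same command on macOS, Windows, Linux):
ollama run llama3

# Import your own GGUF file via a Modelfile, then run it:
echo "FROM ./my-model.gguf" > Modelfile
ollama create my-gguf-model -f Modelfile
ollama run my-gguf-model
```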
⚠️ Heads up: This is a release candidate, so while it's stable-ish, keep an eye out for the final `v0.17.8` release with polished changelogs (the current GitHub UI is glitching on the notes).
💡 Pro tip: Run `git log v0.17.8-rc3..v0.17.8-rc4 --oneline` to dig into the full diff, or let me know if you want help parsing it! 🛠️
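If you don't already have a local checkout, one way to do that (assuming the official repo at `github.com/ollama/ollama` and that both RC tags are published there):

```sh
# Clone the repo (or `git fetch --tags` in an existing checkout),
# then list the commits between the two release candidates:
git clone https://github.com/ollama/ollama.git
cd ollama
git log v0.17.8-rc3..v0.17.8-rc4 --oneline
```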
Happy local LLM tinkering, folks! 🤖✨
