Ollama – v0.17.1
🚨 Ollama v0.17.1 is live! 🚨
This one’s a micro-patch, but a sweet, smooth one:
🔹 Fixed: The first update check after install was delayed by 1 hour 🕒
→ Version alerts now arrive immediately after install or first launch—no more waiting!
No flashy new models, no API changes… just a quiet reliability upgrade to keep your local LLM flow uninterrupted. 🛠️✨
Perfect for keeping your setup fresh, fast, and future-proof! 🚀
(And hey, it still supports Llama 3, DeepSeek-R1, GGUF, and all your fave local models!)
