Ollama – v0.12.11-rc1

Ollama v0.12.11-rc1 is here — and Windows GPU fans, rejoice! 🎉

Vulkan support is officially back on track. No more “Vulkan not found” errors — your RTX, RX, or Arc cards can now accelerate LLM inference without a hitch.

This is a quiet patch with huge impact: if you’ve been stuck on CPU-only runs, it’s time to fire up your GPU again.

⚠️ Still a release candidate — but if you’re on Windows and craving faster generations, this is the one to try.

Pro tip: Update your Vulkan drivers first! Ollama has fixed its side — now let your hardware do the heavy lifting. 🚀