Ollama – v0.30.0-rc15

Ollama v0.30.0-rc15 πŸ› οΈ

If you’re running LLMs locally, you know Ollama is the go-to for getting models like Llama 3 and DeepSeek-R1 up and running with zero friction. This latest release candidate brings a targeted boost for Windows users looking to squeeze more performance out of their hardware!

What’s new:

  • Windows iGPU Detection via Vulkan: The big highlight here is improved detection of integrated graphics on Windows through the Vulkan API.

This is a massive win for anyone running Ollama on laptops or desktops without a dedicated high-end GPU. With the integrated graphics properly detected, Ollama can more effectively leverage your hardware’s compute power to speed up local inference. Keep an eye on those performance gains! πŸš€

πŸ”— View Release