Ollama – v0.17.0-rc2

Ollama v0.17.0‑rc2 just dropped. Here's the quick rundown for your local LLM playground:

What Ollama does

A cross‑platform framework that lets you spin up open‑source models (Llama 3, Gemma, Mistral, etc.) locally via a simple CLI and REST API. Perfect for tinkering without cloud lock‑in.
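Ollama's REST API listens on `http://localhost:11434` by default, with text generation served by `POST /api/generate`. A minimal Python sketch of a non-streaming call (the model name `llama3` is just an example; use whichever model you have pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming /api/generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def generate(model: str, prompt: str) -> str:
    """Send the request and return the generated text from the JSON response."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama server and a pulled model):
# print(generate("llama3", "Why is the sky blue?"))
```

With `"stream": False` the server returns one JSON object whose `response` field holds the full completion; omit it (or set it to `True`) to receive newline-delimited JSON chunks instead.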

What’s new in v0.17.0‑rc2

  • Web‑search plugin auto‑install
      • The `cmd/config` step now automatically installs the web‑search extension into your user extensions folder, giving you internet‑augmented generation with no manual copy‑paste.
  • Dependency & build tweaks
      • Updated Go modules for smoother runs on macOS 13+ and Linux with glibc 2.35+.
      • Compile‑time optimizations shave roughly 5% off cold‑start latency.
  • Improved error handling
      • Clearer messages when a model fails to load (e.g., missing GGUF metadata).
      • Graceful fallback to the previous stable version if a plugin crashes during init.
  • Platform polish
      • Fixed an ARM64 Windows crash on first run.
      • Added missing `README.md` links for the new web‑search plugin in the release assets.
  • Docs bump
      • New “Getting Started with Plugins” guide and a refreshed CLI flag table.
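The plugin fallback described above follows a common try-the-candidate, fall-back-to-stable pattern. A minimal sketch of that pattern in Python; the two loader callables are hypothetical stand-ins, not Ollama's internal API:

```python
import logging


def init_plugin(load_candidate, load_stable):
    """Try loading the new plugin build; fall back to the previous stable one.

    `load_candidate` and `load_stable` are hypothetical callables standing in
    for a plugin loader. If the candidate raises during init, we log the
    failure and load the stable version instead of crashing the host process.
    """
    try:
        return load_candidate()
    except Exception as exc:
        logging.warning("plugin init failed (%s); falling back to stable", exc)
        return load_stable()
```

The key design point is that a broken plugin degrades the feature rather than taking down the whole server.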

🚀 TL;DR: automatic web‑search plugin install, snappier startup on newer OSes, clearer errors, and a handful of stability fixes. Happy experimenting!

🔗 View Release