Ollama – v0.16.0
🚨 Ollama v0.16.0 is live! 🚨
The latest drop from the Ollama crew just landed, and while the release notes are light on flashy new features, this one's a quiet but meaningful polish pass. Here's the lowdown:
🔹 API Docs Fixed!
The OpenAPI schema for `/api/ps` (list running models) and `/api/tags` (list local models) has been corrected, meaning better Swagger compatibility, smoother SDK generation, and fewer headaches for integrators. 🛠️
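If you're consuming those endpoints, here's a minimal sketch of parsing an `/api/tags`-style response. The `{"models": [...]}` envelope and the `name` field follow Ollama's published API docs; the sample payload itself is invented for illustration.

```python
import json

# Invented sample mimicking the {"models": [...]} envelope
# that /api/tags returns (shape per Ollama's API docs).
sample_tags_response = json.dumps({
    "models": [
        {"name": "llama3:latest", "size": 4661224676},
        {"name": "phi4:latest", "size": 9053116391},
    ]
})

def list_model_names(raw: str) -> list[str]:
    """Extract model names from an /api/tags-style response body."""
    payload = json.loads(raw)
    return [model["name"] for model in payload.get("models", [])]

print(list_model_names(sample_tags_response))  # ['llama3:latest', 'phi4:latest']
```

In a real integration you'd feed this the body of a `GET http://localhost:11434/api/tags` request instead of the canned string.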
🔹 Stability & Under-the-Hood Tweaks
Expect refined model loading, improved streaming behavior, and likely minor bug fixes, especially around context handling and memory usage. No breaking changes, just smoother sailing.
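For context on the streaming behavior mentioned above: Ollama's generate endpoint streams newline-delimited JSON, one object per chunk, with a `done` flag on the final one (field names per the public API docs). A minimal sketch of reassembling such a stream, using an invented sample:

```python
import json

# Simulated NDJSON stream as /api/generate emits when streaming:
# one JSON object per line, "done": true on the last (sample invented).
stream_lines = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": false}',
    '{"response": "", "done": true}',
]

def assemble(lines) -> str:
    """Concatenate streamed "response" chunks until "done" is true."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

print(assemble(stream_lines))  # Hello!
```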
🔹 Still GGUF-Friendly
All your favorite quantized models (Llama 3, DeepSeek-R1, Phi-4, etc.) keep rolling; no format changes here.
💡 Pro Tip: If you're building tools or dashboards against Ollama's REST API, this update makes your life easier. Update with `docker pull ollama/ollama:latest` if you run the Docker image, or grab the latest binary from GitHub.
🔗 Full details (when they land): v0.16.0 Release
Happy local LLM-ing! 🤖✨
