Ollama v0.16.0
🚨 Ollama v0.16.0 is live! 🚨
The latest drop from the Ollama crew just landed, and while the release notes are light on flashy new features, this one is a quiet but meaningful polish pass. Here's the lowdown:
🔹 API Docs Fixed!
The OpenAPI schema for `/api/ps` (list running models) and `/api/tags` (list local models) has been corrected, meaning better Swagger compatibility, smoother SDK generation, and fewer headaches for integrators. 🛠️
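If you want to kick the tires on those endpoints, here is a minimal stdlib-only sketch. It assumes the default server address (`http://localhost:11434`) and the documented `/api/tags` response shape, a JSON object with a `models` list whose entries carry `name` and `size` fields; the helper names are mine, not Ollama's.

```python
import json
import urllib.request

OLLAMA = "http://localhost:11434"  # default Ollama server address

def list_local_models(base_url: str = OLLAMA) -> list[dict]:
    """GET /api/tags and return the 'models' list from the response."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return json.load(resp)["models"]

def summarize(models: list[dict]) -> list[str]:
    """Reduce each model entry to a short 'name (size in GB)' string."""
    return [f"{m['name']} ({m['size'] / 1e9:.1f} GB)" for m in models]

# Sample payload shaped like a /api/tags response (abridged):
sample = {"models": [{"name": "llama3:latest", "size": 4661224676}]}
print(summarize(sample["models"]))  # ['llama3:latest (4.7 GB)']
```

The same pattern works for `/api/ps`; only the endpoint path and the per-model fields differ.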
🔹 Stability & Under-the-Hood Tweaks
Expect refined model loading, improved streaming behavior, and likely minor bug fixes, especially around context handling and memory usage. No breaking changes, just smoother sailing.
🔹 Still GGUF-Friendly
All your favorite quantized models (Llama 3, DeepSeek-R1, Phi-4, etc.) keep rolling; no format changes here.
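For context on the streaming behavior: Ollama's generation endpoints stream newline-delimited JSON, one object per chunk, each carrying a `response` text fragment and a `done` flag on the final chunk. A small sketch of a client-side parser, with a simulated stream standing in for a live `/api/generate` response:

```python
import json
from typing import Iterable, Iterator

def iter_stream_tokens(lines: Iterable[bytes]) -> Iterator[str]:
    """Yield text fragments from an Ollama-style NDJSON stream.

    Each line is a JSON object; the 'response' field holds a text
    fragment, and the final chunk sets 'done' to true.
    """
    for raw in lines:
        chunk = json.loads(raw)
        if chunk.get("done"):
            break
        yield chunk.get("response", "")

# Simulated stream, shaped like a streaming /api/generate response:
fake = [
    b'{"response": "Hel", "done": false}',
    b'{"response": "lo", "done": false}',
    b'{"done": true}',
]
print("".join(iter_stream_tokens(fake)))  # Hello
```

In a real client you would iterate over the lines of the HTTP response body instead of a list, but the parsing logic is the same.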
💡 Pro Tip: If you're building tools or dashboards against Ollama's REST API, this update makes your life easier. Run `docker pull ollama/ollama:latest` if you use the Docker image, or grab the latest binary from GitHub.
🔗 Full details (when they land): v0.16.0 Release
Happy local LLM-ing! 🤖✨
