Ollama – v0.16.1

🚨 Ollama v0.16.1 is live! 🚨

Hey AI tinkerers & local LLM lovers – fresh update incoming! 🔥

What’s new in v0.16.1?

🔹 New model config added: `minimax-m2.5` 🧠

  • Looks like a fresh MiniMax model variant (internal/experimental for now – keep an eye out for docs!).
  • You can already pull it via `ollama pull minimax-m2.5` if you’re feeling adventurous 🛠️

🔹 Lightweight patch release – no breaking changes, just lean & mean model support upgrades.
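If you'd rather script against the new model than chat with it, here's a minimal sketch using Ollama's local REST API (this assumes a server started with `ollama serve` on the default `localhost:11434`, and that you've already pulled `minimax-m2.5`):

```python
import json
import urllib.request

# Default endpoint for a local `ollama serve` instance.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming /api/generate request body."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """POST the prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Once the model is pulled, `generate("minimax-m2.5", "Say hi in one sentence")` should return a completion string.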

📦 Binaries are rolling out for macOS, Windows, and Linux – grab the latest from GitHub or update via your package manager.

👉 v0.16.1 Release Notes

Let us know if you get `minimax-m2.5` running – curious to hear your benchmarks and use cases! 🧪✨

🔗 View Release