Ollama – v0.16.1
🚨 Ollama v0.16.1 is live! 🚨
Hey AI tinkerers & local LLM lovers – fresh update incoming! 🔥
What's new in v0.16.1?
🔹 New model config added: `minimax-m2.5` 🧠
- Looks like a fresh MiniMax model variant (internal/experimental for now – keep an eye out for docs!).
- You can already pull it with `ollama pull minimax-m2.5` if you're feeling adventurous 🛠️
🔹 Lightweight patch release – no breaking changes, just lean & mean model support upgrades.
📦 Binaries are rolling out for macOS, Windows, and Linux – grab the latest from GitHub or update via your package manager.
Let us know if you get `minimax-m2.5` running – curious to hear your benchmarks and use cases! 🧪✨
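If you'd rather script against the new model than sit in the CLI, Ollama also serves a local REST API (default port 11434). Here's a minimal, stdlib-only Python sketch using the standard `/api/generate` endpoint – the endpoint and request fields are Ollama's documented API, but it assumes you have a running Ollama server and have already pulled `minimax-m2.5`:

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Encode a non-streaming generate request for Ollama's REST API."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (needs the server running and the model pulled first):
#   print(generate("minimax-m2.5", "Say hello in five words."))
```

Swap in any model tag from `ollama list` – the request shape is the same for every model.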
