Ollama – v0.21.1

Ollama v0.21.1 🦬

If you’re running local LLMs, you know Ollama is the go-to for getting models up and running with zero friction on macOS, Windows, or Linux. It’s the ultimate toolkit for anyone looking to experiment with Llama 3, DeepSeek-R1, or Mistral without needing a massive cloud budget.

This quick patch release focuses on fine-tuning model recommendations to ensure you’re getting the best performance out of your local setup.

What’s new:

  • Model Optimization: The update swaps out `kimi-k2.5` for `k2.6` as the top recommended model in the launch configuration. 🚀

It’s a small but important tweak that points your default experience toward the most capable version of the Kimi model available. Perfect for those of us always hunting for that extra bit of reasoning power.

🔗 View Release