Ollama – v0.13.0-rc0

πŸš€ Ollama v0.13.0-rc0 just dropped β€” and it’s packed with power!

Say hello to DeepSeek-V3.1 (running on the `deepseek2` architecture) β€” one of the most capable open LLMs out there, now available with a simple `ollama pull deepseek-ai/deepseek-v3.1`.

✨ Why it’s awesome:

  • πŸš€ MLA (Multi-head Latent Attention) is live β€” cuts memory use, speeds up inference, and keeps reasoning sharp.
  • πŸ› οΈ New engine under the hood = smoother runs, fewer crashes, better future-proofing.
  • πŸ’₯ Run state-of-the-art reasoning on your laptop β€” no cloud needed.

GGUF? Still supported. API? Still there. CLI? Even better.
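To show what "API? Still there" looks like in practice, here's a minimal sketch of a request body for Ollama's REST API. The `/api/generate` endpoint, the default port 11434, and the `stream` field are Ollama's documented API; the helper name `build_generate_request` and the default model tag are just illustrative assumptions:

```python
import json

# Hypothetical helper: builds the JSON body for Ollama's /api/generate
# endpoint (POST http://localhost:11434/api/generate by default).
def build_generate_request(prompt, model="deepseek-ai/deepseek-v3.1"):
    # stream=False asks the server for one complete JSON reply
    # instead of a stream of partial responses.
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_request("Why is the sky blue?")
print(json.dumps(payload))
```

Send that payload as the POST body with `Content-Type: application/json`; the reply's `response` field holds the generated text.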

This isn’t just an update β€” it’s your ticket to running top-tier models locally, faster than ever.

Go grab it:

`ollama pull deepseek-ai/deepseek-v3.1`

#LocalAI #DeepSeek #Ollama #LLM
