Ollama – v0.13.0-rc0
Ollama v0.13.0-rc0 just dropped, and it's packed with power!
Say hello to DeepSeek-V3.1 (running on the `deepseek2` architecture), one of the most capable open LLMs out there, now available with a simple `ollama pull deepseek-ai/deepseek-v3.1`.
Why it's awesome:
- MLA (Multi-head Latent Attention) is live: it cuts memory use, speeds up inference, and keeps reasoning sharp.
- New engine under the hood = smoother runs, fewer crashes, better future-proofing.
- Run state-of-the-art reasoning on your own hardware, no cloud needed.
GGUF? Still supported. API? Still there. CLI? Even better.
This isn't just an update; it's your ticket to running top-tier models locally, faster than ever.
Go grab it:
`ollama pull deepseek-ai/deepseek-v3.1`
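And since the API is unchanged, here is a minimal sketch of pulling the model and querying it over Ollama's REST endpoint. It assumes a local Ollama server running on the default port (11434); the model name is taken from the pull command above, so adjust it if your registry tag differs.

```shell
# Pull the model (name as given above; adjust if your tag differs)
ollama pull deepseek-ai/deepseek-v3.1

# Query it through the REST API on the default local port (11434);
# "stream": false returns a single JSON response instead of a token stream
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-ai/deepseek-v3.1",
  "prompt": "Explain Multi-head Latent Attention in one sentence.",
  "stream": false
}'
```

The same model also works interactively with `ollama run deepseek-ai/deepseek-v3.1` if you prefer the CLI chat loop over the HTTP API.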
#LocalAI #DeepSeek #Ollama #LLM
