Ollama – v0.15.0-rc4
Big news for local LLM folks! Ollama v0.15.0-rc4 just dropped, and it's got a quiet game-changer:
`ollama config` is now `ollama launch`
No more confusion between “configuring” and “starting” your server.
Just run `ollama launch` to fire up your local LLM: clean, intuitive, and way more obvious.
Your existing configs? Still there.
Your scripts? Time to update those aliases!
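If you have helper scripts that still call the old subcommand, a quick find-and-replace does the trick. A minimal sketch, assuming a POSIX shell; the script name (`start-llm.sh`) is illustrative:

```shell
# Demo: rewrite a helper script that still calls the old subcommand.
# The script name here is illustrative -- adapt it to your setup.
printf '#!/bin/sh\nollama config\n' > start-llm.sh

# Swap the old subcommand for the new one; -i.bak keeps a backup
# and works with both GNU and BSD sed.
sed -i.bak 's/ollama config/ollama launch/g' start-llm.sh

cat start-llm.sh   # the script now invokes `ollama launch`
```

Worth doing the same pass over any shell aliases in your `.bashrc` or `.zshrc`.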
Under the hood: smoother model loading, better stability, and a few sneaky performance tweaks.
Next stop: stable v0.15.0
Time to refresh your workflow: your local LLM stack just got simpler.
