Ollama – v0.17.7
🚨 Ollama v0.17.7 is out! 🚨
This patch brings a subtle but important fix under the hood:
🔹 Stale context window entries are now properly overridden, so outdated prompt/chat history data won't linger and skew your inference results. 💡
🧠 Why you'll care:
- Cleaner, more reliable multi-turn conversations
- Better token efficiency (no hidden bloat from old context!)
- Smoother long-context handling, especially helpful if you're pushing model limits
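As a rough illustration of where this matters: the context window can be set per request through Ollama's REST API via the `num_ctx` option, and the fix means a newly requested value should take effect rather than a stale cached entry lingering. A minimal sketch, assuming a local server; the model name and context size below are placeholders:

```python
import json

# Sketch: per-request context window via Ollama's /api/chat endpoint.
# "num_ctx" is Ollama's documented option for the context window size;
# the model name and the value 8192 here are purely illustrative.
payload = {
    "model": "llama3.2",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Summarize our conversation so far."}
    ],
    "options": {
        "num_ctx": 8192  # requested context window, in tokens
    },
    "stream": False,
}

# Actually sending it requires a running Ollama server, e.g.:
#   requests.post("http://localhost:11434/api/chat", json=payload)
print(json.dumps(payload, indent=2))
```

With v0.17.7, changing `num_ctx` between turns should reliably override the previous setting instead of leaving old context data in play.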
📦 No flashy new models or API changes this time, but it's a solid reliability bump for everyday use.
🔗 Full details: v0.17.7 Release
Happy local LLM tinkering! 🛠️🤖
