Ollama – v0.17.7
Ollama v0.17.7 is out!
This patch brings a subtle but important fix under the hood:
- Stale context window entries now get properly overridden, meaning outdated prompt/chat history data won't linger and skew your inference results.
Why you'll care:
- Cleaner, more reliable multi-turn conversations
- Better token efficiency (no hidden bloat from old context!)
- Smoother long-context handling, especially helpful if you're pushing model limits
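Conceptually, "overriding stale entries" works like a cache that replaces the old record for a conversation instead of letting it accumulate. Here's a minimal sketch of that idea in Python; all names (`ContextCache`, `put`, `get`) are hypothetical illustrations, not Ollama's actual internals:

```python
import time

class ContextCache:
    """Keeps at most one context entry per conversation slot;
    a new entry for the same slot overrides the stale one."""

    def __init__(self, max_age_s=300.0):
        self.max_age_s = max_age_s
        self._entries = {}  # slot -> (timestamp, tokens)

    def put(self, slot, tokens):
        # Overwrite any existing entry for this slot, stale or not,
        # so old history can never linger alongside the new one.
        self._entries[slot] = (time.monotonic(), tokens)

    def get(self, slot):
        entry = self._entries.get(slot)
        if entry is None:
            return None
        ts, tokens = entry
        if time.monotonic() - ts > self.max_age_s:
            # Expired: drop it so outdated context can't leak into inference.
            del self._entries[slot]
            return None
        return tokens

cache = ContextCache()
cache.put("chat-1", [1, 2, 3])
cache.put("chat-1", [4, 5, 6])  # overrides the earlier entry
assert cache.get("chat-1") == [4, 5, 6]
assert cache.get("chat-2") is None
```

The key move is the unconditional overwrite in `put`: without it, stale token lists could pile up and bleed into later turns, which is exactly the class of bug this patch addresses.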
No flashy new models or API changes this time, but it's a solid reliability bump for everyday use.
Full details: v0.17.7 Release
Happy local LLM tinkering!
