Ollama – v0.12.11
Ollama v0.12.11 just dropped, and it's a quiet gem for the detail-oriented folks!
The big win? `logprob` now includes byte-level data.
No more guessing which bytes map to your tokens. Whether you're debugging multilingual text, tracking tokenization edge cases, or building precision prompt tools, you now see exactly what's happening at the byte level.
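Why does byte-level data matter? Tokenizers can split a multibyte UTF-8 character across two tokens, so the per-token text alone can't always be decoded. A minimal sketch, using hypothetical entries shaped like a logprobs response (the field names here are illustrative, not the exact Ollama schema):

```python
# Hypothetical logprob entries; "bytes" holds each token's raw UTF-8 bytes.
# The tokenizer has split the character 你 (UTF-8: e4 bd a0) across tokens.
entries = [
    {"token": "Hi ", "logprob": -0.05, "bytes": [72, 105, 32]},
    {"logprob": -0.31, "bytes": [228, 189]},            # first two bytes of 你
    {"logprob": -0.74, "bytes": [160, 229, 165, 189]},  # rest of 你, then 好
]

def reassemble(entries):
    """Concatenate raw bytes across all tokens, then decode once.

    Decoding each token's bytes on its own would raise UnicodeDecodeError
    for the split character; joining the bytes first recovers the exact text.
    """
    return bytes(b for e in entries for b in e["bytes"]).decode("utf-8")

print(reassemble(entries))  # Hi 你好
```

With byte-level data in hand, you can align model confidence to the exact text the model produced, even when token boundaries fall mid-character.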
Perfect for:
- Prompt engineers wrestling with weird token splits
- Researchers analyzing model confidence down to the byte
- Devs building LLM debuggers or token analyzers
No UI fluff, no breaking changes, just pure, nerdy utility.
Upgrade your Ollama install to v0.12.11 and start seeing the hidden layers beneath your prompts.
