Text Generation Web UI – v3.23
✨ The chat UI got a glow-up: tables and dividers now render clean and crisp, making long model outputs much easier to read.
🔧 Bug fixes that actually matter:
- Models with `eos_token` disabled? No more crashes! Huge props to @jin-eld 🙌
- Fixed symbolic link issues in `llama-cpp-binaries` that broke non-portable installs.
🚀 Backend power-up:
- `llama.cpp` updated to commit `55abc39` → faster, smoother inference
- `bitsandbytes` bumped to 0.49 → better quantization, fewer OOMs, more stable loads
📦 PORTABLE BUILDS ARE LIVE!
Download. Unzip. Run. No install needed.
- NVIDIA? → `cuda12.4`
- AMD/Intel GPU? → `vulkan`
- CPU-only? → `cpu`
- Mac Apple Silicon? → `macos-arm64`
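If you're not sure which build matches your machine, the choice above can be sketched as a small shell helper. The build suffixes come from this list; the detection logic (checking for `nvidia-smi` or `vulkaninfo`) is just one reasonable heuristic, not an official installer step.

```shell
#!/bin/sh
# Sketch: pick the portable-build suffix for the current machine.
# Suffix names match the release list; detection commands are assumptions.
pick_build() {
  if [ "$(uname -s)" = "Darwin" ] && [ "$(uname -m)" = "arm64" ]; then
    echo "macos-arm64"                 # Apple Silicon Mac
  elif command -v nvidia-smi >/dev/null 2>&1; then
    echo "cuda12.4"                    # NVIDIA GPU present
  elif command -v vulkaninfo >/dev/null 2>&1; then
    echo "vulkan"                      # AMD/Intel GPU via Vulkan
  else
    echo "cpu"                         # fall back to the CPU-only build
  fi
}

pick_build
```

Whatever it prints is the suffix to look for in the release's zip filenames.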
💾 Updating? Just grab the new zip, unzip it, and drop your old `user_data` folder in. All your models, settings, and themes carry over with zero reconfiguring.
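The update flow above boils down to one copy. Here's a minimal sketch, where the directory names (`old-webui`, `new-webui`) and the settings file are illustrative stand-ins, not real paths from the project:

```shell
#!/bin/sh
# Sketch of the update flow. Directory names and file contents are examples.
set -e

# Stand-in for your existing install with its user_data folder:
mkdir -p old-webui/user_data new-webui
echo "theme=dark" > old-webui/user_data/settings.txt

# Real flow: unzip the new release into new-webui/ first, e.g.
#   unzip textgen-portable-new.zip -d new-webui   # (zip name is illustrative)

# Then carry your models, settings, and themes over:
cp -r old-webui/user_data new-webui/

cat new-webui/user_data/settings.txt
```

Nothing in the new build needs to be reconfigured afterwards; everything the UI remembers lives under `user_data`.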
Go play. No setup. Just pure LLM magic. 🚀
