Text Generation Webui – v3.23
✨ Chat UI got a glow-up! Tables and dividers now look clean, crisp, and way easier to read, perfect for scrolling through long model outputs without eye strain.
🔧 Bug fixes that actually matter:
- Models with `eos_token` disabled? No more crashes! Huge props to @jin-eld
- Symbolic link issues in `llama-cpp-binaries` fixed; non-portable installs breathe easier now.
🚀 Backend power-up:
- `llama.cpp` updated to latest commit (`55abc39`) for faster, smoother inference
- `bitsandbytes` bumped to 0.49: better quantization, fewer OOMs, more stable loads
📦 PORTABLE BUILDS ARE LIVE!
Download. Unzip. Run. No install needed.
- NVIDIA? → `cuda12.4`
- AMD/Intel GPU? → `vulkan`
- CPU-only? → `cpu`
- Mac Apple Silicon? → `macos-arm64`
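For an NVIDIA box on Linux, the flow looks roughly like this. A minimal sketch: the asset filename and launcher name below are illustrative guesses, not the exact release artifacts, so check the release page for the real names.

```shell
#!/bin/sh
# Sketch of the portable-build flow. The zip name is a HYPOTHETICAL
# pattern for illustration; copy the real asset name from the releases page.
VERSION="3.23"
BUILD="cuda12.4"     # pick per hardware: cuda12.4, vulkan, cpu, macos-arm64
ZIP="textgen-portable-${VERSION}-${BUILD}.zip"   # hypothetical asset name
echo "Would fetch: $ZIP"

# Actual steps (commented out so the sketch runs offline):
# curl -LO "https://github.com/oobabooga/text-generation-webui/releases/download/v${VERSION}/${ZIP}"
# unzip "$ZIP" -d textgen-portable && cd textgen-portable
# ./start_linux.sh   # launcher name varies by platform; see the unzipped folder
```

Swap `BUILD` for `vulkan`, `cpu`, or `macos-arm64` to match the table above.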
💾 Updating? Just grab the new zip, unzip, and drop your old `user_data` folder in. All your models, settings, themes: still there. Zero reconfiguring.
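In shell terms, the update amounts to one copy. A quick sketch, with placeholder directory names (`textgen-old`, `textgen-new`) standing in for your old and freshly unzipped builds:

```shell
#!/bin/sh
# Sketch: carry settings from an old portable build into a new one.
# "textgen-old" and "textgen-new" are PLACEHOLDER directory names.
mkdir -p textgen-old/user_data textgen-new
echo "dark" > textgen-old/user_data/theme.txt   # stand-in for real settings

# The whole migration is just copying user_data across:
cp -r textgen-old/user_data textgen-new/
```

Models and settings live under `user_data`, so nothing else needs to move.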
Go play. No setup. Just pure LLM magic. 🚀
