Text Generation Webui – v3.15

llama.cpp & ExLlamaV3 just got a nice polish in text-generation-webui v3.15! 🚀

This release focuses on stability and getting more out of your models, with fixes for context-size errors, security hardening (`--trust-remote-code` is now immutable), and improved chat behavior. Plus, those pesky gibberish outputs from ExLlamaV3 in v3.14 are fixed 🙌.

Key updates:

  • Context Size Fix: No more context-size errors!
  • Security: `--trust-remote-code` is now immutable, so it can't be changed after launch. 🛡️
  • Chat Improvements: Metadata leaking and “continue” spacing fixed.
  • Download Resilience: Interrupted Hugging Face downloads now resume more reliably.
  • ExLlamaV3 Reverted: Gibberish outputs solved!
  • Core Updates: llama.cpp (f9fb33f2630b4b4ba9081ce9c0c921f8cd8ba4eb) & ExLlamaV3 (v0.0.10)!

Grab the latest portable builds for Windows/Linux (cuda12.4, cuda11.7, vulkan, cpu) or Mac (arm64, x86_64). To update, just unzip the new build and move your existing `user_data` folder into it – all settings & models are preserved! ✨

🔗 View Release