Ollama – v0.30.0-rc4: ci: windows mlx tuning
Ollama v0.30.0-rc4
If you’re looking to run large language models locally with ease, Ollama remains the gold standard for managing and interacting with LLMs on your own machine. This latest release candidate brings some much-needed optimization and fixes to the build process!
What's new in this release:
- Windows MLX Tuning: Significant updates to the CI pipeline focused on tuning MLX performance for Windows environments. If you're running locally on Windows, expect smoother execution!
- Build Optimization: The team has shortened the "long tail" of the build process, making the development and deployment cycle noticeably snappier.
- Installer Fix: A crucial fix to bring `OllamaSetup.exe` back under the 2GB limit. This ensures smoother downloads and prevents those pesky installation hurdles caused by file size limits.
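To illustrate the kind of safeguard behind the installer fix, here's a minimal sketch of a CI size gate that fails the build when an artifact exceeds a byte limit. This is purely illustrative: the function name, the 2 GiB constant, and the gating logic are assumptions for the example, not the actual Ollama CI code.

```python
import os

# Hypothetical size gate (illustrative, not Ollama's actual CI step).
# Some Windows download/signing paths reject files at or above 2 GiB,
# so the build should fail fast if the installer crosses that line.
LIMIT_BYTES = 2 * 1024**3  # 2 GiB

def check_installer_size(path: str, limit: int = LIMIT_BYTES) -> bool:
    """Return True if the file at `path` is strictly under `limit` bytes."""
    return os.path.getsize(path) < limit
```

In a real pipeline a check like this would run right after packaging, turning a silent oversize artifact into an immediate, easy-to-diagnose build failure.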
Great news for the Windows crowd looking to squeeze more performance out of their local setups!
