Ollama v0.30.0-rc1 🦙
If you haven’t jumped on the Ollama train yet, now is the time! It’s the go-to tool for running powerful large language models like Llama 3, DeepSeek-R1, and Mistral locally on your machine with zero friction. It handles all the heavy lifting of model management and serving so you can focus on building cool stuff.
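If you’re curious what “zero friction” looks like in practice, here’s a quick sketch of the typical workflow (the install script and API endpoint below are from Ollama’s published docs; the model name is just an example, so swap in whatever you want to run):

```shell
# Install Ollama (Linux one-liner; macOS and Windows installers live on ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and drop into an interactive chat session
ollama run llama3

# Or hit the local REST API directly (Ollama serves on port 11434 by default)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

That last call is the same API your apps can build against, which is a big part of why Ollama is such a handy local LLM engine.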
This latest release candidate (rc1) is a focused stability update:
- Windows MLX Build Fix: The team has pushed a fix specifically for the Windows MLX build process. If you’ve been experimenting with MLX-related workflows on Windows, this should smooth out those compilation hiccups! 🛠️
It’s a targeted patch to keep your local LLM engine running buttery smooth across different environments. Keep an eye out for more feature-heavy updates in the upcoming full release!
