Ollama – v0.14.0-rc9
Ollama v0.14.0-rc9 just dropped, and it's all about silent power for Apple Silicon users!
The big fix? MLX components are now actually included in the macOS build. No more missing pieces: if you're running Llama 3, Gemma, or Mistral on your M-series Mac, inference is smoother, faster, and fully optimized.
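If you want a quick sanity check after updating, here's a minimal smoke test using the official `ollama` Python client (install it with `pip install ollama`). It assumes the Ollama app is running locally and that you've already pulled a model, e.g. `ollama pull llama3`; the model name is just an example, swap in whatever you have locally.

```python
# Minimal end-to-end check of a local Ollama install.
# Assumes: the Ollama server is running, the `ollama` Python client is
# installed, and a model tagged "llama3" has been pulled locally.
import ollama

# Send a single prompt to the local Ollama server.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

# Print the model's reply; a prompt, quick response confirms local
# inference is working on your machine.
print(response["message"]["content"])
```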
No flashy new features this round, just clean, reliable polish.
This is the calm before the storm: v0.14 is close, and RC9 is your green light to update and test.
Perfect for devs who want rock-solid local LLMs before the big launch.
Update now; your M-chip will thank you.
