Ollama – v0.14.0-rc4

πŸš€ Ollama v0.14.0-rc4 just dropped β€” and it’s fixing the annoying MLX build hiccups on macOS & Docker! πŸ–ΌοΈπŸ’»

If you’ve been trying to run LLaVA or other vision models on Apple Silicon and kept hitting “MLX not found” errors, say goodbye to the frustration. This patch fixes the build scripts so MLX builds reliably, with no more wrestling with toolchains.

βœ… What’s fixed:

  • MLX build scripts now work smoothly on macOS (M-series chips, rejoice!)
  • Dockerfile updated to bundle MLX deps properly for image gen in containers
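
Running in a container? A minimal sketch of the usual workflow, assuming the standard Ollama Docker image and its documented defaults (named volume, port 11434; adjust for your setup):

```shell
# Start the Ollama container in the background, persisting models in a named volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Once it's up, run a vision model inside the container
docker exec -it ollama ollama run llava
```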

No flashy new features β€” just stable, reliable local image generation. Perfect for devs prepping for v0.14’s full launch. Keep those M-chips humming and start generating again! πŸš€

πŸ”— View Release