Ollama – v0.16.3-rc0
🚨 Ollama v0.16.3-rc0 is here! 🚨
Big news for Apple Silicon users: Qwen3 model support has landed in `mlxrunner`! 🍏⚡
✅ Qwen3 models now run natively on M1/M2/M3 Macs via Ollama’s MLX backend — no CUDA, no hassle.
🧠 Alibaba’s latest Qwen3 brings stronger multilingual skills and sharper reasoning, making it a serious contender for local LLM workloads.
That’s the headline — this RC is light on changes but heavy on potential 🎯
Stable drop’s coming soon… in the meantime, go test those Qwen3 models! 🧪✨
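Quickest way to kick the tires is `ollama run qwen3` from the terminal. If you'd rather script your testing, here's a minimal sketch using the `ollama` Python client (`pip install ollama`) — it assumes the `qwen3` tag is published in the Ollama library on your install and that the local Ollama server is running:

```python
# Minimal smoke test for Qwen3 via the local Ollama server.
# Assumes: `ollama serve` is running and the `qwen3` model tag has been pulled
# (e.g. `ollama pull qwen3`) — adjust the tag to whichever Qwen3 variant you use.
from ollama import chat

response = chat(
    model='qwen3',
    messages=[{'role': 'user', 'content': 'Give me a one-sentence summary of what you are.'}],
)

# Dict-style access works across client versions.
print(response['message']['content'])
```

Swap in a larger Qwen3 variant tag to stress the MLX backend on your particular M-series chip.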
