Ollama – v0.12.6-rc0: CI: Set up temporary opt-out Vulkan support (#12614)
Ollama just dropped v0.12.6-rc0! 🚀
What is Ollama? It lets you run large language models locally (think Llama 2, Mistral, and more) right on your Mac or Linux machine. Super handy for testing, building apps, or just playing around without hitting API limits.
Here's what's new:
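If you'd rather talk to Ollama from code than from the CLI, here's a minimal sketch that hits the local REST API. It assumes the Ollama server is running on its default endpoint (`http://localhost:11434`) and that you've already pulled the model you name; the `generate` helper function is just an illustrative wrapper, not part of Ollama itself.

```python
# Minimal sketch: query a locally running Ollama server via its REST API.
# Assumes the default endpoint http://localhost:11434 and a pulled model.
import json
import urllib.request


def generate(prompt, model="llama2", host="http://localhost:11434"):
    """Send a non-streaming generate request and return the response text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    try:
        print(generate("Why is the sky blue?"))
    except OSError:
        # Connection refused, etc. -- the server probably isn't running.
        print("Ollama server not reachable; is `ollama serve` running?")
```

Because there's no API key or rate limit, this is handy for quick local experiments and integration tests.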
- Experimental Vulkan Support: Want a potential performance boost? You can now opt in to Vulkan! Currently requires building from source.
- CI Updates: Some under-the-hood improvements to the continuous integration setup for smoother sailing.
Want to try Vulkan? Check out the Ollama docs for build instructions! 💻
