Ollama – v0.12.6-rc0: CI: Set up temporary opt-out Vulkan support (#12614)

Ollama just dropped v0.12.6-rc0! πŸŽ‰

What is Ollama? It lets you run large language models locally – think Llama 2, Mistral, and more – right on your Mac, Linux, or Windows machine. Super handy for testing, building apps, or just playing around without hitting API limits.

Here’s what’s new:

  • Experimental Vulkan Support: Want a potential performance boost? You can now opt in to Vulkan! Currently requires building from source.
  • CI Updates: Some under-the-hood improvements to the continuous integration setup for smoother sailing.

Want to try Vulkan? Check out the Ollama docs for build instructions! πŸ’»

πŸ”— View Release