Ollama – v0.14.0-rc3
Ollama v0.14.0-rc3 just landed, and it's got web smarts!
Say goodbye to outdated answers. Now you can:
- Use `--web-search` to let your model hunt down live info on the fly
- Use `--web-fetch` to pull content from any URL and feed it straight into your LLM
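In practice, the two flags above might be used like this. This is a sketch, not confirmed syntax: the model name (`llama3`) and the argument order are assumptions, and release candidates can change flag behavior, so check `ollama run --help` on your install.

```shell
# Hypothetical usage sketch for the flags announced above.
# The model name and argument order are assumptions, not confirmed syntax.

# Let the model search the web while answering:
ollama run llama3 --web-search "What's the latest on Mars rover discoveries?"

# Pull a specific page into the prompt context before answering:
ollama run llama3 --web-fetch https://mars.nasa.gov/news "Summarize this page"
```

Both commands assume a local model is already pulled (`ollama pull llama3`); the fetch example also assumes the URL and the follow-up prompt are passed together, which may differ in the final release.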
Ask "What's the latest on Mars rover discoveries?" and Ollama actually checks. No more 2023 brain fog.
Perfect for RAG pipelines, research bots, or just keeping your AI in the loop.
Works on macOS, Windows, and Linux, with the same slick CLI you already love.
Still a release candidate, but this feels like the start of something wild.
Keep your models curious.
