Ollama – v0.13.5
Ollama v0.13.5 just dropped, and it's a quiet game-changer for Gemma users!
Now you can use function calling with Gemma 2B locally. Trigger webhooks, query databases, fetch weather, or call APIs directly from your tiny-but-mighty local Gemma model. No cloud needed.
Why it's cool:
- Function calling was already in Llama 3 and Mistral; now Gemma joins the party.
- Perfect for building private, lightweight AI agents that do stuff, not just chat.
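To make "do stuff" concrete, here's a minimal sketch of a tools-enabled request body for Ollama's `/api/chat` endpoint. The tool schema follows the OpenAI-style format Ollama accepts; `get_weather` is a hypothetical function name, and the exact fields are an assumption, so check the Ollama API docs for your version:

```python
import json

# Sketch of a tools-enabled request for Ollama's /api/chat endpoint.
# The tool schema below is the OpenAI-style format Ollama accepts;
# "get_weather" is a hypothetical tool used only for illustration.
payload = {
    "model": "gemma:2b",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical local tool
                "description": "Fetch current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "stream": False,
}

# POST this JSON to http://localhost:11434/api/chat on a running Ollama.
print(json.dumps(payload, indent=2))
```

If the model decides a tool is needed, the response message carries the call instead of plain text, and your code runs the function.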
Under the hood: parser fixes + smoother rendering = fewer hiccups, more flow.
Upgrade in one line:
```bash
ollama pull gemma:2b
```
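Once the model is pulled, a tiny agent loop just maps tool calls back to local functions. This sketch assumes the `tool_calls` shape Ollama returns from `/api/chat` (some versions return `arguments` as a JSON string, so both forms are handled); `get_weather` is a hypothetical stub:

```python
import json

# Hypothetical local tool the model can trigger.
def get_weather(city: str) -> str:
    # Stub: a real version would call a weather API or webhook.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch(message: dict) -> list[str]:
    """Run every tool call in an assistant message and collect results."""
    results = []
    for call in message.get("tool_calls", []):
        fn = call["function"]
        args = fn["arguments"]
        if isinstance(args, str):  # some versions send a JSON string
            args = json.loads(args)
        results.append(TOOLS[fn["name"]](**args))
    return results

# Example assistant message, shaped like an Ollama tool-call response.
message = {
    "role": "assistant",
    "tool_calls": [
        {"function": {"name": "get_weather", "arguments": {"city": "Paris"}}}
    ],
}

print(dispatch(message))  # -> ['Sunny in Paris']
```

Feed each result back as a `"tool"`-role message and the model turns it into a natural-language answer.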
Go build something that acts, not just responds.
