Automation
Function Calling with Local LLMs: Tools, Agents, and Structured Output
Function calling with local LLMs using Ollama and llama.cpp. Qwen 2.5 7B matches GPT-4 accuracy for tool selection. Working code and agentic loop patterns.
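The core of the pattern the article covers can be sketched offline: a tool schema in the OpenAI-style format that Ollama's `/api/chat` `tools` parameter accepts, plus a dispatcher that executes the `tool_calls` the model returns. The tool name `get_weather` and its behavior are illustrative, not from the article.

```python
import json

# Tool schema in the format accepted by Ollama's `tools` parameter
# (illustrative tool; a real one would wrap an actual API).
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stub implementation

# Registry mapping tool names the model may call to local callables.
REGISTRY = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    """Execute one entry from the model's message.tool_calls list."""
    fn = tool_call["function"]
    args = fn["arguments"]
    if isinstance(args, str):  # some backends return arguments as a JSON string
        args = json.loads(args)
    return REGISTRY[fn["name"]](**args)

# Shape of a tool call as returned by the chat endpoint:
call = {"function": {"name": "get_weather", "arguments": {"city": "Oslo"}}}
print(dispatch(call))  # → Sunny in Oslo
```

In the agentic loop, the dispatcher's return value is appended back to the conversation as a `role: "tool"` message and the model is called again until it answers without tool calls.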
Structured Output from Local LLMs: JSON, YAML, and Schemas
Ollama's format parameter constrains any local model to emit valid JSON. Grammar constraints in llama.cpp go further, enforcing full schema compliance at the token level. Methods ranked by reliability, with working code examples.
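A minimal sketch of the schema-constrained approach, shown without a live server: the request body for Ollama's `/api/chat` with a JSON Schema passed in the `format` field (supported in recent Ollama versions; the plain `"json"` value is the older, looser mode), and a parser that checks the reply against the schema's required keys. The model name and schema are assumptions for illustration.

```python
import json

# JSON Schema passed via Ollama's `format` field to constrain decoding.
SCHEMA = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

def build_request(prompt: str) -> dict:
    """Request body for POST /api/chat with schema-constrained output."""
    return {
        "model": "qwen2.5:7b",  # illustrative; any local model works
        "messages": [{"role": "user", "content": prompt}],
        "format": SCHEMA,       # constrain the output to this schema
        "stream": False,
    }

def parse_reply(content: str) -> dict:
    """Parse the model's reply and verify the schema's required keys."""
    data = json.loads(content)
    missing = [k for k in SCHEMA["required"] if k not in data]
    if missing:
        raise ValueError(f"missing keys: {missing}")
    return data

req = build_request("Extract the person: Ada Lovelace, 36 years old.")
print(parse_reply('{"name": "Ada Lovelace", "age": 36}'))
```

The same schema can be compiled to a GBNF grammar for llama.cpp, which enforces compliance during sampling rather than checking after the fact.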