November 1st, 2026
Via del Pilastro, 2
40127 Bologna
Italy
Ultimate Local AI
Running inference against LLMs on your own local hardware means no API costs, no data leaving your machine, and no vendor lock-in. Doing this in Go, however, has traditionally been painful.
In this workshop, Bill will introduce Kronk, a Go SDK that lets you embed local model inference directly into your Go applications with full GPU acceleration and no CGO required. Whether you are building chat, vision, audio, embedding, or tool-calling applications, Kronk gives you the same power as a model server without needing one.
To prove it, Bill built a model server using the Kronk SDK, complete with caching, batch processing, and agent support. You'll see live demos, from writing your first chat app to driving a coding agent with a local model.
The workshop continues on November 2nd and 3rd, 2026, at the same Bologna venue.

