Llamatik Blog

Welcome to the official blog of Llamatik — your source for updates, insights, and behind-the-scenes looks into the development of our powerful Kotlin Multiplatform library for llama.cpp. Whether you’re building on-device chatbots, integrating LLMs into native apps, or running inference from a Ktor-powered HTTP server, we’re here to share our journey and help you along yours. Stay tuned for release notes, technical deep dives, and community contributions.

Latest Post

Jul 22, 2025

“Introducing Llamatik: Offline LLMs for Kotlin Multiplatform”

🦙 Introducing Llamatik: Offline LLMs for Kotlin Multiplatform

We’re thrilled to introduce Llamatik — an open-source Kotlin Multiplatform library that brings local Large Language Models (LLMs) to Android, iOS, desktop, and beyond using the power of llama.cpp. Llamatik makes it simple and efficient to integrate offline, on-device inference and embeddings into your KMP apps, whether you’re building an AI assistant, a RAG chatbot, or an edge intelligence tool.

Read more

All Posts

2025

  • Jul 22, 2025 - “Introducing Llamatik: Offline LLMs for Kotlin Multiplatform”