On-device, real-time multimodal AI. Have natural voice and vision conversations with an AI that runs entirely on your machine. Powered by Gemma 4 E2B and Kokoro.
Run local LLMs like Gemma, Qwen, and LLaMA on Android for offline, private, real-time chat and question answering with LiteRT and ONNX Runtime.
Swift package for running LiteRT-LM models on iOS. Wraps Google's C API in a clean, async/await Swift interface.
Run LLM inference in an Android app with llama.cpp, ExecuTorch, LiteRT, ONNX, and more.
Edge Agent Lab is an Android testing platform for evaluating small language model (SLM) agents directly on mobile devices.
LiteRT-LM model inference support for Unity Android and Meta Quest apps.
Agent Skills for on-device ML: FunctionGemma fine-tuning, LiteRT-LM export, and more. Compatible with Claude Code, Codex, and Gemini CLI.
React Native module for on-device LLM inference using LiteRT/MediaPipe. Supports Gemma 3n and other compatible models, with an AI SDK-compatible API.
Offline Android object detection app using LiteRT (Google AI Edge). Pick any photo from gallery → instant bounding boxes with labels & confidence scores. Built with Jetpack Compose + Clean Architecture.
Android chat app that runs AI models directly on your phone - no internet, no cloud, nothing leaving your device, ensuring total privacy.
Rust bindings for LiteRT, the successor to TensorFlow Lite. LiteRT is Google's on-device framework for high-performance ML & GenAI deployment on edge platforms via efficient conversion, runtime, and optimization.
Offline Android speech translation app
Autonomous agent with a local LLM, inspired by OpenClaw, using LiteRT-LM.
AI Gateway/Hosting for LiteRT-LM on edge devices