Topic
#edge-ai
3 articles exploring edge-ai. Expert insights and analysis from our editorial team.
Articles
AI Infrastructure
Google LiteRT: Running LLMs on Your Phone Without the Cloud
Google's LiteRT (formerly TensorFlow Lite) now powers on-device LLM inference across Android, iOS, and desktop, delivering more than 11,000 tokens per second.
AI Infrastructure
Google LiteRT: Running LLMs on Your Phone Without the Cloud
Google's LiteRT (formerly TensorFlow Lite) is now the production backbone for on-device GenAI across Android, Chrome, and Pixel devices. Here's what it means for developers building AI apps that run privately, without the cloud.
AI Tools
Off Grid AI: Running LLMs Completely Offline on Your Phone
A new wave of open-source tools runs large language models, image generation, and vision AI entirely offline on mobile devices, fundamentally changing the privacy, accessibility, and cost dynamics for AI users worldwide.