TensorFlow Lite
TensorFlow Lite is an open source deep learning framework for on-device inference. Articles in this section cover deploying machine learning models on mobile and IoT devices.
Community · TensorFlow Lite
How-to Get Started with Machine Learning on Arduino

A guest post by Sandeep Mistry & Dominic Pajak of the Arduino team

Arduino is on a mission to make Machine Learning simple enough for anyone to use. We’ve been working with the TensorFlow Lite team over the past few months and are excited to show you what we’ve been up to together: bringing TensorFlow Lite Micro to the Arduino Nano 33 BLE Sense. In this article, we’ll show you how to install a…
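A typical TensorFlow Lite Micro workflow ends with converting a trained model into a .tflite flatbuffer small enough to embed on the board. A minimal sketch of that conversion step (the gesture_model name and file paths are assumptions, not taken from the article):

```python
import tensorflow as tf

# `gesture_model` stands in for a small trained Keras model; the name is illustrative.
converter = tf.lite.TFLiteConverter.from_keras_model(gesture_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # shrink weights for the microcontroller
tflite_model = converter.convert()

with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)

# On the Arduino side the flatbuffer is typically embedded as a C array,
# e.g. xxd -i gesture_model.tflite > model.h
```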

Community · TensorFlow Lite
How vFlat used the TFLite GPU delegate for real time inference to scan books

A guest post by Kunwoo Park, Moogung Kim, Eunsung Han

Community · TensorFlow Lite
Track human poses in real-time on Android with TensorFlow Lite

Posted by Eileen Mao and Tanjin Prity, Engineering Practicum Interns at Google, Summer 2019

We are excited to release a TensorFlow Lite sample application for human pose estimation on Android using the PoseNet model. PoseNet is a vision model that estimates the pose of a person in an image or video by detecting the positions of key body parts. As an example, the model can estimate the position of…
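The sample app runs the model through the TFLite Android API, but the same PoseNet .tflite file can be exercised with the Python interpreter. A minimal sketch, assuming the model file is available locally (the path and the output ordering are assumptions; check get_output_details() for your model):

```python
import numpy as np
import tensorflow as tf

# "posenet.tflite" is a placeholder; use the model bundled with the sample app.
interpreter = tf.lite.Interpreter(model_path="posenet.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
outputs = interpreter.get_output_details()

# Stand-in for a camera frame, shaped and typed to match the model's input.
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()

# PoseNet produces keypoint heatmaps plus offset vectors; the highest-scoring
# heatmap location for each keypoint, refined by its offset, gives the body-part position.
heatmaps = interpreter.get_tensor(outputs[0]["index"])
offsets = interpreter.get_tensor(outputs[1]["index"])
print(heatmaps.shape, offsets.shape)
```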

TensorFlow Lite
TensorFlow Model Optimization Toolkit — float16 quantization halves model size

Posted by the TensorFlow team

We are very excited to add post-training float16 quantization as part of the Model Optimization Toolkit. The toolkit is a suite of tools that includes hybrid quantization, full integer quantization, and pruning. Check out what else is on the roadmap.
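The float16 path is driven entirely through the TFLite converter. A minimal sketch, assuming a trained SavedModel directory (the paths are illustrative):

```python
import tensorflow as tf

# "saved_model_dir" is a placeholder for the path to a trained SavedModel.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]  # store weights as float16
tflite_fp16_model = converter.convert()

with open("model_fp16.tflite", "wb") as f:
    f.write(tflite_fp16_model)  # roughly half the size of the float32 model
```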

Community · TensorFlow Lite
SmileAR: iQIYI’s Mobile AR solution based on TensorFlow Lite

A guest post by the SmileAR Engineering Team at iQIYI

Introduction: SmileAR is a TensorFlow Lite-based mobile AR solution developed by iQIYI. It has been deployed widely in iQIYI’s many applications, including the iQIYI flagship video app (100+ million DAU), Qibabu (popular app for children), Gingerbread (short video app) and more.

TensorFlow Lite
Build AI that works offline with Coral Dev Board, Edge TPU, and TensorFlow Lite

Posted by Daniel Situnayake (@dansitu), Developer Advocate for TensorFlow Lite.

When you think about the hardware that powers machine learning, you might picture endless rows of power-hungry processors crunching terabytes of data in a distant server farm, or hefty desktop computers stuffed with banks of GPUs.

Community · TensorFlow Lite
Air Cognizer: Predicting Air Quality with TensorFlow Lite

A guest article by Prerna Khanna, Tanmay Srivastava and Kanishk Jeet

TensorFlow Lite
TensorFlow Lite Now Faster with Mobile GPUs

Posted by the TensorFlow team

Running inference on compute-heavy machine learning models on mobile devices is resource demanding due to the devices’ limited processing and power. While converting to a fixed-point model is one avenue to acceleration, our users have asked us for GPU support as an option to speed up the inference of the original floating point models without the extra complexity and…
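The GPU delegate is normally enabled from the Android (Java) or iOS APIs. As a rough Python-side sketch for a build where a GPU delegate shared library is available (the library name and model path are assumptions and vary by platform):

```python
import tensorflow as tf

# The delegate library name/path varies by platform and build; this one is illustrative.
gpu_delegate = tf.lite.experimental.load_delegate("libtensorflowlite_gpu_delegate.so")

interpreter = tf.lite.Interpreter(
    model_path="model.tflite",             # placeholder float32 model
    experimental_delegates=[gpu_delegate],
)
interpreter.allocate_tensors()
# Supported float ops now run on the GPU; anything unsupported falls back to the CPU.
```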

Community · TensorFlow Lite
Training and serving a realtime mobile object detector in 30 minutes with Cloud TPUs

Posted by Sara Robinson, Aakanksha Chowdhery, and Jonathan Huang

TensorFlow Lite
Using TensorFlow Lite on Android

Posted by Laurence Moroney, Developer Advocate

What is TensorFlow Lite?
TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. It lets you run machine-learned models on mobile devices with low latency, so you can take advantage of them to do classification, regression or anything else you might want without necessarily incurring a round trip to a server.
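As a small illustration of that on-device flow (the model file name below is a placeholder, not from the article), the interpreter loads a .tflite classifier and runs it locally, so the only latency is the invoke call itself:

```python
import time
import numpy as np
import tensorflow as tf

# "mobilenet_v1.tflite" is a placeholder for any image-classification model.
interpreter = tf.lite.Interpreter(model_path="mobilenet_v1.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

image = np.zeros(inp["shape"], dtype=inp["dtype"])   # stand-in for a camera frame
interpreter.set_tensor(inp["index"], image)

start = time.perf_counter()
interpreter.invoke()                                  # runs on-device, no server round trip
latency_ms = (time.perf_counter() - start) * 1000
scores = interpreter.get_tensor(out["index"])
print(f"top class index: {scores.argmax()}  latency: {latency_ms:.1f} ms")
```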