Using TensorFlow Lite on Android
March 30, 2018
Posted by Laurence Moroney, Developer Advocate

What is TensorFlow Lite?

TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. It lets you run machine-learned models on mobile devices with low latency, so you can take advantage of them to do classification, regression or anything else you might want without necessarily incurring a round trip to a server.

It’s presently supported on Android and iOS via a C++ API, as well as having a Java wrapper for Android developers. Additionally, on Android devices that support it, the interpreter can also use the Android Neural Networks API for hardware acceleration; otherwise it will default to the CPU for execution. In this article I’ll focus on how you use it in an Android app.

TensorFlow Lite consists of a runtime on which you can run pre-existing models, and a suite of tools that you can use to prepare your models for use on mobile and embedded devices.

It’s not yet designed for training models. Instead, you train a model on a higher-powered machine, and then convert that model to the .tflite format, which is then loaded into the mobile interpreter.
[Diagram: TensorFlow Lite architecture]
TensorFlow Lite is presently in developer preview, so it may not support all operations in all TensorFlow models. Despite this, it does work with common image classification models, including Inception and MobileNets. In this article you’ll look at running a MobileNet model on Android. The app will look at the camera feed and use the trained MobileNet to classify the dominant object it sees.

Using TensorFlow Lite with MobileNets

For example, in this image I pointed the camera at my favorite coffee mug and saw that it was primarily classified as a ‘cup’, and given its shape it’s easy to understand why! It’s also interesting that it has a large, wide handle, which you can see is very teapot-like!
[Image: classifying a coffee mug]
So how does this work? It’s using a MobileNet model, which is designed and optimized for a number of image scenarios on mobile, including object detection, classification, facial attribute detection, and landmark recognition.
[Image: MobileNets]
There are a number of variants of MobileNet, with trained models for TensorFlow Lite hosted at this site. You’ll notice that each one is a zip file containing two files: a labels.txt file that contains the labels the model is trained for, and a .tflite file that contains a version of the model that is ready to be used with TensorFlow Lite. If you want to follow along and build an Android app that uses MobileNets, you’ll need to download a model from this site. You’ll see that in a moment.
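To give you an idea of how the labels file gets used, here’s a minimal sketch of reading it from the assets folder. The loadLabelList name is hypothetical, though the sample app does something very similar, and the file name matches the one in the downloaded zip:

// Reads the labels file from assets into a list, one label per line.
// Uses java.io.BufferedReader, java.io.InputStreamReader and java.util.*.
private List<String> loadLabelList(Activity activity) throws IOException {
  List<String> labels = new ArrayList<>();
  BufferedReader reader = new BufferedReader(
      new InputStreamReader(activity.getAssets().open("labels.txt")));
  String line;
  while ((line = reader.readLine()) != null) {
    labels.add(line);
  }
  reader.close();
  return labels;
}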

You can learn more about TensorFlow Lite in this video:
[Video: Introducing TensorFlow Lite — Coding TensorFlow]

Building an Android App to use TensorFlow Lite

To build an Android App that uses TensorFlow Lite, the first thing you’ll need to do is add the tensorflow-lite libraries to your app. This can be done by adding the following line to your build.gradle file’s dependencies section:
compile 'org.tensorflow:tensorflow-lite:+'
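One thing to watch out for: if you load the model directly from the assets folder (as the sample does), the build must not compress the .tflite file, because a compressed asset can’t be memory-mapped. A minimal sketch of the relevant build.gradle pieces, along the lines of what the demo app does:

android {
    aaptOptions {
        noCompress "tflite"  // keep the model uncompressed so it can be memory-mapped
    }
}

dependencies {
    compile 'org.tensorflow:tensorflow-lite:+'
}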
Once you’ve done this you can import a TensorFlow Lite interpreter. An Interpreter loads a model and lets you run it by providing it with a set of inputs. TensorFlow Lite will then execute the model and write the outputs; it’s really as simple as that.
import org.tensorflow.lite.Interpreter;
To use it, you create an instance of an Interpreter and initialize it with the model loaded as a MappedByteBuffer.
protected Interpreter tflite;
tflite = new Interpreter(loadModelFile(activity));
There’s a helper function for this in the TensorFlow Lite sample on GitHub. Just ensure that getModelPath() returns a string that points to a file in your assets folder, and the model should load.
/** Memory-map the model file in Assets. */
private MappedByteBuffer loadModelFile(Activity activity) throws IOException {
  AssetFileDescriptor fileDescriptor = activity.getAssets().openFd(getModelPath());
  FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
  FileChannel fileChannel = inputStream.getChannel();
  long startOffset = fileDescriptor.getStartOffset();
  long declaredLength = fileDescriptor.getDeclaredLength();
  return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}
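For example, getModelPath() can simply return the file name of the model in your assets folder. The name below is the quantized MobileNet file used later in this post; swap in whichever model you downloaded:

// Returns the asset-folder file name of the model to load.
private String getModelPath() {
  return "mobilenet_quant_v1_224.tflite";
}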
Then, to classify an image, all you need to do is call the run method on the Interpreter, passing it the image data and an output array to receive the label probabilities, and it will do the rest:
tflite.run(imgData, labelProbArray);
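For the quantized MobileNet, imgData is a direct ByteBuffer holding a single 224×224 RGB image at one byte per channel, and labelProbArray receives one byte per label. Here’s a minimal sketch of the allocations; the sizes are assumptions that match the quantized 224×224 model:

// One image, 224x224 pixels, 3 bytes per pixel (8-bit R, G and B values).
ByteBuffer imgData = ByteBuffer.allocateDirect(1 * 224 * 224 * 3);
imgData.order(ByteOrder.nativeOrder());

// One row for the single image in the batch, one byte per label in labels.txt.
byte[][] labelProbArray = new byte[1][labelList.size()];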
Going into detail on how to grab the image from the camera, and how to prepare it for tflite, is beyond the scope of this post, but there’s a full sample on how to do it in the TensorFlow GitHub repo. By stepping through this sample you can see how it grabs frames from the camera, prepares the data for classification, and handles the output by mapping the weighted output priority list from the model to the labels array.
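That said, the heart of the conversion is straightforward. Here’s a sketch of writing a bitmap’s pixels into the input buffer, along the same lines as the sample’s ImageClassifier.java; it assumes the quantized model, which takes raw 8-bit RGB values, and a 224×224 bitmap:

// Unpacks each ARGB pixel into raw 8-bit R, G, B bytes for the model.
private void convertBitmapToByteBuffer(Bitmap bitmap, ByteBuffer imgData) {
  imgData.rewind();
  int[] pixels = new int[224 * 224];
  bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
  for (int pixel : pixels) {
    imgData.put((byte) ((pixel >> 16) & 0xFF)); // red
    imgData.put((byte) ((pixel >> 8) & 0xFF));  // green
    imgData.put((byte) (pixel & 0xFF));         // blue
  }
}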

You can learn more about building an Android app that uses TensorFlow Lite in this video:
[Video: TensorFlow Lite for Android — Coding TensorFlow]
Getting and Running the Android Sample

To run the sample, make sure you have the full TensorFlow source. You can get it using
> git clone https://github.com/tensorflow/tensorflow
Once you’ve done that, you can open the TensorFlow sample project from the /tensorflow/contrib/lite/java/demo folder in Android Studio:

The demo app does not include any models, and it expects the mobilenet_quant_v1_224.tflite file, so be sure to download the model from this site. Unzip it and put the .tflite and labels.txt files in the app’s assets folder.
You should now be able to run the app.

Note that the app potentially supports both Inception and the quantized MobileNet. It defaults to the latter, so you need to make sure the model is present, or the app will fail! The code for capturing data from the camera and converting it into a byte buffer for loading into the model can be found in the ImageClassifier.java file.

The core of the functionality can be found in the classifyFrame() method in the Camera2BasicFragment.java file:
/** Classifies a frame from the preview stream. */
private void classifyFrame() {
  if (classifier == null || getActivity() == null || cameraDevice == null) {
    showToast("Uninitialized Classifier or invalid context.");
    return;
  }
  Bitmap bitmap = textureView.getBitmap(
      classifier.getImageSizeX(), classifier.getImageSizeY());
  String textToShow = classifier.classifyFrame(bitmap);
  bitmap.recycle();
  showToast(textToShow);
}
Here you can see that the bitmap is grabbed from the preview and scaled to the size the classifier expects. The classifyFrame() method then returns text containing the top 3 classes that match the image, along with their weights.
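To give a sense of how that text gets built, here’s a sketch of mapping the quantized output scores back to labels and keeping the three best matches. The variable names are assumptions, but the sample’s ImageClassifier.java uses the same priority-queue idea:

// Keep the three highest-scoring labels in a smallest-first priority queue.
PriorityQueue<Map.Entry<String, Float>> sortedLabels =
    new PriorityQueue<>(3, (a, b) -> Float.compare(a.getValue(), b.getValue()));
for (int i = 0; i < labelList.size(); i++) {
  // Dequantize the 8-bit score into a 0.0 - 1.0 confidence value.
  float confidence = (labelProbArray[0][i] & 0xFF) / 255.0f;
  sortedLabels.add(new AbstractMap.SimpleEntry<>(labelList.get(i), confidence));
  if (sortedLabels.size() > 3) {
    sortedLabels.poll(); // drop the current lowest score
  }
}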

TensorFlow Lite is still evolving, and you can keep track of it at https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/lite
