Posted by Yannick Assogba, Software Engineer, Google Research, Brain team
We are pleased to announce that TensorFlow.js for React Native is now available for general use. We would like to thank everyone who gave us feedback, bug reports, and contributions during the alpha release, and we invite the broader community of React Native developers to try it out!
What is React Native?
JavaScript runs on a wide variety of platforms, including native mobile apps.
Hybrid app frameworks let developers use JavaScript to build native Android and iOS applications from a single codebase. The resulting apps live outside the browser and can leverage native operating system APIs to integrate seamlessly with the underlying platform.
React Native is one of the most popular hybrid native frameworks and brings together the React framework for authoring and native UI components for rendering. React Native widgets are native platform widgets that are controlled from a JavaScript thread. The framework provides a way to author applications and is responsible for communication between the JavaScript thread and native APIs.
tfjs-react-native
React Native does not rely on a “Web View” for rendering, and we did not want to force developers to use a Web View in order to use TensorFlow.js. Because of that, we cannot depend on many of the Web Platform APIs we use in the browser. We thus provide a new platform integration and backend suited to this environment. The tfjs-react-native package provides the following capabilities:
- GPU Accelerated backend: Just like in the browser, TensorFlow.js for React Native uses WebGL to provide GPU-accelerated math operations. We leverage the expo-gl library, which provides a WebGL-compatible graphics context powered by OpenGL ES 3. This allows us to reuse our existing WebGL implementation in this new environment.
- Model Loading and Saving: All TensorFlow.js models that we can execute in the browser can be loaded and executed here as well. We also provide two new IOHandlers: one for loading models that are bundled with the app itself (and thus do not require a remote network call), and one for saving customized models to local storage.
- Training Support: Speaking of customized models, tfjs-react-native has full support for training and fine-tuning models with TensorFlow.js. You can customize models based on user data while keeping that data on the client device.
- Image & Video Handling: Utilities are provided for JPEG decoding (our first external PR!) and video handling. Handling real-time video is particularly tricky to support because of the serialization penalty of moving data from native to JavaScript threads in React Native. We thus provide functionality to do image resizing on the GPU before a tensor is created. This allows developers to reduce the data that needs to be transferred from the camera stream to a model for inference. We even provide a React higher-order component to make this integration feel natural to React Native developers; a sketch follows this list.
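As a rough illustration of that higher-order component, here is a minimal sketch built around the package's camera utilities. It assumes expo-camera is installed; the style and camera texture dimensions below are placeholder values an app would choose for its own devices.
import React from 'react';
import { StyleSheet } from 'react-native';
import { Camera } from 'expo-camera';
import { cameraWithTensors } from '@tensorflow/tfjs-react-native';

// Wrap the expo-camera component so it yields frames as tensors.
const TensorCamera = cameraWithTensors(Camera);

// Placeholder layout and camera texture dimensions; real values depend on the device.
const styles = StyleSheet.create({ camera: { width: 300, height: 400 } });
const textureDims = { width: 1080, height: 1920 };

export class RealtimeDemo extends React.Component {
  handleCameraStream = (images) => {
    const loop = async () => {
      // Each frame arrives already resized on the GPU to the requested shape.
      const imageTensor = images.next().value;
      // ... run a model on imageTensor here ...
      imageTensor.dispose();
      requestAnimationFrame(loop);
    };
    loop();
  };

  render() {
    return (
      <TensorCamera
        // Standard expo-camera props.
        style={styles.camera}
        type={Camera.Constants.Type.front}
        // Size of the tensors produced for each frame.
        resizeWidth={152}
        resizeHeight={200}
        resizeDepth={3}
        cameraTextureWidth={textureDims.width}
        cameraTextureHeight={textureDims.height}
        onReady={this.handleCameraStream}
        autorender={true}
      />
    );
  }
}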
A few examples
Loading one of our hosted models works exactly the same way it does in the browser. Here we run a prediction on an image that is bundled along with the app. The same could be done with images from the user’s photo collection.
import * as mobilenet from '@tensorflow-models/mobilenet';
import { fetch, decodeJpeg } from '@tensorflow/tfjs-react-native';
import { Image } from 'react-native';

// Load mobilenet.
const model = await mobilenet.load();

// Get a reference to the bundled asset and convert it to a tensor.
const image = require('./assets/images/catsmall.jpg');
const imageAssetPath = Image.resolveAssetSource(image);
const response = await fetch(imageAssetPath.uri, {}, { isBinary: true });
const imageData = await response.arrayBuffer();
// decodeJpeg expects the raw JPEG bytes as a Uint8Array.
const imageTensor = decodeJpeg(new Uint8Array(imageData));

const prediction = await model.classify(imageTensor);

// Use prediction in app.
setState({
  prediction,
});
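These snippets assume TensorFlow.js has already initialized its React Native backend. In practice an app typically awaits tf.ready() once at startup, before loading or running any models; a minimal sketch (not from the original examples) looks like this:
import * as tf from '@tensorflow/tfjs';
// Importing the package registers the React Native platform and WebGL backend.
import '@tensorflow/tfjs-react-native';

// Call once at startup (e.g. in componentDidMount or a useEffect hook)
// before any model loading or inference code runs.
async function initTensorFlow() {
  await tf.ready();
}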
In addition to working with locally stored data, a native app can also store models locally alongside the app assets. The code snippet below shows how one could load a custom model that is bundled into the final app build.
import * as tf from '@tensorflow/tfjs';
import { fetch, decodeJpeg, bundleResourceIO } from '@tensorflow/tfjs-react-native';

// Get references to the bundled model assets.
const modelJson = require('../assets/model/burger_not_burger.json');
const modelWeights = require('../assets/model/burger_not_burger_weights.bin');

// Use the bundleResourceIO IOHandler to load the model.
const model = await tf.loadLayersModel(
  bundleResourceIO(modelJson, modelWeights));

// Load an image from the web.
const uri = 'http://example.com/food.jpg';
const response = await fetch(uri, {}, { isBinary: true });
const imageData = await response.arrayBuffer();
// decodeJpeg expects the raw JPEG bytes as a Uint8Array.
const imageTensor = decodeJpeg(new Uint8Array(imageData));

// Classify the image: add a batch dimension, then read the output value.
const output = model.predict(imageTensor.expandDims(0));
const prediction = (await output.data())[0];

// Use prediction in app.
setState({
  prediction,
});
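Tying this back to the training and model-saving capabilities listed earlier, the following sketch continues from the snippet above (reusing its model). The userImages and userLabels tensors are hypothetical placeholders for data prepared on the device, and the local persistence uses the package's asyncStorageIO handler.
import * as tf from '@tensorflow/tfjs';
import { asyncStorageIO } from '@tensorflow/tfjs-react-native';

// Re-compile before training; a model loaded from JSON may not carry an optimizer.
model.compile({ optimizer: 'adam', loss: 'categoricalCrossentropy' });

// Fine-tune on a small batch of user data that never leaves the device.
// `userImages` and `userLabels` are hypothetical tensors whose shapes must
// match the model's input and output.
await model.fit(userImages, userLabels, { epochs: 5, batchSize: 8 });

// Persist the fine-tuned model locally, and load it back later from the same key.
await model.save(asyncStorageIO('burger_not_burger_finetuned'));
const restored = await tf.loadLayersModel(asyncStorageIO('burger_not_burger_finetuned'));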
Other platforms
JavaScript runs in a lot of places and powers a lot of frameworks. How can we enable TensorFlow.js integrations with a wide variety of platforms and frameworks? We added the Platform interface to facilitate porting TensorFlow.js to other platforms. This interface consists of only four functions that need to be defined for an underlying platform. Coupling that with one of our existing backends (JavaScript CPU, WebGL, and more recently WASM) will often be enough to create a new platform integration. We hope that this integration can provide a template for how different communities can achieve this with the platforms of their choice.
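As a rough sketch of what a platform adaptor might look like, the hypothetical class below covers resource fetching, timing, and string encoding/decoding. The exact method names, signatures, and registration call depend on the TensorFlow.js version, so treat this as illustrative rather than the definitive interface.
import * as tf from '@tensorflow/tfjs';

// A hypothetical platform adaptor; names and signatures may differ by version.
class MyPlatform {
  // Fetch a resource such as model JSON or binary weights for this environment.
  fetch(path, requestInits) {
    return fetch(path, requestInits);
  }
  // Timer used for profiling and scheduling.
  now() {
    return Date.now();
  }
  // Encode a string into bytes (used when serializing models).
  encode(text, encoding) {
    return new TextEncoder().encode(text);
  }
  // Decode bytes back into a string.
  decode(bytes, encoding) {
    return new TextDecoder(encoding).decode(bytes);
  }
}

// Register the adaptor so TensorFlow.js routes these calls through it.
tf.env().setPlatform('my-platform', new MyPlatform());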
Feedback
We are excited to see what kinds of apps you build with this and look forward to hearing your feedback, bug reports, and code contributions! We would love to see this platform adaptor grow into a community-driven effort, as we think those who work day-to-day on this platform are in the best position to shape its future!