April 10, 2020 —
Posted by Sara Robinson, Developer Advocate
Google Cloud’s AI Platform recently added support for deploying TensorFlow 2 models. This lets you scalably serve predictions to end users without having to manage your own infrastructure. In this post, I’ll walk you through the process of deploying two different types of TF2 models to AI Platform and show how to use them to generate predictions with the AI Platform Prediction API.
We’ll deploy the probability_model created at the end of this notebook, since it outputs classifications in a more human-readable format. The output of probability_model is a 10-element softmax array with the probabilities that the given image belongs to each class. Since it’s a softmax array, all of the elements add up to 1. The highest-confidence classification will be the item of clothing corresponding with the index with the highest value.

First, authenticate your Google Cloud account from the notebook:

from google.colab import auth
auth.authenticate_user()
Then run the following, replacing “your-project-id-here” with the ID of the Cloud project you created:

CLOUD_PROJECT = 'your-project-id-here'
BUCKET = 'gs://' + CLOUD_PROJECT + '-tf2-models'
For the next few code snippets, we’ll be using gcloud, the Google Cloud CLI, along with gsutil, the CLI for interacting with Google Cloud Storage. Run the line below to configure gcloud with the project you created:

!gcloud config set project $CLOUD_PROJECT
In the next step, we’ll create a Cloud Storage bucket and print our GCS bucket URL. This will be used to store your saved model. You only need to run this cell once:

!gsutil mb $BUCKET
print(BUCKET)
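If you’d like to confirm the bucket was created before moving on, a quick optional check (not part of the original post) is to list it with gsutil:

# Optional sanity check: prints the bucket URL if it exists
!gsutil ls -b $BUCKET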
Cloud AI Platform expects our model in TensorFlow 2 SavedModel format. To export our model in this format to the bucket we just created, we can run the following command. The model.save() method accepts a GCS bucket URL. We’ll save our model assets into a fashion-mnist subdirectory:

probability_model.save(BUCKET + '/fashion-mnist', save_format='tf')
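As an optional check from the notebook (not shown in the original post), you can also list the exported directory with gsutil; a SavedModel export should contain a saved_model.pb file along with variables/ and assets/ subdirectories:

# Should list saved_model.pb plus the variables/ and assets/ folders
!gsutil ls $BUCKET/fashion-mnist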
To verify that this exported to your storage bucket correctly, navigate to your bucket in the Cloud Console (visit Storage -> Browser). You should see the exported fashion-mnist model assets there.

Next, create the AI Platform model resource with gcloud:

MODEL = 'fashion_mnist'
!gcloud ai-platform models create $MODEL --regions=us-central1
Once this runs, you should see the model in the Models section of the AI Platform Cloud Console.

Next, define a version name and point to the model directory we exported to earlier:

VERSION = 'v1'
MODEL_DIR = BUCKET + '/fashion-mnist'
Finally, run this gcloud command to deploy the model:

!gcloud ai-platform versions create $VERSION \
--model $MODEL \
--origin $MODEL_DIR \
--runtime-version=2.1 \
--framework='tensorflow' \
--python-version=3.7
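If you’d like to track the new version from the notebook rather than the Cloud Console, an optional check (not in the original post) is to describe it with gcloud; this assumes the MODEL and VERSION variables defined above:

# Shows the version's state (e.g. CREATING, then READY once deployment finishes)
!gcloud ai-platform versions describe $VERSION --model $MODEL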
The versions create command may take a minute to complete. When your model version is ready, you should see it listed under your model in the Cloud Console.

To get predictions from the deployed model, we’ll use the Google API Python client. Define the following helper method:

import googleapiclient.discovery

def predict_json(project, model, instances, version=None):
    # Build a client for the AI Platform ('ml') API
    service = googleapiclient.discovery.build('ml', 'v1')
    name = 'projects/{}/models/{}'.format(project, model)
    if version is not None:
        name += '/versions/{}'.format(version)
    # Send the instances to the deployed model and return its predictions
    response = service.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()
    if 'error' in response:
        raise RuntimeError(response['error'])
    return response['predictions']
We’ll start by sending two test images to our model for prediction. To do that, we’ll convert these images from our test set to lists (so the request is valid JSON) and send them to the method we defined above along with our project and model:

test_predictions = predict_json(CLOUD_PROJECT, MODEL, test_images[:2].tolist())
In the response, you should see a JSON object with softmax as the key and a 10-element softmax probability list as the value. We can get the predicted class of the first test image by running:

np.argmax(test_predictions[0]['softmax'])
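To see the predicted label and its confidence together, you can index into the softmax list. The class_names list below is the standard Fashion MNIST label ordering used in the Keras tutorial, included here as an assumption rather than something defined in this post:

import numpy as np

# Standard Fashion MNIST class ordering (assumed; matches the Keras tutorial)
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']

probs = test_predictions[0]['softmax']
predicted = int(np.argmax(probs))
print(class_names[predicted], probs[predicted])  # e.g. Ankle boot with ~0.98 confidence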
Our model predicts class 9 for this image with 98% confidence. If we look at the beginning of the notebook, we’ll see that 9 corresponds with ankle boot. Let’s plot the image to verify our model predicted correctly:

plt.figure()
plt.imshow(test_images[0])
plt.colorbar()
plt.grid(False)
plt.show()

Looks good!
Our second model is trained on structured data and feeds its features to the network through feature_columns. This is the input format our model is expecting, which will come in handy after we deploy it. In addition to sending features as tensors, we can also send them to our deployed model as lists. Note that this model has a mix of numerical and categorical features. One of the categorical features (thal) should be passed in as a string; the rest are either integers or floats.

As before, we’ll export the model to our storage bucket, this time into an hd-prediction subdirectory:

model.save(BUCKET + '/hd-prediction', save_format='tf')
Verify that the model assets were uploaded to your bucket. Since we showed how to deploy models with gcloud in the previous section, here we’ll use the Cloud Console. Head over to the Models section of AI Platform in the Cloud Console, select the New model button, give your model a name like hd_prediction, and select Create. Once your model resource has been created, select New version. Give it a name (like v1), then select the most recent Python version (3.7 at the time of this writing). Under frameworks select TensorFlow with Framework version 2.1 and ML runtime version 2.1. In Model URL, enter the Cloud Storage URL where you uploaded your TF SavedModel earlier. This should be equivalent to BUCKET + '/hd-prediction' if you followed the steps above. Then select Save, and when your model is finished deploying you’ll see a green checkmark next to the version name in your console.
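If you’d rather stay in the notebook, the equivalent gcloud commands mirror the ones from the previous section; this is a sketch using the same model and version names as the console flow above:

# Same deployment, done with gcloud instead of the Cloud Console
!gcloud ai-platform models create hd_prediction --regions=us-central1
!gcloud ai-platform versions create v1 \
  --model hd_prediction \
  --origin $BUCKET/hd-prediction \
  --runtime-version=2.1 \
  --framework='tensorflow' \
  --python-version=3.7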
To prepare test instances for prediction, we’ll remove the label column from our test DataFrame and convert the first two examples into a list of feature dicts:

# First remove the label column
test.pop('target')

caip_instances = []
test_vals = test[:2].values

for i in test_vals:
    example_dict = {k: [v] for k, v in zip(test.columns, i)}
    caip_instances.append(example_dict)
Here’s what the resulting array of caip_instances looks like:

[{'age': [60],
'ca': [2],
'chol': [293],
'cp': [4],
'exang': [0],
'fbs': [0],
'oldpeak': [1.2],
'restecg': [2],
'sex': [1],
'slope': [2],
'thal': ['reversible'],
'thalach': [170],
'trestbps': [140]},
...]
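The same structure could also be built directly with pandas; a minimal sketch, assuming test is still the feature DataFrame from above:

# Equivalent construction using DataFrame.to_dict
caip_instances = [
    {k: [v] for k, v in row.items()}
    for row in test[:2].to_dict(orient='records')
]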
We can now call the same predict_json method we defined above, passing it our new model and test instances:

test_predictions = predict_json(CLOUD_PROJECT, 'hd_prediction', caip_instances)
Your response will look something like the following (exact numbers will vary):

[{'output_1': [-1.4717596769332886]}, {'output_1': [-0.2714746594429016]}]
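The negative values above suggest the model is returning raw logits from its final Dense layer. If that’s the case for your model (an assumption, since the post doesn’t say; adjust if your last layer already applies an activation), you could squash the outputs into probabilities with a sigmoid:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

for pred in test_predictions:
    logit = pred['output_1'][0]
    # Probability of the positive class, assuming the model outputs raw logits
    print(sigmoid(logit))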
Note that if you’d like to change the name of the output tensor (currently output_1), you can add a name parameter when you define your Keras model in the tutorial above:

layers.Dense(1, name='prediction_probability')
In addition to making predictions with the API, you can also make prediction requests with gcloud. All of the prediction requests we’ve made so far have used online prediction, but AI Platform also supports batch prediction for large offline jobs. To create a batch prediction job, you can make a JSON file of your test instances and kick off the job with gcloud. You can read more about batch prediction here.
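As a rough sketch of what that might look like (the file layout and flag values below are assumptions based on my reading of the batch prediction docs, not commands from this post): write your instances as newline-delimited JSON, copy the file to Cloud Storage, and submit the job with gcloud.

import json

# Write one JSON instance per line (the format batch prediction expects for text input)
with open('instances.json', 'w') as f:
    for instance in caip_instances:
        f.write(json.dumps(instance) + '\n')

!gsutil cp instances.json $BUCKET/batch/instances.json

# Assumed flag values; check `gcloud ai-platform jobs submit prediction --help` before running
!gcloud ai-platform jobs submit prediction hd_batch_predict_1 \
  --model hd_prediction \
  --version v1 \
  --input-paths $BUCKET/batch/instances.json \
  --output-path $BUCKET/batch/output \
  --region us-central1 \
  --data-format text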