April 10, 2020 —
Posted by Sara Robinson, Developer Advocate
Google Cloud’s AI Platform recently added support for deploying TensorFlow 2 models. This lets you scalably serve predictions to end users without having to manage your own infrastructure. In this post, I’ll walk you through the process of deploying two different types of TF2 models to AI Platform and using them to generate predictions with the AI Platform Prediction API.

We’ll deploy the probability_model created at the end of this notebook, since it outputs classifications in a more human-readable format. The output of probability_model is a 10-element softmax array with the probabilities that the given image belongs to each class. Since it’s a softmax array, all of the elements add up to 1. The highest-confidence classification will be the item of clothing corresponding with the index with the highest value.

First, authenticate to your Google Cloud account from the notebook, then set your project and create a Cloud Storage bucket for the model assets:

from google.colab import auth
auth.authenticate_user()

CLOUD_PROJECT = 'your-project-id-here'
BUCKET = 'gs://' + CLOUD_PROJECT + '-tf2-models'

!gcloud config set project $CLOUD_PROJECT
!gsutil mb $BUCKET
print(BUCKET)

In TensorFlow 2, the model.save() method accepts a GCS bucket URL. We’ll save our model assets into a fashion-mnist subdirectory:

probability_model.save(BUCKET + '/fashion-mnist', save_format='tf')
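As an optional sanity check (not part of the original steps), you can list the bucket path to confirm the SavedModel assets were written; you should see a saved_model.pb file along with variables and assets directories:

!gsutil ls $BUCKET/fashion-mnist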
Next, create the AI Platform model resource and a version that points at the SavedModel we just exported:

MODEL = 'fashion_mnist'
!gcloud ai-platform models create $MODEL --regions=us-central1
VERSION = 'v1'
MODEL_DIR = BUCKET + '/fashion-mnist'

!gcloud ai-platform versions create $VERSION \
  --model $MODEL \
  --origin $MODEL_DIR \
  --runtime-version=2.1 \
  --framework='tensorflow' \
  --python-version=3.7
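If you'd like to confirm the version finished deploying before sending traffic to it (an optional check, not in the original steps), gcloud can describe it; the state field should read READY once deployment completes:

!gcloud ai-platform versions describe $VERSION --model $MODEL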
To generate predictions from the deployed model, we’ll call the AI Platform Prediction API using the Google API Python client:

import googleapiclient.discovery
def predict_json(project, model, instances, version=None):
    service = googleapiclient.discovery.build('ml', 'v1')
    name = 'projects/{}/models/{}'.format(project, model)
    if version is not None:
        name += '/versions/{}'.format(version)
    response = service.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()
    if 'error' in response:
        raise RuntimeError(response['error'])
    return response['predictions']

Let’s try it out by sending the first two test images to our deployed model:

test_predictions = predict_json(CLOUD_PROJECT, MODEL, test_images[:2].tolist())

Each prediction in the response is a dict with the key softmax and the 10-element softmax probability list as the value. We can get the predicted class of the first test image by running:

np.argmax(test_predictions[0]['softmax'])

To see whether that prediction makes sense, we can plot the image:

plt.figure()
plt.imshow(test_images[0])
plt.colorbar()
plt.grid(False)
plt.show()
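To turn the predicted index into a human-readable label, we can look it up in the class_names list used in the Fashion MNIST tutorial (reproduced below; this sketch assumes the test_predictions returned by the call above):

# Class labels from the Fashion MNIST tutorial, in index order
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']

predicted_index = np.argmax(test_predictions[0]['softmax'])
print(class_names[predicted_index])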
Our second model predicts the presence of heart disease from structured patient data, and it converts its inputs to tensors with feature_columns, so each example is passed as a dict keyed by feature name. This is the input format our model is expecting, which will come in handy after we deploy it. In addition to sending features as tensors, we can also send them to our deployed model as lists. Note that this model has a mix of numerical and categorical features. One of the categorical features (thal) should be passed in as a string; the rest are either integers or floats.

As before, we’ll save the model assets to our bucket, this time in an hd-prediction subdirectory:

model.save(BUCKET + '/hd-prediction', save_format='tf')
Head over to the models section of your Cloud console. Then select the New model button, give your model a name like hd_prediction, and select Create. Once your model resource has been created, select New version. Give it a name (like v1), then select the most recent Python version (3.7 at the time of this writing). Under frameworks select TensorFlow with Framework version 2.1 and ML runtime version 2.1. In Model URL, enter the Cloud Storage URL where you uploaded your TF SavedModel earlier. This should be equivalent to BUCKET + '/hd-prediction' if you followed the steps above. Then select Save, and when your model is finished deploying you’ll see a green checkmark next to the version name in your console.
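If you’d rather not leave the notebook, the same model and version can be created with gcloud instead of the console. This sketch mirrors the Fashion MNIST commands above and assumes the hd_prediction and v1 names chosen in the console steps:

!gcloud ai-platform models create hd_prediction --regions=us-central1
!gcloud ai-platform versions create v1 \
  --model hd_prediction \
  --origin $BUCKET/hd-prediction \
  --runtime-version=2.1 \
  --framework='tensorflow' \
  --python-version=3.7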
Next, we’ll prepare a couple of test examples in the format the deployed model expects:

# First remove the label column; pop() drops 'target' from the DataFrame in place
test_labels = test.pop('target')
caip_instances = []
test_vals = test[:2].values
for i in test_vals:
    example_dict = {k: [v] for k,v in zip(test.columns, i)}
    caip_instances.append(example_dict)

Here’s what caip_instances looks like:

[{'age': [60],
  'ca': [2],
  'chol': [293],
  'cp': [4],
  'exang': [0],
  'fbs': [0],
  'oldpeak': [1.2],
  'restecg': [2],
  'sex': [1],
  'slope': [2],
  'thal': ['reversible'],
  'thalach': [170],
  'trestbps': [140]},
...]

Then we’ll call the predict_json method we defined above, passing it our new model and test instances:

test_predictions = predict_json(CLOUD_PROJECT, 'hd_prediction', caip_instances)

The response looks like this:

[{'output_1': [-1.4717596769332886]}, {'output_1': [-0.2714746594429016]}]

If you’d like to change the name of the output tensor (output_1 above), you can add a name parameter when you define your Keras model in the tutorial above:

layers.Dense(1, name='prediction_probability')
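Note that the output_1 values above are negative, so they are raw model outputs rather than probabilities. Assuming the model’s final layer is a single linear unit trained with from_logits=True (which the negative values suggest), here is a minimal sketch for converting them to probabilities:

import tensorflow as tf

# Squash the raw logits returned in 'output_1' into [0, 1] probabilities
logits = [p['output_1'][0] for p in test_predictions]
probabilities = tf.sigmoid(logits)
print(probabilities.numpy())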