Access Model in sagemaker from AWSLambda #189


Closed
Sandy26 opened this issue May 17, 2018 · 2 comments
Sandy26 commented May 17, 2018

Hi,

I am new to Amazon SageMaker. I am trying to build and deploy a model, then invoke it from AWS Lambda.

System Information

  • Framework (e.g. TensorFlow) / Algorithm (e.g. KMeans): TensorFlow estimator with DNNClassifier
  • Python Version: 3.6
  • Python SDK Version: using the SageMaker console

Describe the problem

I followed the steps in the sample code for iris_dnn_classifier that comes with the Jupyter notebook instance in SageMaker. Here is the code for reference.
My iris_dnn_classifier.py file:
import numpy as np
import os
import tensorflow as tf

INPUT_TENSOR_NAME = 'inputs'  # feature name used by the sample; matches the error log below

def estimator(model_path, hyperparameters):
    feature_columns = [tf.feature_column.numeric_column(INPUT_TENSOR_NAME, shape=[4])]
    return tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                      hidden_units=[10, 20, 10],
                                      n_classes=3,
                                      model_dir=model_path)

def train_input_fn(training_dir, hyperparameters):
    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=os.path.join(training_dir, 'iris_training.csv'),
        target_dtype=np.int,
        features_dtype=np.float32)

    return tf.estimator.inputs.numpy_input_fn(
        x={INPUT_TENSOR_NAME: np.array(training_set.data)},
        y=np.array(training_set.target),
        num_epochs=None,
        shuffle=True)()

def serving_input_fn(hyperparameters):
    feature_spec = {INPUT_TENSOR_NAME: tf.FixedLenFeature(dtype=tf.float32, shape=[4])}
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()


Then I created the estimator and deployed the model as follows:
from sagemaker.tensorflow import TensorFlow

iris_estimator = TensorFlow(entry_point='iris_dnn_classifier.py',
                            role=role,
                            output_path=model_artifacts_location,
                            code_location=custom_code_upload_location,
                            train_instance_count=1,
                            train_instance_type='ml.c4.xlarge',
                            training_steps=1000,
                            evaluation_steps=100)

import boto3
region = boto3.Session().region_name
train_data_location = 's3://sagemaker-sample-data-{}/tensorflow/iris'.format(region)

iris_estimator.fit(train_data_location)

irispredictor = iris_estimator.deploy(initial_instance_count=1,
                                      instance_type='ml.m4.xlarge')

All this is done in the SageMaker Jupyter notebook instance. I also checked whether the model works by calling
irispredictor.predict([6.4, 3.2, 4.5, 1.5])
and it works fine.

Then I separately created an endpoint called "irispredict" to use in AWS Lambda. Now I am trying to call it from the AWS Lambda console:
import boto3
import json
sagemaker = boto3.client('runtime.sagemaker')

def lambda_handler(event, context):
    data = {'key': '[6.4, 3.2, 4.5, 1.5]'}
    result = sagemaker.invoke_endpoint(EndpointName='irispredict', Body=json.dumps(data))
    print(result)

I get the error:
"errorMessage": "An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (500) from model

Minimal repro / logs

When I looked at the CloudWatch logs, I see the following:
[2018-05-17 10:56:10,063] ERROR in serving: AbortionError(code=StatusCode.INVALID_ARGUMENT, details="Name: , Feature: inputs (data type: float) is required but could not be found.
#11 [[Node: ParseExample/ParseExample = ParseExample[Ndense=1, Nsparse=0, Tdense=[DT_FLOAT], dense_shapes=[[4]], sparse_types=[], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_example_tensor_0_0, ParseExample/ParseExample/names, ParseExample/ParseExample/dense_keys_0, ParseExample/Const)]]")

So I am guessing the problem is in the input "data" variable?
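(For reference, the dict payload above and a bare list serialize very differently. The dict wraps the numbers inside a string under a key the model does not know, while a plain list becomes a JSON array of floats. This is just standard-library `json.dumps`, independent of SageMaker:)

```python
import json

# What the Lambda above sends: a dict whose value is a *string*, not numbers
dict_payload = json.dumps({'key': '[6.4, 3.2, 4.5, 1.5]'})
print(dict_payload)  # {"key": "[6.4, 3.2, 4.5, 1.5]"}

# A bare list of floats serializes to a plain JSON array
list_payload = json.dumps([6.4, 3.2, 4.5, 1.5])
print(list_payload)  # [6.4, 3.2, 4.5, 1.5]
```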

Any suggestions/pointers are greatly appreciated.

Thank you,
Sandy

andremoeller commented May 17, 2018

Hi @Sandy26 ,

I believe you're right, that data isn't formatted as the TensorFlow container expects. Passing in just the list [6.4, 3.2, 4.5, 1.5] worked for me on an endpoint hosting the iris model:

import json
import boto3
sagemaker = boto3.client('runtime.sagemaker')
data = [6.4, 3.2, 4.5, 1.5]
response = sagemaker.invoke_endpoint(EndpointName="my-iris-endpoint", Body=json.dumps(data))
print(response['Body'].read())

{
  "result": {
    "classifications": [
      {
        "classes": [
          {
            "score": 3.8717451388947666e-05, 
            "label": "0"
          }, 
          {
            "score": 0.9991493225097656, 
            "label": "1"
          }, 
          {
            "score": 0.0008119273115880787, 
            "label": "2"
          }
        ]
      }
    ]
  }
}
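(To turn the raw `Body` bytes above into a Python dict, a minimal sketch — it parses a shortened copy of the JSON shown rather than calling a live endpoint, so the literal below is illustrative only; in a real handler the bytes would come from `response['Body'].read()`:)

```python
import json

# Stand-in for response['Body'].read() — a trimmed copy of the JSON above
body = b'{"result": {"classifications": [{"classes": [' \
       b'{"score": 3.87e-05, "label": "0"}, ' \
       b'{"score": 0.99915, "label": "1"}, ' \
       b'{"score": 0.00081, "label": "2"}]}]}}'

parsed = json.loads(body)
classes = parsed['result']['classifications'][0]['classes']
best = max(classes, key=lambda c: c['score'])  # pick the highest-scoring class
print(best['label'])  # 1
```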

Let me know if you still have any questions, and feel free to reopen this. Thanks!


Sandy26 commented May 18, 2018

Thank you @andremoeller . That worked! I now have trouble reading the response['Body'] into JSON :) but I have opened a separate thread for that! Thank you once again!
