It looks like it's an issue with the prediction format - I'll see if I can reproduce the issue locally myself to get a better idea of the problem, and report back with what I figure out.
Hi. I'm new to AWS SageMaker, and I built a custom TensorFlow estimator from your TensorFlow iris sample code.
I created my own estimator, like this:
(Without export_outputs, classifier.export_savedmodel cannot export a saved model.)
I exported the trained model using this:
Then I saved my model, created a checkpoint, and sent a query to it:
sample = np.arange(100).astype(np.int64).tolist()
predictor.predict(sample)
I got the following error.
Error in the Jupyter Notebook console:
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (500) from model with message "". See https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEventViewer:group=/aws/sagemaker/Endpoints/sagemaker-tensorflow-py2-cpu-2018-02-01-17-06-45-306 in account 561830960602 for more information.
Error found in the CloudWatch Management Console:
[2018-02-01 17:21:08,384] ERROR in serving: Unsupported request data format: [1].
Valid formats: tensor_pb2.TensorProto, dict<string, tensor_pb2.TensorProto> and predict_pb2.PredictRequest
Traceback (most recent call last):
File "/opt/amazon/lib/python2.7/site-packages/container_support/serving.py", line 161, in _invoke
self.transformer.transform(content, input_content_type, requested_output_content_type)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/serve.py", line 255, in transform
return self.transform_fn(data, content_type, accepts), accepts
File "/opt/amazon/lib/python2.7/site-packages/tf_container/serve.py", line 180, in f
prediction = self.predict_fn(input)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/serve.py", line 195, in predict_fn
return self.proxy_client.request(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 51, in request
return request_fn(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 77, in predict
request = self._create_predict_request(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 94, in _create_predict_request
input_map = self._create_input_map(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 199, in _create_input_map
raise ValueError(msg.format(data))
ValueError: Unsupported request data format: [1].
Valid formats: tensor_pb2.TensorProto, dict<string, tensor_pb2.TensorProto> and predict_pb2.PredictRequest
2018-02-01 17:21:08,384 ERROR - model server - Unsupported request data format: [1].
Valid formats: tensor_pb2.TensorProto, dict<string, tensor_pb2.TensorProto> and predict_pb2.PredictRequest
(The same traceback and error messages repeat in the log.)
10.32.0.2 - - [01/Feb/2018:17:21:08 +0000] "POST /invocations HTTP/1.1" 500 0 "-" "AHC/2.0"
I also tried sending a predict_pb2 object to the model, but that failed as well.
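For reference, the error message lists the request formats the container can deserialize. As a minimal, stdlib-only sketch of the difference (assuming a JSON content type and a serving-signature input named "inputs" — both assumptions, not confirmed against this container version), wrapping the values in a named dict at least yields the dict-of-tensors shape the error message asks for:

```python
import json

# The same sample as above, built without numpy to keep the sketch self-contained.
sample = list(range(100))

# A bare Python list serializes to an unnamed JSON array; per the error above,
# the container cannot map that onto any accepted format (tensor_pb2.TensorProto,
# dict<string, tensor_pb2.TensorProto>, or predict_pb2.PredictRequest).
bare_payload = json.dumps(sample)

# Sketch of a workaround: key the data by the serving signature's input tensor
# name so it deserializes as a dict of named tensors. The name "inputs" here is
# an assumption -- it must match the name declared in the model's
# serving_input_fn / export signature.
named_payload = json.dumps({"inputs": sample})

print(len(json.loads(named_payload)["inputs"]))  # 100
```

Whether the endpoint accepts this depends on the export signature, so the input key above would need to be checked against the exported model's signature_def.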