
inference.py file not called upon when deploying Tensorflow Model #1929


Closed

ana-pcosta opened this issue Sep 29, 2020 · 1 comment

@ana-pcosta

Describe the bug
I have trained a custom TensorFlow model to perform semantic segmentation on images. I would like to run inference by passing a JPEG file to the endpoint using base64 encoding. The model's signature def is as described in https://github.com/aws/sagemaker-tensorflow-serving-container:

signature_def['serving_default']:
The given SavedModel SignatureDef contains the following input(s):
inputs['image_bytes'] tensor_info:
dtype: DT_STRING
shape: (-1)
name: input_tensor:0

And inference.py is saved in the model.tar.gz file with this structure:
model
|--1
|  |--variables
|  |--saved_model.pb
code
|--inference.py
|--requirements.txt
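
For context, inference.py follows the input_handler/output_handler interface documented in that repo. A simplified sketch of what mine does (illustrative, not the exact file):

# inference.py -- simplified sketch of the pre/post-processing handlers
# from aws/sagemaker-tensorflow-serving-container; the logic shown here
# is illustrative, not my exact code.
import base64
import json

def input_handler(data, context):
    # Pre-process the request before it reaches TensorFlow Serving.
    if context.request_content_type == 'application/x-image':
        # Wrap the raw image bytes in a base64 "b64" field, which the
        # TFS REST API decodes into the DT_STRING image_bytes input.
        payload = data.read()
        encoded = base64.b64encode(payload).decode('utf-8')
        return json.dumps({'instances': [{'b64': encoded}]})
    raise ValueError('Unsupported content type: {}'.format(context.request_content_type))

def output_handler(data, context):
    # Post-process the TensorFlow Serving response before returning it.
    if data.status_code != 200:
        raise ValueError(data.content.decode('utf-8'))
    return data.content, context.accept_header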

But whenever I set predictor.content_type = 'application/x-image' and call predictor.predict(img), where img is a bytearray of a JPEG file, this error pops up:

TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/site-packages/sagemaker/tensorflow/serving.py in predict(self, data, initial_args)
118 args["CustomAttributes"] = self._model_attributes
119
--> 120 return super(Predictor, self).predict(data, args)
121
122

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/site-packages/sagemaker/predictor.py in predict(self, data, initial_args, target_model, target_variant)
110 """
111
--> 112 request_args = self._create_request_args(data, initial_args, target_model, target_variant)
113 response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
114 return self._handle_response(response)

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/site-packages/sagemaker/predictor.py in _create_request_args(self, data, initial_args, target_model, target_variant)
153
154 if self.serializer is not None:
--> 155 data = self.serializer(data)
156
157 args["Body"] = data

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/site-packages/sagemaker/predictor.py in __call__(self, data)
545 return _json_serialize_from_buffer(data)
546
--> 547 return json.dumps(_ndarray_to_list(data))
548
549

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
229 cls is None and indent is None and separators is None and
230 default is None and not sort_keys and not kw):
--> 231 return _default_encoder.encode(obj)
232 if cls is None:
233 cls = JSONEncoder

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/json/encoder.py in encode(self, o)
197 # exceptions aren't as detailed. The list call should be roughly
198 # equivalent to the PySequence_Fast that ''.join() would do.
--> 199 chunks = self.iterencode(o, _one_shot=True)
200 if not isinstance(chunks, (list, tuple)):
201 chunks = list(chunks)

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/json/encoder.py in iterencode(self, o, _one_shot)
255 self.key_separator, self.item_separator, self.sort_keys,
256 self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)
258
259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

~/anaconda3/envs/tensorflow2_p36/lib/python3.6/json/encoder.py in default(self, o)
178 """
179 raise TypeError("Object of type '%s' is not JSON serializable" %
--> 180                 o.__class__.__name__)
181
182 def encode(self, o):

TypeError: Object of type 'bytearray' is not JSON serializable
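
For what it's worth, this reproduces outside SageMaker too; the default serializer ends up calling json.dumps on the payload, and json cannot encode raw bytes:

import json
json.dumps(bytearray(b"\xff\xd8\xff"))
# TypeError: Object of type 'bytearray' is not JSON serializable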

System information

  • TensorFlow 2.0.0
  • Python 3.6
  • GPU
  • TensorFlow 2 Docker image
@ajaykarpur (Contributor)

Hi @ana-pcosta, please refer to the documentation on serializers: https://sagemaker.readthedocs.io/en/stable/api/inference/serializers.html
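
For example, with the v2 SDK something along these lines should send the raw bytes through untouched (a minimal sketch; predictor is the object returned by deploy, and the content type should match whatever your input_handler expects):

from sagemaker.serializers import IdentitySerializer

# Bypass the default JSON serializer, which cannot encode a bytearray,
# by passing the payload through unchanged as raw JPEG bytes.
predictor.serializer = IdentitySerializer(content_type='application/x-image')

with open('test.jpg', 'rb') as f:
    result = predictor.predict(f.read())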

Please feel free to re-open this issue if you have further questions.
