JSON examples for SageMaker / TF serving #18
Hi @nkconnor, thanks for your report. I've confirmed that it is a bug. I'm investigating it and will give you a better update tomorrow.
The container source code is not open sourced yet; we are planning on releasing it very soon. I have a temporary solution to unblock you until this issue is fixed: create a `ClassificationRequest` proto yourself and send it through a `RealTimePredictor` with the TF serializers:

```python
import tensorflow as tf

from sagemaker.predictor import RealTimePredictor
from sagemaker.tensorflow.predictor import tf_serializer, tf_deserializer
from tensorflow.python.saved_model.signature_constants import DEFAULT_SERVING_SIGNATURE_DEF_KEY
from tensorflow_serving.apis import classification_pb2


def _create_feature(v):
    # Map each Python type to the matching tf.train.Feature kind.
    if type(v) == int:
        return tf.train.Feature(int64_list=tf.train.Int64List(value=[v]))
    if type(v) == str:
        # BytesList expects bytes, so encode strings under Python 3.
        return tf.train.Feature(bytes_list=tf.train.BytesList(value=[v.encode('utf-8')]))
    if type(v) == float:
        return tf.train.Feature(float_list=tf.train.FloatList(value=[v]))
    raise ValueError('invalid type')


endpoint = 'my-endpoint'
predictor = RealTimePredictor(endpoint=endpoint,
                              deserializer=tf_deserializer,
                              serializer=tf_serializer,
                              content_type='application/octet-stream')

data = {'age': 39., 'workclass': 'Private', 'fnlwgt': 77516, 'education': 'Bachelors',
        'education_num': 13., 'marital_status': 'Never-married', 'occupation': 'Adm-clerical',
        'relationship': 'Husband', 'race': '', 'gender': '', 'capital_gain': 2174., 'capital_loss': 0.,
        'hours_per_week': 40., 'native_country': '', 'income_bracket': '<=50K'}

features = {k: _create_feature(v) for k, v in data.items()}

request = classification_pb2.ClassificationRequest()
request.model_spec.name = "generic_model"
request.model_spec.signature_name = DEFAULT_SERVING_SIGNATURE_DEF_KEY

example = tf.train.Example(features=tf.train.Features(feature=features))
request.input.example_list.examples.extend([example])

print(predictor.predict(request))
```
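For readers without TensorFlow installed, the type dispatch inside `_create_feature` can be sketched on its own. This is a hypothetical, dependency-free mirror of that helper (the function name is mine, not from the SDK); it returns the name of the `tf.train.Feature` field the real helper would populate:

```python
def feature_kind(v):
    # Mirror of _create_feature's dispatch: each Python type selects
    # exactly one tf.train.Feature field.
    if type(v) == int:
        return 'int64_list'
    if type(v) == str:
        return 'bytes_list'  # BytesList wants bytes; encode str values first
    if type(v) == float:
        return 'float_list'
    raise ValueError('invalid type: %r' % type(v))

data = {'age': 39., 'workclass': 'Private', 'fnlwgt': 77516}
print({k: feature_kind(v) for k, v in data.items()})
# -> {'age': 'float_list', 'workclass': 'bytes_list', 'fnlwgt': 'int64_list'}
```

Note that `age` is written as `39.` (a float) on purpose; passing the bare int `39` would select `int64_list` instead, which is why the mixed-type `data` dict in the workaround uses trailing dots on its numeric features.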
@mvsusp thanks for jumping on this and the workaround example using protos. On a related note: is there an example for the same use case but with a ...? From what I can tell, it does not use a header column. Wondering if perhaps this functionality is not implemented yet on the proxy side. LMK if you would rather me open a new issue.
@nkconnor — this is now fixed in #62, which was released as part of 1.1.0 today.
What format can we use to send `predict` requests using the SageMaker SDK for input functions like the above? The JSON serializer only handles arrays, so it seems like

```python
tf_estimator.predict({"city": "Paris", "gender": "m", "age": 22})
```

is out. I tried variations of array input and got cryptic errors from the TF serving proxy client (that source code is not available, to my knowledge).

Looking at the TF Iris DNN example notebook, it uses a syntax like `iris_predictor.predict([6.4, 3.2, 4.5, 1.5])`, though the FeatureSpec is like `{'input': IrisArrayData}`. So perhaps the feature spec needs a top level?
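For reference, a minimal sketch (using only the standard `json` module, not the SDK itself) of the shape difference this question points at: an array of features serializes to a JSON list, while a dict of named features becomes a JSON object — a different top-level shape that an array-only serializer/input-function pairing will not accept:

```python
import json

# Array-style input, like the Iris example: serializes to a JSON list.
array_payload = json.dumps([6.4, 3.2, 4.5, 1.5])
print(array_payload)  # -> [6.4, 3.2, 4.5, 1.5]

# Named-feature input: serializes to a JSON object, which is the shape
# the dict-based predict call above would have to produce.
dict_payload = json.dumps({"city": "Paris", "gender": "m", "age": 22})
print(dict_payload)
```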