I'm new to TFX. I've been following the Keras tutorial and have successfully created a TFX pipeline with my own data. While learning to serve the model with TF Serving via Docker, I found that my input data has to be serialized as shown below before the endpoint returns predictions.
How can I send inputs to the REST API without serializing them first? For that I created a second function, _get_serve_raw. The Trainer runs successfully, but I can't seem to call the model through the REST API with raw inputs. I've tried several request formats, and each one produces a different error.
What should I do in the _get_serve_raw function so that the model accepts inputs that are not base64-encoded?
FYI: the model takes two string inputs.
Below is what I have in run_fn of the TFX Trainer:
def _get_serve_tf_examples_fn(model, tf_transform_output):
    model.tft_layer = tf_transform_output.transform_features_layer()

    @tf.function
    def serve_tf_examples_fn(serialized_tf_examples):
        """Returns the output to be used in the serving signature."""
        feature_spec = tf_transform_output.raw_feature_spec()
        feature_spec.pop(features.LABEL_KEY)
        parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)
        transformed_features = model.tft_layer(parsed_features)
        transformed_features.pop(features.transformed_name(features.LABEL_KEY))
        outputs = model(transformed_features)
        return {'outputs': outputs}

    return serve_tf_examples_fn


def _get_serve_raw(model, tf_transform_output):
    model.tft_layer = tf_transform_output.transform_features_layer()

    @tf.function
    def serve_raw_fn(country_code, project_type):
        country_code_sp_tensor = tf.sparse.SparseTensor(
            indices=[[0, 0]],
            values=country_code,
            dense_shape=(1, 1))
        project_type_sp_tensor = tf.sparse.SparseTensor(
            indices=[[0, 0]],
            values=project_type,
            dense_shape=(1, 1))
        parsed_features = {'Country_Code': country_code_sp_tensor,
                           'Project_Type': project_type_sp_tensor}
        transformed_features = model.tft_layer(parsed_features)
        transformed_features.pop(_transformed_name(_LABEL_KEY_EA))
        outputs = model(transformed_features)
        return {'outputs': outputs}

    return serve_raw_fn


signatures = {
    "serving_default":
        _get_serve_tf_examples_fn(model, tf_transform_output).get_concrete_function(
            tf.TensorSpec(shape=[None], dtype=tf.string, name='examples')),
    "serving_raw":
        _get_serve_raw(model, tf_transform_output).get_concrete_function(
            tf.TensorSpec(shape=[None], dtype=tf.string, name='country_code'),
            tf.TensorSpec(shape=[None], dtype=tf.string, name='project_type')),
}

model.save(fn_args.serving_model_dir, save_format='tf', signatures=signatures)
Serving signatures:
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: serving_default_examples:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict

signature_def['serving_raw']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['raw'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: serving_raw_raw:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall_1:0
  Method name is: tensorflow/serving/predict
TEST&ERROR 1:
url = 'http://localhost:8501/v1/models/ea:predict'
headers = {"content-type": "application/json"}
data = {
    "signature_name": "serving_raw",
    "instances": [
        {"raw": {"country_code": "US", "project_type": "Delivery"}}
    ]
}
data = json.dumps(data)
print(data)
json_response = requests.post(url, data=data, headers=headers)
print(json_response.content)
print(json_response.json())

Response:

b'{\n    "error": "Failed to process element: 0 key: raw of \'instances\' list. Error: Invalid argument: JSON Value: {\\n    \\"country_code\\": \\"US\\",\\n    \\"project_type\\": \\"Delivery\\"\\n} not formatted correctly for base64 data"\n}'
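Not one of my original attempts, but for comparison: the TF Serving REST "row" format addresses a multi-input signature by its input names directly, with no wrapper key. A minimal sketch of building such a payload (assuming the input names country_code and project_type from the signature code above):

```python
import json

# Row ("instances") format: each instance is a dict keyed by the
# signature's input names; no extra "raw" wrapper key.
payload = {
    "signature_name": "serving_raw",
    "instances": [
        {"country_code": "US", "project_type": "Delivery"},
    ],
}

body = json.dumps(payload)
print(body)
```

The request itself would still be sent with requests.post(url, data=body, headers=headers) as in the tests below.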
TEST&ERROR 2
url = 'http://localhost:8501/v1/models/ea:predict'
headers = {"content-type": "application/json"}
data = {
    "signature_name": "serving_raw",
    "instances": [
        {"raw": {"b64": "US",
                 "b64": "Delivery"}}
    ]
}
data = json.dumps(data)
print(data)
json_response = requests.post(url, data=data, headers=headers)
print(json_response.content)
print(json_response.json())

Response:
b'{\n "error": "You must feed a value for placeholder tensor \'StatefulPartitionedCall_1/StatefulPartitionedCall/transform_features_layer_1/transform/transform/inputs/F_Project_Type/shape\' with dtype int64 and shape [2]\\n\\t [[{{node transform_features_layer_1/transform/transform/inputs/F_Project_Type/shape}}]]"\n}'
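For completeness, TF Serving also accepts a columnar "inputs" format, where each named input maps to a list of values and the i-th elements across the lists form one example. A sketch, again assuming the two input names from the signature above:

```python
import json

# Columnar ("inputs") format: one list of values per named input;
# element i of each list belongs to example i.
payload = {
    "signature_name": "serving_raw",
    "inputs": {
        "country_code": ["US"],
        "project_type": ["Delivery"],
    },
}

body = json.dumps(payload)
print(body)
```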
Update: after retesting, Test 1 now executes successfully.