Problem defining a mapping function to parse a TFRecord dataset


I created a TensorFlow Ranking dataset in ELWC format and saved it as a .tfrecords file. I copied the get_descriptor_set(), decode_as_serialized_example_list(), and parse() methods and adapted context_spec and example_spec to my dataset.
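For context, the two specs referred to above might look roughly like this. This is only a sketch: the feature names are taken from the printed output below, and the dtypes and shapes are assumptions, not the actual specs used here.

```python
import tensorflow as tf

# Hypothetical specs for parsing an ELWC record. Names come from the keys
# visible in the printed output; dtypes/shapes are assumptions.

# Context features: one value per example list.
context_spec = {
    "Buyerid": tf.io.FixedLenFeature([1], tf.int64),
}

# Example features: one value per item in the list.
example_feature_spec = {
    "NIA_Paid_Status": tf.io.FixedLenFeature([1], tf.int64),
    "金额": tf.io.FixedLenFeature([1], tf.float32),
    "NIA_Commission": tf.io.FixedLenFeature([1], tf.float32),
    "rank": tf.io.FixedLenFeature([1], tf.int64),
}
```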

Parsing the dataset with a plain for loop works:

raw_dataset = tf.data.TFRecordDataset(_FILE_NAME)
for r in raw_dataset:
    features = parse(r, context_spec, example_feature_spec)
    print(features)

Output (tensor values were stripped when posting): {'NIA_Paid_Status': <tf.Tensor ...>, '金额': <tf.Tensor ...>, 'Buyerid': <tf.Tensor ...>, 'NIA_Commission': <tf.Tensor ...>, 'rank': <tf.Tensor ...>}

The problem arises when I apply the same function through Dataset.map instead:

kwargs = {
    "context_spec": context_spec,
    "example_feature_spec": example_feature_spec,
}
parsing_func = functools.partial(parse, **kwargs)
raw_dataset = tf.data.TFRecordDataset(_FILE_NAME).map(parsing_func)

This produces the warning:

WARNING:tensorflow:Entity <...> could not be transformed and will be executed as-is. Please report this to the AutoGraph team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output. Cause: LIVE_VARS_IN

followed by the error: TypeError: in converted code:

<ipython-input-15-8800217301d4>:31 parse  *
    features[k] = utils.reshape_first_ndims(v, 1, [batch_size, cur_list_size])
C:\Users\hp\Anaconda3\lib\site-packages\tensorflow_ranking\python\utils.py:173 reshape_first_ndims  *
    new_shape = tf.concat([new_shape, tf.shape(input=tensor)[first_ndims:]], 0)
C:\Users\hp\Anaconda3\lib\site-packages\tensorflow_core\python\util\dispatch.py:180 wrapper
    return target(*args, **kwargs)
C:\Users\hp\Anaconda3\lib\site-packages\tensorflow_core\python\ops\array_ops.py:1431 concat
    return gen_array_ops.concat_v2(values=values, axis=axis, name=name)
C:\Users\hp\Anaconda3\lib\site-packages\tensorflow_core\python\ops\gen_array_ops.py:1256 concat_v2
    "ConcatV2", values=values, axis=axis, name=name)
C:\Users\hp\Anaconda3\lib\site-packages\tensorflow_core\python\framework\op_def_library.py:499 _apply_op_helper
    raise TypeError("%s that don't all match." % prefix)

TypeError: Tensors in list passed to 'values' of 'ConcatV2' Op have types [<NOT CONVERTIBLE TO TENSOR>, int32] that don't all match.
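The traceback points at reshape_first_ndims concatenating a statically-derived shape with tf.shape(). My assumption about the root cause (not a verified patch for TF-Ranking's utils): inside Dataset.map, tensors are traced symbolically, so entries of the static tensor.get_shape() can be None, and None is "not convertible to tensor" when fed to tf.concat. A general repair is to build the whole new shape from the dynamic tf.shape() op, sketched here with a hypothetical helper:

```python
import tensorflow as tf

def reshape_first_dim_dynamic(tensor, new_first_dims):
    """Reshape `tensor` so its leading dim becomes `new_first_dims`.

    new_first_dims: 1-D int32 tensor, e.g. [batch_size, list_size].
    Uses only dynamic shapes, so it works during graph tracing in map().
    """
    tail = tf.shape(tensor)[1:]                       # dynamic trailing dims
    new_shape = tf.concat([new_first_dims, tail], 0)  # all int32, always valid
    return tf.reshape(tensor, new_shape)

# Example: split a leading dim of 6 into [2, 3], keeping the trailing 4.
x = tf.zeros([6, 4])
y = reshape_first_dim_dynamic(x, tf.constant([2, 3], tf.int32))
```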
Tags: python, tensorflow-datasets, functools