How can I speed up code that converts tensors from tensorflow_datasets to numpy arrays?


I want to convert the tensors in a tensorflow_datasets dataset to numpy arrays, but my code gets slower and slower as it runs. I am using the lsun/bedroom dataset, which contains more than 3 million images. How can I speed up my code?

My code saves a tuple of numpy arrays for every 100,000 images:

import pickle

import tensorflow_datasets as tfds

train_tf = tfds.load("lsun/bedroom", data_dir="{$my_directory}", download=False)
train_tf = train_tf["train"]

d_cnt = 0
for data in train_tf:
    if d_cnt == 0 and d_cnt % 100001 == 0:
        # start the tuple with the first image
        train = (tfds.as_numpy(data["image"]), )
    else:
        # "append" by building a new, larger tuple each step
        train += (tfds.as_numpy(data["image"]), )

    if d_cnt % 100000 == 0 and d_cnt != 0:
        # dump the accumulated images every 100,000 steps
        with open("{$my_directory}/lsun.pickle%d" % (d_cnt), "wb") as f:
            pickle.dump(train, f)

    d_cnt += 1
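For reference, here is a minimal sketch of one way the same chunked pickle output could be produced without the per-step tuple concatenation: it wraps the whole dataset in tfds.as_numpy once (so each yielded element is already a numpy array) and collects images in a Python list, which has O(1) appends, instead of rebuilding a tuple on every iteration. The dataset name, the "{$my_directory}" placeholder, and the 100,000-image chunk size come from the question; the variable names and the trailing leftover-chunk write are illustrative assumptions.

import pickle

import tensorflow_datasets as tfds

CHUNK_SIZE = 100_000

train_tf = tfds.load("lsun/bedroom", data_dir="{$my_directory}", download=False)["train"]

chunk = []
for d_cnt, example in enumerate(tfds.as_numpy(train_tf), start=1):
    # example["image"] is already a numpy array here; no per-item conversion needed
    chunk.append(example["image"])
    if d_cnt % CHUNK_SIZE == 0:
        with open("{$my_directory}/lsun.pickle%d" % d_cnt, "wb") as f:
            pickle.dump(tuple(chunk), f)
        chunk = []  # release the references before starting the next chunk

# write whatever is left over as a final, smaller chunk (assumption: partial chunks are wanted)
if chunk:
    with open("{$my_directory}/lsun.pickle%d" % d_cnt, "wb") as f:
        pickle.dump(tuple(chunk), f)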
python numpy tensorflow tensor tensorflow-datasets
1 Answer