Why do I get the following ValueError when concatenating in my U-Net code?

Problem description | votes: 0 | answers: 1

In the code below, I have built a 3D U-Net for image segmentation.

import numpy as np
import tensorflow as tf

IMG_HEIGHT = 1600
IMG_WIDTH = 1700  # not in the original snippet; inferred from the error log below
IMG_DEPTH = 345
IMG_CHANNELS = 1
# img1/img2 and mask1/mask2 are the user's volumes and masks
x_train = np.array([img1, img2],  dtype=np.uint8)
y_train = np.array([mask1, mask2], dtype=bool)  # np.bool was removed in NumPy 1.24

inputs = tf.keras.layers.Input((IMG_HEIGHT, IMG_WIDTH, IMG_DEPTH, IMG_CHANNELS))
s = inputs  # `s` feeds the first convolution below

# Contraction path
c1 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(s)
c1 = tf.keras.layers.BatchNormalization(axis=-1)(c1)
c1 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c1)
p1 = tf.keras.layers.MaxPooling3D((2, 2, 2))(c1)

c2 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(p1)
c2 = tf.keras.layers.BatchNormalization(axis=-1)(c2)
c2 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c2)
p2 = tf.keras.layers.MaxPooling3D((2, 2, 2))(c2)

c3 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(p2)
c3 = tf.keras.layers.BatchNormalization(axis=-1)(c3)
c3 = tf.keras.layers.Conv3D(512, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c3)
p3 = tf.keras.layers.MaxPooling3D((2, 2, 2))(c3)

c4 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(p3)
c4 = tf.keras.layers.BatchNormalization(axis=-1)(c4)
c4 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c4)

# Expansive path

u6 = tf.keras.layers.Conv3DTranspose(256, (3, 3, 3), strides=(2, 2, 2), padding='same')(c4)
u6 = tf.keras.layers.concatenate([u6, c3])
c6 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(u6)
c6 = tf.keras.layers.BatchNormalization(axis=-1)(c6)
c6 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c6)

u7 = tf.keras.layers.Conv3DTranspose(128, (3, 3, 3), strides=(2, 2, 2), padding='same')(c6)
u7 = tf.keras.layers.concatenate([u7, c2])
c7 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(u7)
c7 = tf.keras.layers.BatchNormalization(axis=-1)(c7)
c7 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c7)

u8 = tf.keras.layers.Conv3DTranspose(64, (3, 3, 3), strides=(2, 2, 2), padding='same')(c7)
u8 = tf.keras.layers.concatenate([u8, c1])
c8 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(u8)
c8 = tf.keras.layers.BatchNormalization(axis=-1)(c8)
c8 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c8)


outputs = tf.keras.layers.Conv3D(3, (1, 1, 1), activation='sigmoid')(c8)

model = tf.keras.Model(inputs=[inputs], outputs=[outputs])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()

I get the error on this line: u6 = tf.keras.layers.concatenate([u6, c3])

Here is the ValueError I receive:

Traceback (most recent call last):
   u6 = tf.keras.layers.concatenate([u6, c3])
   ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 400, 424, 86, 256), (None, 400, 425, 86, 512)]

I don't know whether I messed up the shape of c3 or u6, or how to fix it. Please explain how to resolve this error.

python deep-learning concatenation conv-neural-network valueerror
1 Answer

0 votes

From your error log, u6 has shape (..., 424, ...) while c3 has shape (..., 425, ...).

This comes from the pooling: max-pooling the 425-wide c3 output (p3) gives you 212 (odd sizes are floored), but upsampling it (the first Conv3DTranspose, producing u6) only doubles that back to 424.
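The arithmetic can be checked with a small helper (a sketch; `down_up` is a name made up for illustration): each max-pool floor-divides the size by 2, and each stride-2 transposed convolution doubles it back.

```python
def down_up(size, n_pools):
    """Spatial size after n_pools stride-2 max-pools followed by
    n_pools stride-2 transposed convolutions (padding='same')."""
    for _ in range(n_pools):
        size //= 2              # MaxPooling3D floors odd sizes: 425 -> 212
    return size * 2 ** n_pools  # each Conv3DTranspose doubles: 212 -> 424

print(down_up(425, 1))  # 424: no longer matches the 425-wide skip connection
print(down_up(424, 1))  # 424: even sizes survive the round trip
```

Any dimension that stays even through every pooling level returns to its original size, which is exactly the divisibility condition described below.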

So you just need to make sure the width, height, and depth of your images are divisible by 2 raised to the number of pooling layers used, here 2^3 = 8.

I guess your images are 1700x1600 this time? Cropping them, e.g. x_train = x_train[:, :1600, :1680], should work. (Note that the depth of 345 is not divisible by 8 either and would trip the u8 concatenation the same way, so it likely needs cropping to 344 as well.)
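Rather than hard-coding the slice, a small helper can crop every spatial dimension down to the nearest multiple of 2^n_pools (a sketch, not from the original answer; `crop_to_multiple` assumes the (N, H, W, D, C) batch layout used above):

```python
import numpy as np

def crop_to_multiple(batch, multiple=8):
    """Crop the H, W, D axes of a (N, H, W, D, C) batch so each
    is divisible by `multiple` (2 ** number_of_pooling_layers)."""
    _, h, w, d = batch.shape[:4]
    return batch[:, :h - h % multiple, :w - w % multiple, :d - d % multiple]

# Toy batch standing in for the real 1600x1700x345 volumes
toy = np.zeros((1, 17, 25, 9, 1), dtype=np.uint8)
print(crop_to_multiple(toy).shape)  # (1, 16, 24, 8, 1)
```

Dimensions that are already divisible are left untouched, so the helper is safe to apply to both x_train and y_train unconditionally.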
