How to package several Keras layers into one layer module?


I have to switch from PyTorch to Keras. In PyTorch I can create module-like layers with code like this:

import torch as t
from torch import nn

class up(nn.Module):
    def __init__(self, in_ch, out_ch):
        super(up, self).__init__()
        self.up = nn.Upsample(scale_factor=2)
        # kernel size/padding assumed for illustration; the original snippet omitted them
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        # !!!! here two layers are packaged in one module

    def forward(self, x1, x2):
        x1 = self.up(x1)
        x = t.cat([x2, x1], dim=1)
        x = self.conv(x)
        return x

How can I organize my code in a similar module-like way in Keras?

python machine-learning neural-network keras keras-layer
1 Answer

One way I figured out is to write a function:

import keras as k

def double_conv(var1, inputs):
    x = k.layers.Conv2D(some_parameters)(inputs)
    x = k.layers.Conv2D(some_parameters)(x)
    x = k.layers.MaxPooling2D(some_parameters)(x)
    return x

But is there a more "Kerasic" way to do this?
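One more module-like option, closer to the PyTorch code above, would be to subclass keras.layers.Layer. The following is only a rough sketch, assuming tf.keras with channels-last tensors; the kernel size and names (Up, out_ch) are picked just for illustration:

from tensorflow import keras

class Up(keras.layers.Layer):
    """Upsample x1, concatenate with x2, then convolve (mirrors the PyTorch `up` module)."""
    def __init__(self, out_ch, **kwargs):
        super().__init__(**kwargs)
        # No in_ch needed: Keras infers input channels when the layer is built.
        self.up = keras.layers.UpSampling2D(size=2)
        self.concat = keras.layers.Concatenate(axis=-1)  # channels-last
        self.conv = keras.layers.Conv2D(out_ch, 3, padding='same')

    def call(self, x1, x2):
        x1 = self.up(x1)
        x = self.concat([x2, x1])
        return self.conv(x)

# usage:
# y = Up(64)(x1, x2)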


EDIT: This is the function I want to use like a Keras layer, but if anyone finds a better way to organize the code, I would welcome any ideas:

from keras.layers import Conv2D, BatchNormalization, Activation

def conv_bn_relu(filters, kernel=(3, 3)):
    def inside(x):
        x = Conv2D(filters, kernel, padding='same')(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        return x
    return inside

# usage:
x = conv_bn_relu(params)(x)

EDIT2: You can even cheat and name the function in CamelCase, so that it looks like creating a Keras layer:

def ConvBnRelu(filters, kernel=(3, 3)):
    def inside(x):
        x = Conv2D(filters, kernel, padding='same')(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        return x
    return inside

# usage:
x = ConvBnRelu(params)(x)

But the second solution will probably be criticized.
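For completeness, here is a minimal sketch of how such a closure-based helper plugs into a functional-API model; the input shape and filter counts below are arbitrary and only for illustration:

from keras.layers import Input, MaxPooling2D
from keras.models import Model

inputs = Input(shape=(128, 128, 3))  # arbitrary example shape
x = ConvBnRelu(32)(inputs)           # helper defined above
x = MaxPooling2D()(x)
x = ConvBnRelu(64)(x)
model = Model(inputs, x)
model.summary()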
