PyTorch: How do I add an arbitrary function between layers?


(I have only just started learning PyTorch!)

Suppose a model has the following architecture:

(conv1): Conv2d(2, 6, kernel_size=(5, 5), stride=(1, 1))
(pool): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))
(fc1): Linear(in_features=256, out_features=120, bias=True)
(fc2): Linear(in_features=120, out_features=84, bias=True)
(fc3): Linear(in_features=84, out_features=10, bias=True)

For example, how should I add some MyFunction between the conv1 layer and the pooling layer?

Here is my current code:

from torch.nn import Module, Sequential, Conv2d, ReLU, MaxPool2d, Linear


class CNN(Module):
    def __init__(self) -> None:
        super(CNN, self).__init__()
        self.cnn_layer = Sequential(
            Conv2d(in_channels=2, out_channels=6, kernel_size=5),
            # MyFunction here
            ReLU(inplace=True),
            MaxPool2d(kernel_size=2, stride=2),
        )
        self.linear_layers = Sequential(
            Linear(256, 120), Linear(120, 84), Linear(84, 10)
        )

    def forward(self, image):
        image = self.cnn_layer(image)

        image = image.view(-1, 4 * 4 * 16)
        image = self.linear_layers(image)
        return image
python pytorch torch
1 Answer

Note that a Sequential module is just a way of bundling several feed-forward layers into "one" layer. That means you do not have to pass the data through each layer explicitly (in contrast to what I do below). I rewrote your example without the Sequential wrappers so that you can see what is going on underneath. Doing it this way gives you easy access to each layer's inputs/outputs, and you can change them however you need. Of course, you can also rearrange your Sequential bundles so that they are split exactly where you need "functional" access to x; a sketch of that variant follows the code below.

from torch.nn import Module, Conv2d, ReLU, MaxPool2d, Flatten, Linear


class CNN(Module):
    def __init__(self) -> None:
        super(CNN, self).__init__()
        self.conv1 = Conv2d(in_channels=2, out_channels=6, kernel_size=5)
        self.relu1 = ReLU(inplace=True)
        self.maxpool1 = MaxPool2d(kernel_size=2, stride=2)
        self.flatten = Flatten()
        self.linear1 = Linear(256, 120)
        self.linear2 = Linear(120, 84)
        self.linear3 = Linear(84, 10)

    def forward(self, image):
        x = self.conv1(image)
        x = x * 2 - 123  # arbitrary stuff happening between layers
        x = self.relu1(x)
        x = self.maxpool1(x)
        x = self.flatten(x)  # shorter than your reshaping
        x = self.linear1(x)
        x = self.linear2(x)
        x = self.linear3(x)
        return x
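
If you prefer to keep the Sequential bundles, here is a minimal sketch of the "split" variant mentioned above: the stack is broken into two Sequential blocks at the point where the extra code has to run, and the function is applied in forward between them. The names CNNSplit and my_function are placeholders I made up for illustration; the layer sizes simply mirror the code above.

import torch
from torch.nn import Module, Sequential, Conv2d, ReLU, MaxPool2d, Flatten, Linear


def my_function(x: torch.Tensor) -> torch.Tensor:
    # placeholder for whatever arbitrary operation you need
    return x * 2 - 123


class CNNSplit(Module):
    def __init__(self) -> None:
        super().__init__()
        # everything that runs before the custom function
        self.cnn_front = Sequential(
            Conv2d(in_channels=2, out_channels=6, kernel_size=5),
        )
        # everything that runs after the custom function
        self.cnn_back = Sequential(
            ReLU(inplace=True),
            MaxPool2d(kernel_size=2, stride=2),
        )
        self.linear_layers = Sequential(
            Flatten(),
            Linear(256, 120),
            Linear(120, 84),
            Linear(84, 10),
        )

    def forward(self, image):
        x = self.cnn_front(image)
        x = my_function(x)  # arbitrary code runs between the two bundles
        x = self.cnn_back(x)
        return self.linear_layers(x)

Another option, if the extra operation has no learnable parameters, is to wrap it in a small Module subclass whose forward simply calls it; that wrapper can then be placed directly inside your original Sequential at the "# MyFunction here" position.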