How to change the activation layers of a PyTorch pretrained network?

Question · votes: 0 · answers: 2

How do I change the activation layers of a PyTorch pretrained network? Here is my code:

print("All modules")
for child in net.children():
    if isinstance(child,nn.ReLU) or isinstance(child,nn.SELU):
        print(child)

print('Before changing activation')
for child in net.children():
    if isinstance(child,nn.ReLU) or isinstance(child,nn.SELU):
        print(child)
        child=nn.SELU()
        print(child)
print('after changing activation')
for child in net.children():
    if isinstance(child,nn.ReLU) or isinstance(child,nn.SELU):
        print(child)

Here is my output:

All modules
ReLU(inplace=True)
Before changing activation
ReLU(inplace=True)
SELU()
after changing activation
ReLU(inplace=True)
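A side note on why the loop above prints `SELU()` inside the loop but leaves the network unchanged: the assignment `child = nn.SELU()` only rebinds the local loop variable, not the module stored inside the network. A minimal sketch of the effect (not part of the original post):

```python
import torch.nn as nn

# Minimal reproduction: rebinding the loop variable does not modify the network.
net = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))

for child in net.children():
    if isinstance(child, nn.ReLU):
        child = nn.SELU()  # rebinds the local name 'child' only

# The network still holds the original ReLU module.
print(net[1])  # ReLU(inplace=True)
```

This is why both answers below mutate the network itself (via `_modules` or `setattr`) instead of assigning to the loop variable.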
python neural-network deep-learning pytorch activation-function
2 Answers
0 votes
Using `._modules` solved the problem for me.

for name, child in net.named_children():
    if isinstance(child, nn.ReLU) or isinstance(child, nn.SELU):
        net._modules['relu'] = nn.SELU()  # assumes the activation child is registered under the name 'relu'
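A variation on the snippet above (my sketch, not from the original answer): the hard-coded `'relu'` key only works when the activation is registered under that exact name. Keying `_modules` by the iterated name generalizes to any child name:

```python
import torch.nn as nn

# Child names in this Sequential are '0' and '1', so a 'relu' key would not match.
net = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))

# Replace every ReLU child using its own registered name as the key.
for name, child in net.named_children():
    if isinstance(child, nn.ReLU):
        net._modules[name] = nn.SELU()

print(net[1])  # SELU()
```

Like the original snippet, this only touches direct children; deeply nested activations need the recursive approach in the second answer.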


0 votes
`setattr` worked for me.

import torch
import torch.nn as nn

# This function recursively replaces every ReLU module with a SELU module.
def replace_relu_to_selu(model):
    for child_name, child in model.named_children():
        if isinstance(child, nn.ReLU):
            setattr(model, child_name, nn.SELU())
        else:
            replace_relu_to_selu(child)

########## A toy example ##########
net = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, stride=1),
    nn.ReLU(inplace=True),
    nn.Conv2d(32, 32, kernel_size=3, stride=1),  # in_channels fixed: 3 -> 32
    nn.ReLU(inplace=True)
)

########## Test ##########
print('Before changing activation')
for child in net.children():
    if isinstance(child, nn.ReLU) or isinstance(child, nn.SELU):
        print(child)
# Before changing activation
# ReLU(inplace=True)
# ReLU(inplace=True)

replace_relu_to_selu(net)  # this call was missing from the original snippet

print('after changing activation')
for child in net.children():
    if isinstance(child, nn.ReLU) or isinstance(child, nn.SELU):
        print(child)
# after changing activation
# SELU()
# SELU()
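To illustrate why the recursion matters (a sketch with a hypothetical `Block` submodule, not from the original answer): a plain loop over `children()` only sees top-level modules, while the recursive version also reaches activations nested inside submodules.

```python
import torch.nn as nn

def replace_relu_to_selu(model):
    # Same recursive replacement as in the answer above.
    for child_name, child in model.named_children():
        if isinstance(child, nn.ReLU):
            setattr(model, child_name, nn.SELU())
        else:
            replace_relu_to_selu(child)

class Block(nn.Module):
    # Hypothetical submodule holding its own activation.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.conv(x))

net = nn.Sequential(Block(), Block())
replace_relu_to_selu(net)

# No ReLU survives anywhere in the module tree.
print(any(isinstance(m, nn.ReLU) for m in net.modules()))  # False
```

The `setattr` call works here because `nn.Module.__setattr__` registers module-valued attributes in `_modules`, so assigning by the child's name swaps the submodule in place.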
