Cannot manually assign new parameters by layer name


I am trying to manually assign new weights to my PyTorch model. I can assign new weights like this:

import scipy.io as sio
import torch

# `net` is the PyTorch model defined elsewhere; each .mat file holds the
# exported Caffe weights 'w' and biases 'b' for one layer.
caffe_params = sio.loadmat('export_conv1_1.mat')
net.conv1_1.weight = torch.nn.Parameter(torch.from_numpy(caffe_params['w']))
net.conv1_1.bias = torch.nn.Parameter(torch.from_numpy(caffe_params['b']))

caffe_params = sio.loadmat('export_conv2_1.mat')
net.conv2_1.weight = torch.nn.Parameter(torch.from_numpy(caffe_params['w']))
net.conv2_1.bias = torch.nn.Parameter(torch.from_numpy(caffe_params['b']))

Since I have many layers, I don't want to assign each one manually by its name. Instead, I would like to loop over a list of layer names and assign them automatically. Something like this:

varList = ['conv2_1','conv2_2']

for name in varList:
    caffe_params = sio.loadmat(rootDir + 'export_' + name + '.mat')
    setattr(net, name + '.weight', torch.nn.Parameter(torch.from_numpy(caffe_params['w'])))
    setattr(net, name + '.bias', torch.nn.Parameter(torch.from_numpy(caffe_params['b'])))

Unfortunately, this does not work. I suspect setattr does not handle PyTorch weights, or attribute names of the form layername.weight, i.e. attributes that sit two levels deep relative to net.
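That guess is essentially right: setattr treats its second argument as one literal attribute name and does not interpret the dot as nesting, so 'conv2_1.weight' never reaches the weight of the conv2_1 submodule (on an nn.Module, recent PyTorch versions additionally reject parameter names containing a dot with a KeyError). A minimal plain-Python sketch of that setattr behaviour (the Box class below is purely illustrative and not part of the original code):

class Box:
    pass

b = Box()
b.inner = Box()

# setattr takes the whole string as a single, literal attribute name:
setattr(b, 'inner.value', 42)

print(getattr(b, 'inner.value'))   # 42 -- stored directly on b under the name "inner.value"
print(hasattr(b.inner, 'value'))   # False -- the nested object was never touched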

Tags: pytorch, setattr
1 Answer

How about the following?

for name in varList:
    caffe_params = sio.loadmat(rootDir + 'export_' + name + '.mat')
    getattr(net, name).weight.data.copy_(torch.from_numpy(caffe_params['w']))
    getattr(net, name).bias.data.copy_(torch.from_numpy(caffe_params['b']))
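As a follow-up, copying into the existing parameters with copy_ (rather than replacing them with brand-new nn.Parameter objects) keeps the exact tensors an optimizer may already hold references to. A slightly more defensive version of the same loop is sketched below; the extra layer name, the torch.no_grad() wrapper, and the reshape of the bias are assumptions about the exported .mat files, not part of the original answer (net and rootDir come from the question):

import scipy.io as sio
import torch

varList = ['conv1_1', 'conv2_1', 'conv2_2']      # extend with all exported layer names

with torch.no_grad():                            # needed for in-place copies into leaf parameters
    for name in varList:
        caffe_params = sio.loadmat(rootDir + 'export_' + name + '.mat')
        layer = getattr(net, name)               # resolve the submodule by name
        w = torch.from_numpy(caffe_params['w'])
        b = torch.from_numpy(caffe_params['b']).reshape(layer.bias.shape)  # .mat biases are often 2-D
        assert layer.weight.shape == w.shape, (name, layer.weight.shape, tuple(w.shape))
        layer.weight.copy_(w)                    # copy_ also casts float64 -> float32 in place
        layer.bias.copy_(b)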