PyTorch: model.parameters() returns an empty list


I am trying to build and train the very simple two-layer neural network below, following the template from the PyTorch tutorial (https://pytorch.org/tutorials/beginner/basics/buildmodel_tutorial.html).

The only difference is that I am initializing the model weights myself:

import torch
from torch import nn
# create a two-layer FCNN, avoid ValueError: optimizer got an empty parameter list
class img2latent(nn.Module):
    def __int__(self):
        super(img2latent,self).__init__()
        self.neuralDim=len(X_train[0])
        self.latentDim=len(Y_train[0])
        self.hiddenDim=self.neuralDim
        self.fc1=nn.Linear(self.neuralDim,self.hiddenDim)
        self.fc2=nn.Linear(self.hiddenDim,self.latentDim)
        # INITIALISE THE WEIGHTS: FC1 WITH ONES, FC2 WITH THE RIDGE PARAMETERS
        self.fc1.weight.data.fill_(1)
        self.fc1.bias.data.fill_(0)
        self.fc2.weight.data=ridge.coef_
        self.fc2.bias.data=ridge.intercept_
    
    def forward(self,x):
        x=self.fc1(x)
        # add reLU
        x=torch.relu(x)        
        x=self.fc2(x)
        return x
  
    

def train_loop(model,loss_fn, optimizer):
    model.train()
    # do full batch gradient descent
    pred=model(X_train)
    loss=loss_fn(pred,Y_train)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
    
    
fullmodel=img2latent()
# SEND TO GPU
fullmodel=fullmodel.to(device)
# take MSE loss + L2 regularisation via weight_decay (note: this regularises all parameters, not just fc2)
loss=nn.MSELoss()
optimizer=torch.optim.Adam(fullmodel.parameters(),lr=0.01,weight_decay=0.01)
for t in range(1000):
    loss=train_loop(fullmodel,loss,optimizer)
    if t%100==0:
        print(t,f"{loss:0.2f}",end='\t')

# predict the latents for train data and do a correlation
pred=fullmodel(X_train)

But for some reason I get

    ValueError: optimizer got an empty parameter list

which means that fullmodel.parameters() returns an empty list. I checked this, and it is indeed the case.

I tried searching on Stack Overflow but could not find an example where the parameter list itself is empty.
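For reference, a quick check like the following (run right after fullmodel = img2latent() and before building the optimizer) makes the problem visible:

    # both calls show that no parameters were registered on the module
    print(list(fullmodel.parameters()))                    # prints []
    print(sum(p.numel() for p in fullmodel.parameters()))  # prints 0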

deep-learning pytorch neural-network
1 Answer

You have a typo in your constructor:

    def __int__(self):

should be

    def __init__(self):

So your class effectively has no constructor of its own: Python treats __int__ as just another method, img2latent() falls back to the inherited nn.Module constructor, and the resulting module has no registered layers and therefore no parameters.
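You can see the effect in isolation with a minimal (hypothetical) module that makes the same typo:

    from torch import nn

    class Broken(nn.Module):
        def __int__(self):           # typo: Python never calls this as the constructor
            super().__init__()
            self.fc = nn.Linear(4, 2)

    m = Broken()                     # the inherited nn.Module.__init__ runs instead
    print(list(m.parameters()))      # [] -- the Linear layer was never created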

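For completeness, here is a minimal sketch of the corrected class, assuming (as in the question) that X_train and Y_train are float tensors and that ridge is a fitted scikit-learn Ridge model, whose numpy coefficients need converting to tensors before they can be copied into the layer:

    import torch
    from torch import nn

    class img2latent(nn.Module):
        def __init__(self):          # fixed: __init__, not __int__
            super().__init__()
            self.neuralDim = len(X_train[0])
            self.latentDim = len(Y_train[0])
            self.hiddenDim = self.neuralDim
            self.fc1 = nn.Linear(self.neuralDim, self.hiddenDim)
            self.fc2 = nn.Linear(self.hiddenDim, self.latentDim)
            # initialise fc1 with ones/zeros and fc2 from the ridge fit;
            # ridge.coef_ / ridge.intercept_ are numpy arrays, so convert them first
            with torch.no_grad():
                self.fc1.weight.fill_(1)
                self.fc1.bias.fill_(0)
                self.fc2.weight.copy_(torch.tensor(ridge.coef_, dtype=torch.float32))
                self.fc2.bias.copy_(torch.tensor(ridge.intercept_, dtype=torch.float32))

        def forward(self, x):
            x = torch.relu(self.fc1(x))
            return self.fc2(x)

With the constructor name fixed, fullmodel.parameters() yields the weights and biases of both linear layers, and the optimizer can be constructed as before.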