Jacobian of the function computed by the hidden layers of a deep neural network


Consider the following class, which defines a deep neural network.

import torch
import torch.nn as nn

class NeuralNetwork(nn.Module):
    def __init__(self, input_size, output_size):
        super(NeuralNetwork, self).__init__()
        self.fc1 = nn.Linear(input_size, 64)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(64, 2)
        self.relu = nn.ReLU()
        self.fc3 = nn.Linear(2, output_size)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        x = self.relu(x)
        x = self.fc3(x)
        x = self.sigmoid(x)
        return x

    # Function to get hidden layer outputs
    def hidden_mapping(self, point):
        point = self.fc1(point)
        point = self.relu(point)
        point = self.fc2(point)
        return point.detach()

    def differentiable_hidden_mapping(self, point):
        """
        @point: a tensor like torch.tensor([x, y], requires_grad=True)
        """
        point = self.fc1(point)
        point = self.relu(point)
        point = self.fc2(point)
        return point[0], point[1]

Let model be an instance of the network, defined as follows:

model = NeuralNetwork(
    input_size=2,   # Two features to classify
    output_size=1   # Binary classification, so one output neuron
    )

Assume it has already been trained. I view model.differentiable_hidden_mapping as a function from R^2 to R^2, and I want to compute its Jacobian with respect to the network's input, not with respect to the weights.

Using the following commands:

from torch.autograd.functional import jacobian

point = torch.tensor([1., 1.], requires_grad=True)
model.eval()
print(model.differentiable_hidden_mapping(point))
print(jacobian(model.differentiable_hidden_mapping, point))

I get:

(tensor(-28.6816, grad_fn=<SelectBackward0>), tensor(-0.4647, grad_fn=<SelectBackward0>))
(tensor([0., 0.]), tensor([0., 0.]))

I did not expect the Jacobian to be all zeros, and I don't understand why it comes out as zeros no matter what the value of point is.
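As a sanity check of how I call jacobian (a minimal example, assuming nothing about the trained model; the function f below is made up purely for illustration), the same kind of call on a plain map from R^2 to R^2 that returns a tuple of scalars does give the expected nonzero Jacobian:

import torch
from torch.autograd.functional import jacobian

def f(p):
    # f(x, y) = (x * y, x + 2 * y); analytic Jacobian at (1, 1) is [[1, 1], [1, 2]]
    return p[0] * p[1], p[0] + 2 * p[1]

print(jacobian(f, torch.tensor([1., 1.])))
# prints (tensor([1., 1.]), tensor([1., 2.]))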

python deep-learning pytorch neural-network
1 Answer

Try this:

model = NeuralNetwork(
    input_size=2,
    output_size=2  # Updated output size to 2
)

point = torch.tensor([1., 1.], requires_grad=True)
model.eval()
print(model.differentiable_hidden_mapping(point))
jacobian_matrix = jacobian(model.differentiable_hidden_mapping, point)
print(jacobian_matrix)
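If the goal is only the Jacobian of the hidden mapping with respect to the input, a variant that runs the first two layers and returns the hidden activations as a single tensor also works directly with jacobian (a minimal sketch reusing the model defined above; the helper name hidden is my own):

import torch
from torch.autograd.functional import jacobian

def hidden(p):
    # Forward pass through fc1 -> ReLU -> fc2 only; returns a tensor of shape (2,)
    return model.fc2(model.relu(model.fc1(p)))

point = torch.tensor([1., 1.])
print(jacobian(hidden, point))  # 2x2 Jacobian of the hidden mapping w.r.t. the input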