TensorFlow vs. PyTorch: convolution doesn't work

Problem description · 0 votes · 2 answers

I am trying to check whether a TensorFlow convolution produces the same output as a PyTorch convolution with the same weights.

Here is my code, where I copy the weights from TensorFlow to Torch, run the convolution, and compare the outputs:

import numpy as np
import tensorflow as tf
import torch
import torch.nn.functional as F


sess = tf.Session()
np.random.seed(1)
tf.set_random_seed(1)

#parameters
kernel_size = 3
input_feat = 4
output_feat = 4

#inputs
npo = np.random.random((1,5,5, input_feat))
x = tf.convert_to_tensor(npo, tf.float32)
x2 = torch.tensor(np.transpose(npo, [0, 3, 1, 2])).double()

#the same weights
weights = np.random.random((kernel_size,kernel_size,input_feat,output_feat))
weights_torch = np.transpose(weights, [3, 2, 1, 0])

#convolving with tensorflow
w = tf.Variable(weights, name="testconv_W", dtype=tf.float32)
res = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="VALID")

sess.run(tf.global_variables_initializer())


#convolving with torch
torchres = F.conv2d(x2, torch.tensor(weights_torch), padding=0, bias=torch.zeros((output_feat)).double())

#comparing the results
print(np.mean(np.transpose(sess.run(res), [0, 3, 1, 2])) - torch.mean(torchres).detach().numpy())

It outputs:

0.15440369065716908

Why? Why is there such a large difference? Is the TensorFlow conv2d implementation incorrect? Why doesn't it match PyTorch? Am I doing something wrong? With a kernel size of 1 everything matches. Please help.

python tensorflow deep-learning pytorch
2 Answers
0 votes

You could try x2 = torch.tensor(np.transpose(npo, [0, 3, 2, 1])).double() instead of x2 = torch.tensor(np.transpose(npo, [0, 3, 1, 2])).double().
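
For reference, here is a minimal self-contained sketch of the conventional layout mapping between the two frameworks (written against the TF 2.x eager API rather than the question's Session API, and not literally the change suggested above, so treat it as an illustration): the NHWC input goes to NCHW via np.transpose(..., [0, 3, 1, 2]) and the HWIO kernel goes to OIHW via np.transpose(..., [3, 2, 0, 1]), after which the full outputs can be compared elementwise.

import numpy as np
import tensorflow as tf          # assumes TF 2.x with eager execution
import torch
import torch.nn.functional as F

np.random.seed(1)
kernel_size, input_feat, output_feat = 3, 4, 4

# shared random input and kernel, kept in float32 on both sides
npo = np.random.random((1, 5, 5, input_feat)).astype(np.float32)
weights = np.random.random(
    (kernel_size, kernel_size, input_feat, output_feat)).astype(np.float32)

# TensorFlow: NHWC input, HWIO kernel
x_tf = tf.convert_to_tensor(npo)
w_tf = tf.convert_to_tensor(weights)
res_tf = tf.nn.conv2d(x_tf, w_tf, strides=[1, 1, 1, 1], padding="VALID")

# PyTorch: NCHW input, OIHW kernel (both libraries compute cross-correlation,
# so only the axes are reordered, nothing is flipped)
x_pt = torch.tensor(np.transpose(npo, [0, 3, 1, 2]))
w_pt = torch.tensor(np.transpose(weights, [3, 2, 0, 1]))
res_pt = F.conv2d(x_pt, w_pt, padding=0)

# bring the TF result into NCHW and compare elementwise
diff = np.abs(np.transpose(res_tf.numpy(), [0, 3, 1, 2]) - res_pt.numpy())
print(diff.max())   # on the order of 1e-6 or smaller in float32

With that mapping the difference is down at floating-point precision. Note also that comparing only the means, as in the question, can hide layout mistakes, since the mean is unchanged by transposing the spatial axes.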


-1 votes

Check your hyperparameters. I can't imagine the results being different under exactly the same conditions, especially if all the weight matrices are initialized identically in both libraries.

Hope this helps.
