CNN produces the same value for every pixel and for every input image despite showing a low loss

Question (0 votes, 1 answer)

I am training a convolutional neural network that takes time-series data laid out on a grid as input and produces time-series data as output. The training loss is very low, and when I inspect the outputs produced during training they look very good and closely match the targets. However, once training is finished and I feed the network the same training inputs, it returns strange results: the output has the same value at every pixel. It is the exact same value even across different images; in other words, the network produces a single value and fills every cell with it.

During training:

Input: tensor([0.4227, 0.4396, 0.4807, 0.5201, 0.5685, 0.6009, 0.5774, 0.5286, 0.4780,
        0.4228, 0.3570, 0.3578, 0.3667, 0.4322, 0.5659, 0.6624, 0.7082, 0.7285,
        0.6903, 0.6409, 0.6031, 0.4915, 0.3842, 0.3293, 0.3065, 0.3234, 0.4052,
        0.5047, 0.5763, 0.6334, 0.6080, 0.5724, 0.5240, 0.4635, 0.4294, 0.3948,
        0.4014, 0.4336, 0.5007, 0.5419, 0.5733, 0.5877, 0.5483, 0.5295, 0.4945,
        0.5066, 0.5381, 0.5620, 0.5508, 0.5609, 0.5571, 0.5736, 0.5208, 0.4679,
        0.3421, 0.2895, 0.2682, 0.3878, 0.4691, 0.5377, 0.5905, 0.5975, 0.6014,
        0.5705, 0.5285, 0.4112, 0.3662, 0.3606, 0.4075, 0.4676, 0.5179, 0.5378,
        0.5755, 0.5833, 0.5459, 0.5206, 0.4838, 0.4425, 0.4292, 0.4087, 0.4459,
        0.5064, 0.5535, 0.5751, 0.5140, 0.4675, 0.3972, 0.3673, 0.3539, 0.3515,
        0.3994, 0.4857, 0.5820, 0.6970, 0.7130, 0.6680, 0.5708, 0.4864, 0.4359,
        0.4105], device='cuda:0')
Output: tensor([0.4614, 0.4805, 0.5091, 0.5421, 0.5764, 0.5922, 0.5825, 0.5401, 0.4735,
        0.4073, 0.3615, 0.3559, 0.3913, 0.4667, 0.5621, 0.6461, 0.7030, 0.7202,
        0.7061, 0.6570, 0.5732, 0.4780, 0.3900, 0.3353, 0.3184, 0.3438, 0.4079,
        0.4936, 0.5760, 0.6204, 0.6218, 0.5804, 0.5087, 0.4384, 0.3935, 0.3855,
        0.4119, 0.4582, 0.5110, 0.5510, 0.5716, 0.5718, 0.5569, 0.5388, 0.5202,
        0.5140, 0.5222, 0.5411, 0.5680, 0.5872, 0.5868, 0.5599, 0.5009, 0.4232,
        0.3503, 0.3091, 0.3084, 0.3538, 0.4377, 0.5303, 0.6024, 0.6295, 0.6133,
        0.5619, 0.4942, 0.4333, 0.3954, 0.3897, 0.4099, 0.4527, 0.5052, 0.5516,
        0.5841, 0.5913, 0.5733, 0.5360, 0.4865, 0.4438, 0.4207, 0.4274, 0.4616,
        0.5082, 0.5488, 0.5582, 0.5297, 0.4690, 0.3972, 0.3467, 0.3321, 0.3583,
        0.4223, 0.5072, 0.5935, 0.6520, 0.6777, 0.6678, 0.6242, 0.5574, 0.4826,
        0.4245], device='cuda:0', grad_fn=<SelectBackward0>)

After training:

Input: tensor([0.4992, 0.4980, 0.5664, 0.5931, 0.6129, 0.6245, 0.6198, 0.5699, 0.4931,
        0.4925, 0.4626, 0.4116, 0.3559, 0.3145, 0.3470, 0.4358, 0.5450, 0.6205,
        0.6392, 0.6169, 0.5601, 0.4736, 0.4324, 0.3986, 0.3871, 0.3342, 0.3693,
        0.4783, 0.5535, 0.6388, 0.6912, 0.6931, 0.6628, 0.6182, 0.5680, 0.5506,
        0.5266, 0.4951, 0.4773, 0.5055, 0.4809, 0.4301, 0.4123, 0.4608, 0.4991,
        0.5286, 0.5015, 0.4570, 0.4586, 0.5149, 0.5658, 0.6096, 0.5619, 0.5430,
        0.4721, 0.4153, 0.4204, 0.4200, 0.4546, 0.5500, 0.6355, 0.7143, 0.7294,
        0.7051, 0.6539, 0.5768, 0.4502, 0.3775, 0.3197, 0.3129, 0.3145, 0.3578,
        0.4287, 0.5072, 0.5717, 0.5766, 0.5901, 0.5387, 0.4811, 0.3947, 0.3430,
        0.3755, 0.4364, 0.5128, 0.5465, 0.5489, 0.5900, 0.6103, 0.6149, 0.6073,
        0.5876, 0.5211, 0.4783, 0.4792, 0.4736, 0.4787, 0.4713, 0.4335, 0.3832,
        0.3183])

Output: tensor([0.5465, 0.5463, 0.5463, 0.5464, 0.5465, 0.5463, 0.5463, 0.5462, 0.5460,
        0.5463, 0.5464, 0.5462, 0.5462, 0.5461, 0.5459, 0.5463, 0.5465, 0.5462,
        0.5462, 0.5462, 0.5460, 0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459,
        0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459, 0.5462, 0.5466, 0.5462,
        0.5462, 0.5462, 0.5459, 0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459,
        0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459, 0.5462, 0.5466, 0.5462,
        0.5462, 0.5462, 0.5459, 0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459,
        0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459, 0.5462, 0.5466, 0.5462,
        0.5462, 0.5462, 0.5459, 0.5462, 0.5466, 0.5462, 0.5462, 0.5461, 0.5459,
        0.5462, 0.5466, 0.5462, 0.5462, 0.5462, 0.5459, 0.5462, 0.5465, 0.5462,
        0.5461, 0.5461, 0.5459, 0.5461, 0.5465, 0.5462, 0.5461, 0.5461, 0.5460,
        0.5461], device='cuda:0') 

What could be causing this? I'm completely stumped!
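
For context, the snippet below is a minimal, generic sketch of how a train/eval discrepancy of this kind can be exposed; the small BatchNorm CNN is a placeholder, not my actual network:

    import torch
    import torch.nn as nn

    # Placeholder model, not the actual network from the question: a small
    # CNN with BatchNorm, whose behavior differs between train and eval mode.
    model = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1),
        nn.BatchNorm2d(8),
        nn.ReLU(),
        nn.Conv2d(8, 1, 3, padding=1),
    )

    x = torch.rand(4, 1, 10, 10)  # stand-in batch of 10x10 grids

    model.train()
    out_train = model(x)          # BatchNorm uses per-batch statistics

    model.eval()
    with torch.no_grad():
        out_eval = model(x)       # BatchNorm uses running statistics instead

    # A large gap here points at a train/eval (running-statistics) mismatch.
    print((out_train - out_eval).abs().max())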

pytorch neural-network conv-neural-network
1 Answer

My guess is that this has to do with a preprocessing step applied during training that is being skipped at inference time. Could you share your script here?
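
In the meantime, a common culprit for exactly this symptom is an inference path that does not mirror the training path: missing model.eval(), missing torch.no_grad(), or skipping the input normalization that the training DataLoader applied. Below is a minimal sketch of a consistent inference path; the model and the mean/std constants are placeholders, not your actual values:

    import torch
    import torch.nn as nn

    # Placeholder model and normalization constants; substitute whatever
    # was actually used during training.
    model = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1),
        nn.BatchNorm2d(8),
        nn.ReLU(),
        nn.Conv2d(8, 1, 3, padding=1),
    )
    mean, std = 0.5, 0.25           # must match the training-time statistics

    raw = torch.rand(1, 1, 10, 10)  # one grid of time-series data

    model.eval()                    # switch BatchNorm/Dropout to inference mode
    with torch.no_grad():
        x = (raw - mean) / std      # reapply the training normalization
        pred = model(x)

    # If normalization is skipped here but was applied during training, the
    # network sees inputs far outside the distribution it learned, which can
    # collapse the output to a near-constant value.
    print(pred.flatten()[:9])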
