Understanding metric calculation in Keras


I am trying to implement a true-positive metric in Keras:

from keras import backend as K

def TP(y_true, y_pred):
    estimated = K.argmax(y_pred, axis=1)
    truth = K.argmax(y_true, axis=1)
    TP = K.sum(truth * estimated)
    return TP

This is based on my last layer's output shape: (batch, 2). The function has been tested against a numpy argmax equivalent and works fine.

I use a cross-entropy loss function, and a metric value is reported at each epoch. But how can this value be a decimal number? What am I doing wrong? Thanks!

EDIT: here is sample code for the Keras model:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras import backend as K

def TP(y_true, y_pred):
    # Count samples where both the label and the prediction select class 1
    estimated = K.argmax(y_pred, axis=1)
    truth = K.argmax(y_true, axis=1)
    TP = K.sum(truth * estimated)
    return TP

epochs = 10
batch_size = 2

model = Sequential([
        Dense(32, input_shape=(4,)),
        Activation('relu'),
        Dense(2),
        Activation('softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy', TP])

model.summary()

train = np.array([[17,0,1,0],[17,0,1,0],[17,0,1,0],[17,0,1,0],[17,0,1,0], [2,1,0,1],[0,1,0,1],[0,1,0,1],[0,1,0,1],[0,1,0,1]])
labels = np.array([ [1,0],[1,0],[1,0],[1,0],[1,0], [0,1],[0,1],[0,1],[0,1],[0,1] ])

model.fit(train, labels, epochs=epochs, batch_size=batch_size, verbose=2)

Here is the test showing that the TP function appears to work:

import numpy as np

def npTP(y_true, y_pred):
    estimated = np.argmax(y_pred, axis=1)
    truth = np.argmax(y_true, axis=1)
    TP = np.sum(truth * estimated)
    return TP

y_true = np.array([ [1,0],[1,0],[1,0],[1,0],[1,0], [0,1],[0,1],[0,1],[0,1],[0,1] ])
y_pred = np.array([ [0,1],[0,1],[0,1],[0,1],[0,1], [0,1],[0,1],[0,1],[0,1],[0,1]])
print("np check : ")
print(npTP(y_true, y_pred))

Running this code gives the following output:

Using TensorFlow backend.

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 32)                160       
_________________________________________________________________
activation_1 (Activation)    (None, 32)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 2)                 66        
_________________________________________________________________
activation_2 (Activation)    (None, 2)                 0         
=================================================================
Total params: 226
Trainable params: 226
Non-trainable params: 0
_________________________________________________________________
Epoch 1/10
 - 0s - loss: 0.3934 - acc: 0.6000 - TP: 0.2000
Epoch 2/10                           ^^^^^^^^^^ here are the decimal values
 - 0s - loss: 0.3736 - acc: 0.6000 - TP: 0.2000
Epoch 3/10                           ^^^^^^^^^^
 - 0s - loss: 0.3562 - acc: 0.6000 - TP: 0.2000
Epoch 4/10                           ^^^^^^^^^^
 - 0s - loss: 0.3416 - acc: 0.7000 - TP: 0.4000
Epoch 5/10                           ^^^^^^^^^^
 - 0s - loss: 0.3240 - acc: 1.0000 - TP: 1.0000
Epoch 6/10
 - 0s - loss: 0.3118 - acc: 1.0000 - TP: 1.0000
Epoch 7/10
 - 0s - loss: 0.2960 - acc: 1.0000 - TP: 1.0000
Epoch 8/10
 - 0s - loss: 0.2806 - acc: 1.0000 - TP: 1.0000
Epoch 9/10
 - 0s - loss: 0.2656 - acc: 1.0000 - TP: 1.0000
Epoch 10/10
 - 0s - loss: 0.2535 - acc: 1.0000 - TP: 1.0000

np check : 
5

Thanks!

1 Answer

As desertnaut pointed out, the answer is explained in this thread:

Keras keeps a running average of the metric over the batches within each epoch.

Here, with batch_size=2 and 10 samples, each epoch runs 5 training steps (10/2 = 5).

To make sense of the metric reported for epoch 1: the total TP count across the 5 steps must have been 1, so the averaged metric gives 1/5 = 0.2. Epoch 4 had 2 TPs over its 5 steps, giving 2/5 = 0.4 in the metric.
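This averaging can be reproduced in plain numpy. The sketch below assumes a hypothetical epoch-1 scenario in which exactly one of the 10 samples is a true positive (the actual per-batch predictions during training are not shown in the question); it splits the data into batches of 2, computes the TP metric per batch, and averages the per-batch values the way Keras does:

```python
import numpy as np

def np_tp(y_true, y_pred):
    # Count samples where both the label and the prediction select class 1
    return np.sum(np.argmax(y_true, axis=1) * np.argmax(y_pred, axis=1))

# Hypothetical epoch-1 state: only the last sample is a true positive
y_true = np.array([[1, 0]] * 5 + [[0, 1]] * 5)
y_pred = np.array([[1, 0]] * 9 + [[0, 1]] * 1)

batch_size = 2
per_batch_tp = [
    np_tp(y_true[i:i + batch_size], y_pred[i:i + batch_size])
    for i in range(0, len(y_true), batch_size)
]
# Keras reports the mean of the per-batch metric values, not their sum
epoch_metric = np.mean(per_batch_tp)
print(per_batch_tp)   # [0, 0, 0, 0, 1]
print(epoch_metric)   # 0.2
```

The per-batch metric is always an integer here, but the reported epoch value is the mean over 5 batches, which is why a count-style metric shows up as a decimal.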
