I am learning about neural networks and I want to write a function cross_entropy in python, where cross entropy is defined as

$$H(T, P) = -\frac{1}{N}\sum_{i=1}^{N}\sum_{j=1}^{k} t_{i,j}\,\log(p_{i,j})$$

where $N$ is the number of samples, $k$ is the number of classes, $\log$ is the natural logarithm, $t_{i,j}$ is 1 if sample $i$ is in class $j$ and 0 otherwise, and $p_{i,j}$ is the predicted probability that sample $i$ is in class $j$. To avoid numerical issues with the logarithm, the predictions are clipped to the range $[10^{-12}, 1 - 10^{-12}]$.

Following the description above, I wrote the code below: it clips the predictions to the range [epsilon, 1 - epsilon] and then computes the cross entropy according to the formula above.
import numpy as np

def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions.
    Input: predictions (N, k) ndarray
           targets (N, k) ndarray
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    ce = - np.mean(np.log(predictions) * targets)
    return ce
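(The clipping step only moves entries away from exactly 0 and 1 and leaves everything else untouched, so np.log never receives a 0. A minimal illustration, not part of the original question:)

p = np.array([0.0, 0.5, 1.0])
# 0.0 is raised to 1e-12 and 1.0 is lowered to 1 - 1e-12; 0.5 passes through.
print(np.clip(p, 1e-12, 1. - 1e-12))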
The following code will be used to check whether the function cross_entropy is correct.
predictions = np.array([[0.25,0.25,0.25,0.25],
[0.01,0.01,0.01,0.96]])
targets = np.array([[0,0,0,1],
[0,0,0,1]])
ans = 0.71355817782 #Correct answer
x = cross_entropy(predictions, targets)
print(np.isclose(x,ans))
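For reference, the expected value ans can be worked out by hand: only the true-class probabilities (0.25 for the first sample, 0.96 for the second) contribute to the double sum, so

$$H(T, P) = -\frac{1}{2}\bigl(\log 0.25 + \log 0.96\bigr) \approx -\frac{1}{2}\bigl(-1.3863 - 0.0408\bigr) \approx 0.71356$$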
The output of the above code is False, which is to say that my code defining the function cross_entropy is not correct. I then printed the result of cross_entropy(predictions, targets). It gave 0.178389544455, while the correct result should be ans = 0.71355817782. Could somebody help me check what is wrong with my code?
You're not that far off at all, but remember that you are taking the average value of N sums, where N = 2 (in this case). So your code could read:
def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions.
    Input: predictions (N, k) ndarray
           targets (N, k) ndarray
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    N = predictions.shape[0]
    ce = -np.sum(targets*np.log(predictions+1e-9))/N
    return ce
predictions = np.array([[0.25,0.25,0.25,0.25],
[0.01,0.01,0.01,0.96]])
targets = np.array([[0,0,0,1],
[0,0,0,1]])
ans = 0.71355817782 #Correct answer
x = cross_entropy(predictions, targets)
print(np.isclose(x,ans))
Here, I think it is a little clearer if you stick with np.sum(). Also, I added 1e-9 inside np.log() to avoid the possibility of having a log(0) in your computation. Hope this helps!
NOTE: As per @Peter's comment, the offset of 1e-9 is indeed redundant if your epsilon value is greater than 0.
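To see exactly where the wrong value 0.178389544455 came from: np.mean divides by the total number of entries $N \cdot k = 8$ of the product array rather than by $N = 2$, so the buggy result is smaller by a factor of $k = 4$. A quick check (my addition, reusing the predictions and targets defined above):

total = -np.sum(targets * np.log(np.clip(predictions, 1e-12, 1. - 1e-12)))
print(total / 8)  # 0.178389... -- what the buggy np.mean version produced
print(total / 2)  # 0.713558... -- dividing by N only, as the formula requires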
import numpy as np

def cross_entropy(x, y):
    """ Computes cross entropy between two distributions.
    Input: x: iterable of N non-negative values
           y: iterable of N non-negative values
    Returns: scalar
    """
    # Convert first so the negativity checks also work on plain lists.
    x = np.array(x, dtype=float)
    y = np.array(y, dtype=float)
    if np.any(x < 0) or np.any(y < 0):
        raise ValueError('Negative values exist.')

    # Force to proper probability mass functions.
    x /= np.sum(x)
    y /= np.sum(y)

    # Ignore zero 'y' elements (they would make log(y) blow up).
    mask = y > 0
    x = x[mask]
    y = y[mask]
    ce = -np.sum(x * np.log(y))
    return ce
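For instance (my example, not part of the original answer), applied to a single row of the data from the question this reproduces the per-sample term $-\log 0.25$:

print(cross_entropy([0, 0, 0, 1], [0.25, 0.25, 0.25, 0.25]))  # 1.3862943... == -log(0.25)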
from scipy.stats import entropy, truncnorm

def cross_entropy_via_scipy(x, y):
    ''' SEE: https://en.wikipedia.org/wiki/Cross_entropy '''
    # Cross entropy = entropy of x + KL divergence from x to y.
    return entropy(x) + entropy(x, y)

x = truncnorm.rvs(0.1, 2, size=100)
y = truncnorm.rvs(0.1, 2, size=100)
print(np.isclose(cross_entropy(x, y), cross_entropy_via_scipy(x, y)))
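The scipy-based check relies on the standard decomposition of cross entropy into entropy plus KL divergence,

$$H(x, y) = H(x) + D_{\mathrm{KL}}(x \,\|\, y),$$

which is exactly what scipy.stats.entropy computes: entropy(x) returns $H(x)$ and entropy(x, y) returns $D_{\mathrm{KL}}(x \,\|\, y)$, with both calls normalising their inputs to sum to 1 first.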