Error when using K.function together with K.gradients


I'm writing a test model with Keras, and I want to do some math that depends on the numerical output of a layer and its derivatives.

I'm using the TensorFlow backend. I use K.function to get the output values of the Lambda layer and of the derivative layers. However, if I choose the function inside the Lambda layer to be a power function, x ** 2, I get a strange error. If I change x ** 2 to sin(x), it works fine.

import numpy as np
from keras.models import Model
from keras.layers import Input, Layer, Lambda
from keras import backend as K

x = Input(shape=(1,))

# the Lambda layer 
c = Lambda(lambda x: x**2)(x)     # this will cause an error
#c = Lambda(lambda x: K.sin(x))(x) # but this works fine


# custom layer whose output is the gradient of its second input w.r.t. its first
class dc_layer(Layer):

    def __init__(self,*args,**kwargs):
        self.is_placeholder = True
        super(dc_layer, self).__init__(*args,**kwargs)

    def call(self,inputs):
        x = inputs[0]
        c0 = inputs[1]
        c1 = K.gradients(c0,x)   # K.gradients returns a list of gradient tensors
        return c1

# the derivatives of the lambda layer    
c1 = dc_layer()([x,c])
c2 = dc_layer()([x,c1])

Then I use backend.function to define a function that returns the layer outputs:

# define a function to get the derivatives
get_layer_outputs = K.function([x],[c2])

x_data = np.linspace(0,1,6)
val = get_layer_outputs([x_data])[0]
print(val)

I get the following error message in the Jupyter notebook:

InvalidArgumentError: data[0].shape = [1] does not start with indices[0].shape = [2]

which traces back to

---> 36 val = get_layer_outputs([x_data])[0]

However, if I instead look at the c1 layer:

# define a function to get the derivatives
get_layer_outputs = K.function([x],[c1])

x_data = np.linspace(0,1,6)
val = get_layer_outputs([x_data])[0]
print(val)

it works fine.

I guess something is going wrong when I use K.function. Any solution/suggestion would be appreciated.
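For reference, printing the shapes involved (just a quick check, assuming the code above has already been run) gives:

# quick shape check (assumes the model code above has been run)
print(K.int_shape(x))    # (None, 1) - the placeholder expects a 2-D (batch, feature) array
print(np.shape(x_data))  # (6,)      - a plain 1-D array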

======================================================

Additional question:

Even when I try a very simple piece of code, I get an error when using K.function, as shown below:

from keras.layers import Dense

x = Input(shape=(1,))
h = Dense(10,activation='sigmoid')(x)
c = Dense(1)(h)

get_layer_outputs = K.function([x],[c])

x_data = np.linspace(0,1,6)
val = get_layer_outputs([x_data])[0]
print(val)

I get:

InvalidArgumentError: In[0] is not a matrix
     [[Node: dense_24/MatMul = MatMul[T=DT_FLOAT, transpose_a=false, transpose_b=false, _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_19_0_0, dense_24/kernel/read)]]

Now I'm confused about how to use K.function correctly. If you have any ideas, please help. Thanks!

tensorflow keras jupyter-notebook python-3.5
1 Answer

This works for me - your x_data was missing the feature dimension (the model expects input of shape (batch_size, 1), but np.linspace(0,1,6) gives shape (6,)):

import numpy as np

from keras.models import Model
from keras.layers import Input, Layer, Lambda, Dense
from keras import backend as K

x = Input(shape=(1,))

# the Lambda layer 
c = Lambda(lambda x: x**2)(x)     # the x**2 Lambda that caused the error
#c = Lambda(lambda x: K.sin(x))(x) # the sin(x) alternative


class dc_layer(Layer):

    def __init__(self,*args,**kwargs):
        self.is_placeholder = True
        super(dc_layer, self).__init__(*args,**kwargs)

    def call(self,inputs):
        x = inputs[0]
        c0 = inputs[1]
        c1 = K.gradients(c0,x)
        return c1

# the derivatives of the lambda layer    
c1 = dc_layer()([x,c])  # in Keras 2.0.2 you need to unpack the result; Keras 2.2.4 seems fine
c2 = dc_layer()([x,c1])

# define a function to get the derivatives
get_layer_outputs = K.function([x],[c2])

x_data = np.linspace(0,1,6)[:,None] # add the feature axis: shape (6, 1) instead of (6,)
val = get_layer_outputs([x_data])[0]
print(val)

Output (as expected, the second derivative of x**2 is 2):

[[2.]
 [2.]
 [2.]
 [2.]
 [2.]
 [2.]]
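
The same reshape should also fix the "In[0] is not a matrix" error from the additional question: Dense boils down to a MatMul, which needs a 2-D (batch, features) input. A minimal sketch, reusing the imports above:

x = Input(shape=(1,))
h = Dense(10, activation='sigmoid')(x)
c = Dense(1)(h)

get_layer_outputs = K.function([x], [c])

x_data = np.linspace(0, 1, 6)[:, None]  # shape (6, 1), so MatMul gets a matrix
val = get_layer_outputs([x_data])[0]
print(val)                              # (6, 1) array of (untrained) network outputs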