ELMo Embedding Layer with Keras


I have been using Keras' default Embedding layer with pre-trained word embeddings in my architecture. The architecture looks like this:

from keras.layers import Input, Embedding, LSTM

left_input = Input(shape=(max_seq_length,), dtype='int32')
right_input = Input(shape=(max_seq_length,), dtype='int32')

embedding_layer = Embedding(len(embeddings), embedding_dim, weights=[embeddings], input_length=max_seq_length,
                            trainable=False)

# Embedded version of the inputs
encoded_left = embedding_layer(left_input)
encoded_right = embedding_layer(right_input)

# Since this is a siamese network, both sides share the same LSTM
shared_lstm = LSTM(n_hidden, name="lstm")

left_output = shared_lstm(encoded_left)
right_output = shared_lstm(encoded_right)
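
For context, here is a minimal sketch of how the two outputs are typically merged in such a siamese setup (this merge step is assumed; the question does not show it). The exponentiated Manhattan distance is the usual MaLSTM choice:

from keras.layers import Lambda
from keras.models import Model
from keras import backend as K

# Assumed merge step: similarity = exp(-L1 distance) between the two encodings
malstm_distance = Lambda(
    lambda t: K.exp(-K.sum(K.abs(t[0] - t[1]), axis=1, keepdims=True))
)([left_output, right_output])

model = Model([left_input, right_input], malstm_distance)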

I want to replace the Embedding layer with ELMo embeddings, so I used the custom embedding layer found in this repo: https://github.com/strongio/keras-elmo/blob/master/Elmo%20Keras.ipynb. The embedding layer looks like this:

import tensorflow as tf
import tensorflow_hub as hub
from keras import backend as K
from keras.layers import Layer

class ElmoEmbeddingLayer(Layer):
    def __init__(self, **kwargs):
        self.dimensions = 1024
        self.trainable = True
        super(ElmoEmbeddingLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        self.elmo = hub.Module('https://tfhub.dev/google/elmo/2', trainable=self.trainable,
                               name="{}_module".format(self.name))

        self.trainable_weights += K.tf.trainable_variables(scope="^{}_module/.*".format(self.name))
        super(ElmoEmbeddingLayer, self).build(input_shape)

    def call(self, x, mask=None):
        # 'default' returns one pooled vector per input string
        result = self.elmo(K.squeeze(K.cast(x, tf.string), axis=1),
                           as_dict=True,
                           signature='default',
                           )['default']
        return result

    def compute_mask(self, inputs, mask=None):
        return K.not_equal(inputs, '--PAD--')

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.dimensions)

I changed the architecture to use the new embedding layer:

from keras.layers import Input, GRU

# The visible layer
left_input = Input(shape=(1,), dtype="string")
right_input = Input(shape=(1,), dtype="string")

embedding_layer = ElmoEmbeddingLayer()

# Embedded version of the inputs
encoded_left = embedding_layer(left_input)
encoded_right = embedding_layer(right_input)

# Since this is a siamese network, both sides share the same GRU
# (named "lstm", which is why the error below refers to "layer lstm")
shared_gru = GRU(n_hidden, name="lstm")

left_output = shared_gru(encoded_left)
right_output = shared_gru(encoded_right)

But I get this error:

ValueError: Input 0 is incompatible with layer lstm: expected ndim=3, found ndim=2

What am I doing wrong here?

python keras deep-learning lstm word-embedding
Answers

2 votes

The ELMo embedding layer outputs one embedding per input string (so the output shape is (batch_size, dim)), while your LSTM expects a sequence (that is, shape (batch_size, seq_length, dim)). I don't think it makes much sense to put an LSTM layer after an ELMo embedding layer anyway, since ELMo already uses an LSTM to embed the sequence of words.
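
To see the shape mismatch concretely, here is a minimal sketch (TF1-style hub.Module, matching the question's code) comparing the module's two relevant outputs; the example sentence is just a placeholder:

import tensorflow as tf
import tensorflow_hub as hub

elmo = hub.Module("https://tfhub.dev/google/elmo/2", trainable=False)
sentences = tf.constant(["the cat sat on the mat"])
outputs = elmo(sentences, signature="default", as_dict=True)

print(outputs["default"])  # (batch_size, 1024): one pooled vector per sentence, 2-D
print(outputs["elmo"])     # (batch_size, max_length, 1024): per-token embeddings, 3-D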


1 vote

I also used that repo as a guide to build a CustomELMo + BiLSTM + CRF model, and I needed to change the dict lookup to 'elmo' instead of 'default'. As Anna Krogager pointed out, when the dict lookup is 'default' the output is (batch_size, dim), which is not enough dimensions for the LSTM. However, when the dict lookup is ['elmo'] the layer returns a tensor of the right dimensions, namely of shape (batch_size, max_length, 1024).

The custom ELMo layer:

class ElmoEmbeddingLayer(Layer):
    def __init__(self, **kwargs):
        self.dimensions = 1024
        self.trainable = True
        super(ElmoEmbeddingLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        self.elmo = hub.Module('https://tfhub.dev/google/elmo/2', trainable=self.trainable,
                               name="{}_module".format(self.name))

        self.trainable_weights += K.tf.trainable_variables(scope="^{}_module/.*".format(self.name))
        super(ElmoEmbeddingLayer, self).build(input_shape)

    def call(self, x, mask=None):
        # 'elmo' (rather than 'default') returns per-token embeddings,
        # shape (batch_size, max_length, 1024)
        result = self.elmo(K.squeeze(K.cast(x, tf.string), axis=1),
                           as_dict=True,
                           signature='default',
                           )['elmo']
        print(result)
        return result

    # def compute_mask(self, inputs, mask=None):
    #     return K.not_equal(inputs, '__PAD__')

    def compute_output_shape(self, input_shape):
        # 48 is the hard-coded max sequence length used for this model
        return input_shape[0], 48, self.dimensions

And the model is built as follows:

from keras import layers
from keras.models import Model
from keras.layers import Bidirectional, LSTM
from keras.metrics import categorical_accuracy, mean_squared_error
from keras_contrib.layers import CRF
from keras_contrib.losses import crf_loss
from keras_contrib.metrics import crf_accuracy

def build_model():  # uses crf from keras_contrib
    input = layers.Input(shape=(1,), dtype=tf.string)
    model = ElmoEmbeddingLayer(name='ElmoEmbeddingLayer')(input)
    model = Bidirectional(LSTM(units=512, return_sequences=True))(model)
    crf = CRF(num_tags)
    out = crf(model)
    model = Model(input, out)
    model.compile(optimizer="rmsprop", loss=crf_loss,
                  metrics=[crf_accuracy, categorical_accuracy, mean_squared_error])
    model.summary()
    return model
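
A hypothetical usage sketch (the padding helper and example data are assumed, not part of the answer): since the ELMo 'default' signature tokenizes on whitespace, each sentence goes in as one string, padded with '__PAD__' tokens to the hard-coded length of 48:

import numpy as np

MAX_LENGTH = 48  # must match the 48 hard-coded in compute_output_shape

def pad_sentence(tokens):
    # Hypothetical helper: truncate/pad a token list to MAX_LENGTH,
    # then join into a single whitespace-separated string
    tokens = tokens[:MAX_LENGTH]
    tokens += ['__PAD__'] * (MAX_LENGTH - len(tokens))
    return ' '.join(tokens)

model = build_model()
X = np.array([[pad_sentence("EU rejects German call".split())]])  # shape (1, 1)
preds = model.predict(X)  # shape (1, 48, num_tags)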

I hope my code is useful to you, even though it's not exactly the same model. Note that I had to comment out the compute_mask method, because it throws:

InvalidArgumentError: Incompatible shapes: [32,47] vs. [32,0]    [[{{node loss/crf_1_loss/mul_6}}]]

where 32 is the batch size and 47 is one less than my specified max length (presumably meaning it accounts for the pad token itself). I haven't worked out the cause of that error yet, so it may well be fine for you and your model. However, I notice you're using a GRU, and there's an unresolved issue on the repo about adding GRUs. So I'm curious whether you've hit that issue too.
