How do I fix "AttributeError: 'NoneType' object has no attribute '_inbound_nodes'" when creating an LSTM model with Manhattan distance?

Problem description · votes: 0 · answers: 1

I am trying to build a Manhattan LSTM (MaLSTM) model (e.g. https://medium.com/mlreview/implementing-malstm-on-kaggles-quora-question-pairs-competition-8b31b0b16a07) that returns a similarity score for two sentences. I used the Quora Question Pairs dataset and generated embeddings for the questions with google-bert. Now I want to build an LSTM model like the one in that article and use it, but I get the following error:

Using TensorFlow backend.
(100000, 1, 768)
(100000, 1, 768)
(100000,)
(100000, 100)
Traceback (most recent call last):
  File "train_model_manhattan.py", line 151, in <module>
    model = Model(inputs=[inp1,inp2], outputs=[malstm_distance])
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 93, in __init__
    self._init_graph_network(*args, **kwargs)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 231, in _init_graph_network
    self.inputs, self.outputs)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1366, in _map_graph_network
    tensor_index=tensor_index)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1353, in build_map
    node_index, tensor_index)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1353, in build_map
    node_index, tensor_index)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1325, in build_map
    node = layer._inbound_nodes[node_index]
AttributeError: 'NoneType' object has no attribute '_inbound_nodes'

Here is what I have tried so far. Note that each returned embedding has shape (768), i.e. a vector of 768 values like [1.2e+05 2.7e-01 7.8 ... 8.9].


print(np.shape(train_vec1)) => (100000, 1, 768)
print(np.shape(train_vec2))  => (100000, 1, 768)
print(np.shape(train_label)) => (100000,)
#################################################
def exponent_neg_manhattan_distance(left, right):
    return np.exp(-np.sum(np.abs(left-right), axis=1, keepdims=True))

def manhattan_distance(left, right):
    ''' Helper function for the similarity estimate of the LSTMs outputs'''
    print(np.shape(left))
    return K.sum(K.abs(left - right), axis=1, keepdims=True)    
#################################################

import keras
from keras.layers import Input, LSTM, Dense
from keras.models import Model


inp1= Input(shape=(768,))
inp2= Input(shape=(768,))


x = keras.layers.concatenate([inp1, inp2],axis=-1)
x = Dense(1024, activation='relu')(x)
x = Dropout(0.5) (x)
x = Dense(256, activation='relu')(x)
x = Dropout(0.5) (x)
x = Dense(64, activation='relu')(x)
out=Dense(1)(x)


# Since this is a siamese network, both sides share the same LSTM
shared_lstm = LSTM(100)

left_output = shared_lstm(train_vec1_tensor)
right_output = shared_lstm(train_vec2_tensor)

# Calculates the distance as defined by the MaLSTM model
malstm_distance = Lambda(function=lambda x: manhattan_distance(x[0], x[1]),output_shape=lambda x: (x[0][0], 1))([left_output, right_output])

#######################
# Getting error when code flow reaches the following line
#######################
model = Model(inputs=[inp1,inp2], outputs=[malstm_distance])


Here is my entire code:


import os
data_file='quora_duplicate_questions.tsv'
# 0 means don't load, 1 means fetch from file
LOAD_ENCODING_FROM_FILE=1 
encoding_data_file_quest1='encoding_quest1'
encoding_data_file_quest2='encoding_quest2'
encoding_data_file_label='quest_label'

#################################################
import numpy as np
import pandas as pd
import tensorflow as tf
import re
from bert_serving.client import BertClient
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
import numpy as np
import pickle
from keras import models
from keras import layers
from keras import optimizers
from keras.layers import Dropout
from keras import backend as K
from keras.layers import Lambda
#################################################
maxlen = 125  # We will cut questions after 125 words

# The next step is to transform all sentences to fixed-length encodings using BERT embeddings
# [0.1 0.4 0.4] [0.9 0.6 0.1] 2.4
# [0.4 0.1 0.3] [0.5 0.6 0.1] 1.0

# Load the encodings from file if available
if LOAD_ENCODING_FROM_FILE == 1:
    with open(encoding_data_file_quest1, "rb") as fp:
        vec1=pickle.load(fp)
    with open(encoding_data_file_quest2, "rb") as fp:   
        vec2=pickle.load(fp)
    with open(encoding_data_file_label, "rb") as fp: 
        label=pickle.load(fp)


train_vec1 = np.asarray(vec1, np.float32)
train_vec2 = np.asarray(vec2, np.float32)

train_vec1 = train_vec1.reshape((100000,1,768))
train_vec2 = train_vec2.reshape((100000,1,768))


train_vec1_tensor = K.cast(train_vec1,dtype='float32')
train_vec2_tensor = K.cast(train_vec2,dtype='float32')

train_label = np.asarray(label,np.float32)
print(np.shape(train_vec1))
print(np.shape(train_vec2))
print(np.shape(train_label))
#################################################
def exponent_neg_manhattan_distance(left, right):
    return np.exp(-np.sum(np.abs(left-right), axis=1, keepdims=True))

def manhattan_distance(left, right):
    ''' Helper function for the similarity estimate of the LSTMs outputs'''
    print(np.shape(left))
    return K.sum(K.abs(left - right), axis=1, keepdims=True)    
#################################################

import keras
from keras.layers import Input, LSTM, Dense
from keras.models import Model


inp1= Input(shape=(768,))
inp2= Input(shape=(768,))


x = keras.layers.concatenate([inp1, inp2],axis=-1)
x = Dense(1024, activation='relu')(x)
x = Dropout(0.5) (x)
x = Dense(256, activation='relu')(x)
x = Dropout(0.5) (x)
x = Dense(64, activation='relu')(x)
out=Dense(1)(x)


# Since this is a siamese network, both sides share the same LSTM
shared_lstm = LSTM(100)

left_output = shared_lstm(train_vec1_tensor)
right_output = shared_lstm(train_vec2_tensor)

# Calculates the distance as defined by the MaLSTM model
malstm_distance = Lambda(function=lambda x: manhattan_distance(x[0], x[1]),output_shape=lambda x: (x[0][0], 1))([left_output, right_output])

#######################
# Getting error when code flow reaches the following line
#######################
model = Model(inputs=[inp1,inp2], outputs=[malstm_distance])

model.summary()
optimizer = optimizers.Adadelta(clipnorm=gradient_clipping_norm)  # gradient_clipping_norm is never defined in this script; the linked article uses 1.25
model.compile(optimizer,
              loss='mean_squared_error',
              metrics=['accuracy'])

history=model.fit([train_vec1, train_vec2], train_label, 
    epochs=30,batch_size=200,
    validation_split=0.2)

I want the model to take the two embeddings, compute the Manhattan distance between them, and return that distance.
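
In other words, the intended usage would look roughly like this (a sketch; the output shape assumes the (100000, 1, 768) inputs above):

distances = model.predict([train_vec1, train_vec2])  # expected shape (100000, 1): one Manhattan distance per question pair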

keras nlp lstm quora
1 Answer

1 vote

left_output and right_output come from the LSTM layer. The inputs are fed into the Input layers and then pass through a series of Dense layers. Note, however, that there is no connection between the group of Dense layers and the LSTM: the Model is asked to produce the LSTM layer's output, which is impossible because the LSTM is never connected to the Input layers. The keras.layers.concatenate line should use the outputs of shared_lstm instead of using the Input layers' outputs directly, like this:

keras.layers.concatenate([left_output, right_output],axis=-1)

Only then can this be a Siamese network.
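
For completeness, here is a minimal sketch of one way to wire the graph so that Model can trace a path from the Input layers to malstm_distance. It keeps the MaLSTM Lambda distance from the question rather than the concatenate route, and it assumes Keras 2.x with the TensorFlow backend and BERT vectors reshaped to (samples, 1, 768). The key change is that shared_lstm consumes Input layers instead of the tensors produced by K.cast, and the unused Dense stack is dropped:

from keras import optimizers
from keras import backend as K
from keras.layers import Input, LSTM, Lambda
from keras.models import Model

def manhattan_distance(left, right):
    '''Manhattan distance between the two LSTM encodings'''
    return K.sum(K.abs(left - right), axis=1, keepdims=True)

# Each input carries one timestep of a 768-dimensional BERT vector,
# matching the (100000, 1, 768) arrays built earlier
inp1 = Input(shape=(1, 768))
inp2 = Input(shape=(1, 768))

# Both branches share the same LSTM, as in a Siamese network
shared_lstm = LSTM(100)
left_output = shared_lstm(inp1)   # connected to inp1, so the graph is traceable
right_output = shared_lstm(inp2)  # connected to inp2

# Manhattan distance between the two encodings
malstm_distance = Lambda(
    lambda x: manhattan_distance(x[0], x[1]),
    output_shape=lambda shapes: (shapes[0][0], 1))([left_output, right_output])

model = Model(inputs=[inp1, inp2], outputs=[malstm_distance])
model.summary()

model.compile(optimizers.Adadelta(clipnorm=1.25),  # 1.25 is the value used in the linked article
              loss='mean_squared_error',
              metrics=['accuracy'])

# fit() takes the raw numpy arrays directly; no K.cast is needed anywhere
# model.fit([train_vec1, train_vec2], train_label,
#           epochs=30, batch_size=200, validation_split=0.2)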
