Saving a trained neural network, Python 3.6

Problem description

I am learning neural networks with Python 3.6 and Jupyter. Like everyone (I think), I started from an example I found online, but I don't know why I cannot save the trained network. I am using this code:

fashion_model.save("fashion_model.h5py")

But I get this error:

    TypeError                                 Traceback (most recent call last)
<ipython-input-72-11379a0dd354> in <module>
      1 #FALLA NO SE POR QUE
      2 from keras.models import save_model
----> 3 fashion_model.save("fashion_model.h5py")

C:\Users\Javi\Anaconda3\lib\site-packages\keras\engine\network.py in save(self, filepath, overwrite, include_optimizer)
   1088             raise NotImplementedError
   1089         from ..models import save_model
-> 1090         save_model(self, filepath, overwrite, include_optimizer)
   1091 
   1092     def save_weights(self, filepath, overwrite=True):

C:\Users\Javi\Anaconda3\lib\site-packages\keras\engine\saving.py in save_model(model, filepath, overwrite, include_optimizer)
    380 
    381     try:
--> 382         _serialize_model(model, f, include_optimizer)
    383     finally:
    384         if opened_new_file:

C:\Users\Javi\Anaconda3\lib\site-packages\keras\engine\saving.py in _serialize_model(model, f, include_optimizer)
    112         layer_group['weight_names'] = weight_names
    113         for name, val in zip(weight_names, weight_values):
--> 114             layer_group[name] = val
    115     if include_optimizer and model.optimizer:
    116         if isinstance(model.optimizer, optimizers.TFOptimizer):

C:\Users\Javi\Anaconda3\lib\site-packages\keras\utils\io_utils.py in __setitem__(self, attr, val)
    216                            'Group with name "{}" exists.'.format(attr))
    217         if is_np:
--> 218             dataset = self.data.create_dataset(attr, val.shape, dtype=val.dtype)
    219             if not val.shape:
    220                 # scalar

C:\Users\Javi\Anaconda3\lib\site-packages\h5py\_hl\group.py in create_dataset(self, name, shape, dtype, data, **kwds)
    114         """
    115         with phil:
--> 116             dsid = dataset.make_new_dset(self, shape, dtype, data, **kwds)
    117             dset = dataset.Dataset(dsid)
    118             if name is not None:

C:\Users\Javi\Anaconda3\lib\site-packages\h5py\_hl\dataset.py in make_new_dset(parent, shape, dtype, data, chunks, compression, shuffle, fletcher32, maxshape, compression_opts, fillvalue, scaleoffset, track_times)
     97             dtype = data.dtype
     98         else:
---> 99             dtype = numpy.dtype(dtype)
    100         tid = h5t.py_create(dtype, logical=1)
    101 

TypeError: data type not understood

Does anyone know how to fix this? I would like to save the model together with the trained weights so that I can open it again in the future without retraining.

I also tried the following code, but it fails in the same way in the second part, where the weights are saved.

model_json = fashion_model.to_json()
with open("model.json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
fashion_model.save_weights("model.h5")
print("Saved model to disk")

I get the same error there.

Thank you.

python tensorflow keras neural-network conv-neural-network
1 Answer

I cannot reproduce your issue.

Your code has an indentation problem. You can serialize the model to JSON as follows:

model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)

# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

You can then load the JSON and recreate the model:

from keras.models import model_from_json

# load the model architecture from JSON
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)

# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

Here I ran a simple model, saved it with model.save, and loaded it back in Keras with load_model. You can download the dataset from here.

Build and save the model:

import numpy as np
from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense

# load pima indians dataset
dataset = np.loadtxt("/content/pima-indians-diabetes.csv", delimiter=",")

# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]

# define model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Model Summary
model.summary()

# Fit the model
model.fit(X, Y, epochs=150, batch_size=10, verbose=0)

# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# save model and architecture to single file
model.save("model.h5")
print("Saved model to disk")

Output:

Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_4 (Dense)              (None, 12)                108       
_________________________________________________________________
dense_5 (Dense)              (None, 8)                 104       
_________________________________________________________________
dense_6 (Dense)              (None, 1)                 9         
=================================================================
Total params: 221
Trainable params: 221
Non-trainable params: 0
_________________________________________________________________
accuracy: 75.52%
Saved model to disk

Load the model and evaluate it for verification:

# load and evaluate a saved model
from numpy import loadtxt
from keras.models import load_model

# load model
model = load_model('model.h5')

# summarize model
model.summary()

# load dataset
dataset = loadtxt("pima-indians-diabetes.csv", delimiter=",")

# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]

# evaluate the model
score = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], score[1]*100))

Output:

Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_4 (Dense)              (None, 12)                108       
_________________________________________________________________
dense_5 (Dense)              (None, 8)                 104       
_________________________________________________________________
dense_6 (Dense)              (None, 1)                 9         
=================================================================
Total params: 221
Trainable params: 221
Non-trainable params: 0
_________________________________________________________________
accuracy: 75.52%
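
Once loaded, the model can also be used directly for prediction. A small sketch, assuming X is the same input array as above; predict returns the sigmoid probabilities, which are thresholded at 0.5 here:

# make predictions with the loaded model
probabilities = model.predict(X)
predictions = (probabilities > 0.5).astype(int)
print(predictions[:10].flatten())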