How to extract CNN activations using Keras?

Question (votes: 1, answers: 1)

I want to use Keras to extract the CNN activations from the first fully connected layer. Caffe has this functionality, but I can't use that framework because I ran into installation problems. I am reading a research paper that uses these CNN activations, but the authors use Caffe.

Is there a way to extract these CNN activations so that I can use them as items in transactions for the Apriori association-rule-mining algorithm?

Of course, I first have to extract the k activations with the largest magnitudes. Each image will then be one transaction, and each activation will be one item.
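The transaction encoding described above can be illustrated with plain NumPy; the random activation matrix, its (4, 512) shape, and k = 5 are all assumptions for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for real activations: 4 images x 512 units (assumed shapes)
activations = rng.random((4, 512))
k = 5  # number of largest-magnitude activations to keep per image

# For each image, take the indices of the k largest |activation| values;
# each index becomes an "item" and each image's index set a "transaction"
transactions = [set(np.argsort(-np.abs(row))[:k]) for row in activations]

print(transactions[0])  # a set of 5 unit indices
```

Each transaction is then a set of unit indices that can be fed directly into an Apriori implementation.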

This is the code I have so far:

from __future__ import print_function
import keras
from keras.datasets import mnist
from keras.layers import Dense, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.models import Sequential
import matplotlib.pylab as plt

# Assumed values for MNIST: 28x28 grayscale images, 10 classes
input_shape = (28, 28, 1)
num_classes = 10

model = Sequential()
model.add(Conv2D(32, kernel_size=(5, 5), strides=(1, 1),
                 activation='relu',
                 input_shape=input_shape))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Conv2D(64, (5, 5), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(1000, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=keras.optimizers.Adam(),
              metrics=['accuracy'])
python keras conv-neural-network caffe feature-extraction
1 Answer

0 votes

The solution below uses TensorFlow Keras.

To access the activations, we first have to pass one or more images through the model; the activations then correspond to those images.

The code for loading and preprocessing an input image is shown below:

import os
import tensorflow as tf
from tensorflow.keras.preprocessing import image

Test_Dir = '/Deep_Learning_With_Python_Book/Dogs_Vs_Cats_Small/test/cats'
Image_File = os.path.join(Test_Dir, 'cat.1545.jpg')

# Load the image at the size the model expects and convert it to an array
Image = image.load_img(Image_File, target_size=(150, 150))
Image_Tensor = image.img_to_array(Image)
print(Image_Tensor.shape)

# Add a batch dimension and rescale pixel values to [0, 1]
Image_Tensor = tf.expand_dims(Image_Tensor, axis=0)
Image_Tensor = Image_Tensor / 255.0

Once the model is defined, we can access the activations of any layer with the code shown below (here for the Cats vs. Dogs dataset):

from tensorflow.keras.models import Model

# Extract the outputs of all the layers in the model
Model_Outputs = [layer.output for layer in model.layers]
# Create a model that maps the original input to all of those layer outputs
Activation_Model = Model(model.input, Model_Outputs)
Activations = Activation_Model.predict(Image_Tensor)

The output of the first fully connected layer (for the Cats vs. Dogs data) is:

print('Shape of Activation of First Fully Connected Layer is', Activations[-2].shape)
print('------------------------------------------------------------------------------------------')
print('Activation of First Fully Connected Layer is', Activations[-2])

Its output is shown below:

Shape of Activation of First Fully Connected Layer is (1, 512)
------------------------------------------------------------------------------------------
Activation of First Fully Connected Layer is [[0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.02759874 0.         0.         0.         0.
  ... (output truncated; most entries are 0, the rest are small positive ReLU activations)
  0.00490679 0.         0.04504126 0.         0.         0.
  0.         0.        ]]
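Given a (1, 512) activation vector like the one above, the k largest-magnitude activations that the original question asks for could be selected as follows; the synthetic sparse vector (standing in for `Activations[-2]`) and k = 10 are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for Activations[-2]: shape (1, 512), mostly zeros,
# like the ReLU output printed above
fc_activations = rng.random((1, 512)) * (rng.random((1, 512)) < 0.1)

k = 10
flat = fc_activations.ravel()

# Indices of the k entries with the largest magnitude, in descending order
top_k_idx = np.argsort(-np.abs(flat))[:k]
top_k_val = flat[top_k_idx]

print(top_k_idx)
print(top_k_val)
```

The resulting index set is what would serve as one transaction per image in the Apriori setting.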

For more information, see section 5.4.1, "Visualizing intermediate activations," of the book Deep Learning with Python by François Chollet, the creator of Keras.

Hope this helps. Happy learning!
