Cannot use the PELU or SineReLU activations from keras-contrib in Keras

Problem Description

When I try to replace LeakyReLU with SineReLU or PELU in otherwise working code, I keep getting this error:

ValueError: Unknown activation function: PELU

I am using keras-contrib. I have attached sample code below; I have tried this in several pieces of code. Any way to make this work would be appreciated.

from keras.layers import Dense, Input, LeakyReLU, UpSampling2D, Conv2D, Concatenate
from keras_contrib.layers import SineReLU
from keras.models import Model,load_model,  Sequential
from keras.optimizers import Adam

# Recommended method; requires knowledge of the underlying architecture of the model
from keras_contrib.layers import PELU

import numpy
# fix random seed for reproducibility
numpy.random.seed(7)

# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]

# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='PELU'))
model.add(Dense(8, activation='PELU'))
model.add(Dense(1, activation='sigmoid'))

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10)

# evaluate the model
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# Create your first MLP in Keras
from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
numpy.random.seed(7)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10)
# evaluate the model
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
Tags: python, tensorflow, keras, neural-network, activation-function
1 Answer

The problem is that you are not passing the activation correctly: the string form of a layer's activation argument only works for built-in activations, not for custom ones.

Also, since PELU has trainable parameters, it is implemented as a layer rather than as a standalone activation function, so you need to add it like this:

from keras_contrib.layers import PELU

model = Sequential()
model.add(Dense(12, input_dim=8))
model.add(PELU())
model.add(Dense(8))
model.add(PELU())
model.add(Dense(1, activation='sigmoid'))
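
SineReLU from keras-contrib is likewise an advanced-activation layer, not a string activation, so it is added the same way. Below is a minimal sketch under that assumption; the epsilon value shown is only illustrative, and the file name 'model.h5' is hypothetical. It also shows how a saved model containing these custom layers could be reloaded by registering them via custom_objects.

from keras.models import Sequential, load_model
from keras.layers import Dense
from keras_contrib.layers import PELU, SineReLU

# Add SineReLU as a layer after a Dense layer that has no activation set.
model = Sequential()
model.add(Dense(12, input_dim=8))
model.add(SineReLU())          # default epsilon; e.g. SineReLU(epsilon=0.0025)
model.add(Dense(8))
model.add(SineReLU())
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# When reloading a saved model that contains these custom layers,
# pass them through custom_objects (hypothetical file name):
# model.save('model.h5')
# model = load_model('model.h5', custom_objects={'PELU': PELU, 'SineReLU': SineReLU})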