Using LeakyReLU in Keras
A memo purely for my own reference. Keras warned me when I tried to use LeakyReLU, so I'm noting down the correct (?) way to write it.
Code that triggered the warning
import plaidml.keras
plaidml.keras.install_backend()
from keras.models import Sequential
from keras.layers.core import Dense
from keras.layers import LeakyReLU

def make_mlp():
    model = Sequential()
    model.add(Dense(256, activation=LeakyReLU(alpha=0.01), input_shape=(3,),
                    kernel_initializer='he_normal'))
    model.add(Dense(3, activation='linear'))
    model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mse'])
    return model
Warning message
keras\plaidml-env\lib\site-packages\keras\activations.py:197: UserWarning: Do not pass a layer instance (such as LeakyReLU) as the activation argument of another layer. Instead, advanced activation layers should be used just like any other layer in a model.
identifier=identifier.__class__.__name__))
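As a quick reminder of what the activation itself does: LeakyReLU passes positive inputs through unchanged and scales negative inputs by a small slope alpha. A minimal pure-NumPy sketch (the function name `leaky_relu` is my own, not from Keras):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # elementwise: x where x > 0, otherwise alpha * x
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))
```

With alpha=0.01, an input of -2.0 comes out as -0.02, while 0.0 and 3.0 pass through unchanged.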
Fixed code
def make_mlp():
    model = Sequential()
-   model.add(Dense(256, activation=LeakyReLU(alpha=0.01), input_shape=(3,),
-                   kernel_initializer='he_normal'))
+   model.add(Dense(256, input_shape=(3,), kernel_initializer='he_normal'))
+   model.add(LeakyReLU(alpha=0.01))
    model.add(Dense(3, activation='linear'))
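Why this is equivalent: a Dense layer with no activation applies the identity, so stacking a separate LeakyReLU layer on top computes the same forward pass as fusing the activation into the Dense layer. A small NumPy sketch with made-up weights (all names here are my own, for illustration only):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # elementwise: x where x > 0, otherwise alpha * x
    return np.where(x > 0, x, alpha * x)

def dense(x, W, b):
    # a Dense layer with linear (identity) activation
    return x @ W + b

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 256))   # hypothetical weights for Dense(256)
b = rng.standard_normal(256)
x = rng.standard_normal((2, 3))     # a small batch of 3-dim inputs

# "fused" style: activation applied inside the layer
fused = leaky_relu(dense(x, W, b))

# "separate layer" style: Dense first, then LeakyReLU as its own step
hidden = dense(x, W, b)
separate = leaky_relu(hidden)

print(np.allclose(fused, separate))
```

Keras only objects to the fused spelling because LeakyReLU is a layer instance, not an activation function; the computation itself is unchanged.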