zyy

Reputation: 1574

How do I modify activation functions in Keras?

I would like to use the ReLU activation function with its parameter alpha set to 0.2, but I could not figure out how to do this for my model:

import numpy
from tensorflow.keras.layers import Dense, Activation, Dropout, Input
from tensorflow.keras.models import Sequential, Model, load_model
from tensorflow.keras.optimizers import Adam

model_input = Input(shape = x_train[0].shape)
x = Dense(120, activation = 'relu')(model_input)
x = Dropout(0.01)(x)
x = Dense(120, activation = 'relu')(x)
x = Dropout(0.01)(x)
x = Dense(120, activation = 'relu')(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

I saw there is a way to do this in this answer, which uses model.add(), but I am not sure how that would work with my model. Could you please help me?

Thank you in advance!

Upvotes: 1

Views: 690

Answers (1)

nbro

Reputation: 15837

First, note that you're specifying the activation as a string, while the answer you link to creates an instance of the class representing the activation function. Second, note that what you actually want is the "leaky ReLU" activation (ReLU with a non-zero slope alpha for negative inputs), whereas you're currently specifying plain "relu", whose alpha defaults to 0.

To answer your question, you can probably do something like this:

import numpy
from tensorflow.keras.layers import Dense, Dropout, Input, LeakyReLU
from tensorflow.keras.models import Model

model_input = Input(shape = x_train[0].shape)
x = Dense(120)(model_input)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
x = Dense(120)(x)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
x = Dense(120)(x)
x = LeakyReLU(alpha=0.2)(x)
x = Dropout(0.01)(x)
model_output = Dense(numpy.shape(y_train)[1])(x)
model = Model(model_input, model_output)

I haven't tried this code, but it should work!
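Alternatively, if you'd rather keep the activation inside the Dense layer itself, tf.keras.activations.relu accepts an alpha argument, so you can wrap it in a small function and pass that as the activation. A minimal sketch (the helper name leaky_relu and the placeholder shapes are mine; substitute your own x_train/y_train shapes):

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

# tf.keras.activations.relu takes an alpha argument: the slope used for
# negative inputs. Wrapping it gives a leaky ReLU with alpha=0.2.
def leaky_relu(x):
    return tf.keras.activations.relu(x, alpha=0.2)

model_input = Input(shape=(10,))        # replace (10,) with x_train[0].shape
x = Dense(120, activation=leaky_relu)(model_input)
x = Dropout(0.01)(x)
model_output = Dense(3)(x)              # replace 3 with numpy.shape(y_train)[1]
model = Model(model_input, model_output)
```

This keeps the model definition as compact as your original, at the cost of a custom function that you'd need to pass via custom_objects when reloading a saved model.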

Upvotes: 1
