Abhimanyu Sharma

Reputation: 903

What is the difference between Activation layer and activation keyword argument

What is the difference between the activation keyword argument and the Activation layer in TensorFlow?

Here's an example:

activation kwarg:

model.add(Dense(64,activation="relu"))

Activation layer:

model.add(Dense(64))
model.add(Activation("sigmoid"))

PS: I'm new to TensorFlow.

Upvotes: 0

Views: 883

Answers (1)

bui

Reputation: 1651

In Dense(64, activation="relu"), the relu activation function becomes part of the Dense layer and is called automatically whenever that Dense layer is called.

In Activation("relu"), the relu activation function is a layer in itself and is decoupled from the Dense layer. This is necessary if you want a reference to the tensor after Dense but before the activation for, say, branching purposes:

from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Activation, Dense

input_tensor = Input((10,))
intermediate_tensor = Dense(64)(input_tensor)       # pre-activation output
branch_1_tensor = Activation('relu')(intermediate_tensor)
branch_2_tensor = Dense(64)(intermediate_tensor)    # branches off before the activation
final_tensor = branch_1_tensor + branch_2_tensor

model = Model(inputs=input_tensor, outputs=final_tensor)

However, your model is a Sequential model, so your two samples are mechanically equivalent (apart from the different activation functions you chose, relu vs. sigmoid): the activation is applied automatically either way. To obtain a reference to the tensor before the Activation layer in this case, you can go through model.layers and get the output of the Dense layer from within.
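Concretely, that last point can be sketched like this (a minimal sketch based on your second sample; the sub-model name is illustrative):

```python
import tensorflow as tf
from tensorflow.keras import Input, Model, Sequential
from tensorflow.keras.layers import Activation, Dense

# Your second sample: Activation as a separate layer in a Sequential model.
model = Sequential([
    Input((10,)),
    Dense(64),
    Activation("sigmoid"),
])

# model.layers is [Dense, Activation]; build a sub-model whose output is the
# Dense layer's output, i.e. the tensor before the Activation is applied.
pre_activation = Model(inputs=model.inputs,
                       outputs=model.layers[0].output)
```

Calling pre_activation on a batch now gives you the raw Dense outputs, while calling model gives the sigmoid-squashed values in (0, 1).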

Upvotes: 1

Related Questions