Keras layer definition syntax

Will the first and second snippets of code below produce the same network?

First:

conv_layer = layers.Conv2D(
    filter_dim,
    (3, 3),
    activation='relu',
    kernel_initializer='he_normal',
    padding='same'
)(previous_layer)

Second:

conv_layer = layers.Conv2D(filter_dim, (3, 3), kernel_initializer='he_normal', padding='same')(previous_layer)
conv_layer = layers.Activation('relu')(conv_layer)

Upvotes: 0

Views: 51

Answers (1)

aerijman

Reputation: 2762

Yes. The Keras API allows both: passing activation='relu' to the Conv2D constructor is equivalent to applying a separate Activation('relu') layer to the convolution's output.

Look at this example:

# inline: activation specified in the Conv2D constructor
from tensorflow import keras
from tensorflow.keras import layers

encoder_input = keras.Input(shape=(28, 28, 1), name="img")
x = layers.Conv2D(16, 3, activation="relu")(encoder_input)
encoder_output = layers.GlobalMaxPooling2D()(x)
encoder = keras.Model(encoder_input, encoder_output, name="encoder")
encoder.summary()


# in two steps: Conv2D followed by a separate Activation layer
encoder_input = keras.Input(shape=(28, 28, 1), name="img")
x = layers.Conv2D(16, 3)(encoder_input)
x = layers.Activation("relu")(x)
encoder_output = layers.GlobalMaxPooling2D()(x)
encoder = keras.Model(encoder_input, encoder_output, name="encoder")
encoder.summary()

Both versions produce the same summary, with identical parameter counts:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
img (InputLayer)             (None, 28, 28, 1)         0         
_________________________________________________________________
conv2d_25 (Conv2D)           (None, 26, 26, 16)        160       
_________________________________________________________________
global_max_pooling2d_6 (Glob (None, 16)                0         
=================================================================
Total params: 160
Trainable params: 160
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
img (InputLayer)             (None, 28, 28, 1)         0         
_________________________________________________________________
conv2d_26 (Conv2D)           (None, 26, 26, 16)        160       
_________________________________________________________________
global_max_pooling2d_7 (Glob (None, 16)                0         
=================================================================
Total params: 160
Trainable params: 160
Non-trainable params: 0
_________________________________________________________________
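To make the equivalence concrete, here is a minimal sketch (assuming TensorFlow 2.x; the variable names are mine) that copies the convolution weights from the inline model into the two-step model and checks that both produce identical outputs for the same input:

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Inline form: ReLU specified in the Conv2D constructor.
inp_a = keras.Input(shape=(28, 28, 1))
out_a = layers.Conv2D(16, 3, activation="relu")(inp_a)
model_a = keras.Model(inp_a, out_a)

# Two-step form: linear Conv2D followed by a separate Activation layer.
inp_b = keras.Input(shape=(28, 28, 1))
x = layers.Conv2D(16, 3)(inp_b)
out_b = layers.Activation("relu")(x)
model_b = keras.Model(inp_b, out_b)

# Copy the kernel and bias so both models share the same parameters.
model_b.layers[1].set_weights(model_a.layers[1].get_weights())

sample = np.random.rand(1, 28, 28, 1).astype("float32")
print(np.allclose(model_a.predict(sample), model_b.predict(sample)))  # True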

Upvotes: 1
