Reputation: 399
I am using Keras Tuner to tune the hyperparameters of my neural network.
I want to search the optimal number of hidden layers and the optimal number of units in each layer. To avoid overparametrizing the model, I want to impose the following condition:
How can this condition be imposed?
I have tried this:
for i in range(hp.Choice('num_layers', [1, 2])):
    max_units = 128 if i == 1 else 64  # conditions on the layer index i, not on num_layers
    hp_units = hp.Int(f'units_{i}', min_value=16, max_value=max_units, step=16)
    model.add(tf.keras.layers.Dense(units=hp_units, activation='relu', use_bias=True))
But this just results in the following condition:
Upvotes: 2
Views: 943
Reputation: 601
I think it is better to just make two hp.Choice hyperparameters, one for the unit count of each layer. If the second layer has zero units, it vanishes entirely.
neurons_first_layer = hp.Choice('neurons_first_layer', [16, 32, 64, 128])
neurons_second_layer = hp.Choice('neurons_second_layer', [0, 16, 32, 64])

model.add(tf.keras.layers.Dense(units=neurons_first_layer, activation='relu', use_bias=True))
if neurons_second_layer:  # if the second layer has units, add it
    model.add(tf.keras.layers.Dense(units=neurons_second_layer, activation='relu', use_bias=True))
That way you get 4 × 4 = 16 combinations:
[(16, 0), (16, 16), (16, 32), (16, 64), (32, 0), (32, 16), (32, 32),
(32, 64), (64, 0), (64, 16), (64, 32), (64, 64), (128, 0), (128, 16), ...]
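For completeness, here is a minimal runnable sketch of how these two choices could sit inside a build function handed to a tuner, assuming the current keras_tuner package. The input shape, output head, loss, and RandomSearch settings are placeholder assumptions, not part of the answer itself:

import tensorflow as tf
import keras_tuner as kt

def build_model(hp):
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(20,)))  # assumed input width

    # One choice per layer; 0 units in the second choice means "no second layer".
    neurons_first_layer = hp.Choice('neurons_first_layer', [16, 32, 64, 128])
    neurons_second_layer = hp.Choice('neurons_second_layer', [0, 16, 32, 64])

    model.add(tf.keras.layers.Dense(units=neurons_first_layer, activation='relu', use_bias=True))
    if neurons_second_layer:  # 0 units -> the second layer is skipped entirely
        model.add(tf.keras.layers.Dense(units=neurons_second_layer, activation='relu', use_bias=True))

    model.add(tf.keras.layers.Dense(1))  # assumed single-output head
    model.compile(optimizer='adam', loss='mse')
    return model

# The search space has 16 combinations in total.
tuner = kt.RandomSearch(build_model, objective='val_loss', max_trials=16)
# tuner.search(x_train, y_train, validation_split=0.2, epochs=10)

Note that hp.Choice registers the hyperparameters with the tuner on its own, so the conditional model.add is all that is needed to make the second layer optional.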
Upvotes: 2