mayaaa

Reputation: 299

Optimizing a neural network with a genetic algorithm: optimize layers and weights

I am trying to build a neural network model (a multi-task model) using the following code:

from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

inp = Input((336,))
x = Dense(300, activation='relu')(inp)
x = Dense(256, activation='relu')(x)
x = Dense(128, activation='relu')(x)
x = Dropout(0.1)(x)
x = Dense(56, activation='relu')(x)
x = Dense(16, activation='relu')(x)
x = Dropout(0.1)(x)
out_reg = Dense(1, name='reg')(x)
out_class = Dense(1, activation='sigmoid', name='class')(x)  # I assume a binary classification problem

model = Model(inp, [out_reg, out_class])

model.compile('adam', loss={'reg': 'mse', 'class': 'binary_crossentropy'},
              loss_weights={'reg': 0.5, 'class': 0.5})

Now I want to use a genetic algorithm to optimize the neural network's weights, layers, and number of neurons in Python. I have gone through many tutorials on genetic algorithms, but I couldn't find any material that discusses how to implement this.
Any help would be appreciated.

Upvotes: 0

Views: 838

Answers (2)

user5597655

Reputation:

If you are this new to machine learning, I would not recommend using genetic algorithms to optimize your weights. You have already compiled your model with "Adam", which is an excellent gradient-descent based optimizer that is going to do all of the work for you, and you should use that instead.
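
For reference, training with the already-compiled Adam optimizer would look roughly like this; X, y_reg and y_class are hypothetical placeholder arrays standing in for your real data, not something from the question:

import numpy as np

# Hypothetical random data just to show the call signature;
# replace with the real features and targets.
X = np.random.rand(1000, 336)
y_reg = np.random.rand(1000, 1)
y_class = np.random.randint(0, 2, size=(1000, 1))

# Targets are passed as a dict keyed by the output layer names 'reg' and 'class'.
model.fit(X, {'reg': y_reg, 'class': y_class},
          epochs=10, batch_size=32, validation_split=0.2)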

Check out the TensorFlow quickstart tutorial for more information: https://www.tensorflow.org/tutorials/quickstart/beginner

Here's an example of how to implement genetic algorithms from a Google search... https://towardsdatascience.com/introduction-to-genetic-algorithms-including-example-code-e396e98d8bf3

If you want to do hyperparameter tuning with genetic algorithms, you can encode the hyperparameters of the network (number of layers, neurons per layer) as your genes. Evaluating the fitness will be very costly, because each evaluation involves training the candidate network on the task to get its final test loss. A rough sketch of the idea follows below.
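
A minimal sketch of this approach, assuming random placeholder data and a single regression output to keep it short (the genome is the list of layer sizes, the fitness is the validation loss after a short training run, and crossover is omitted for brevity):

import random
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# Hypothetical placeholder data; replace with the real dataset.
X = np.random.rand(500, 336)
y = np.random.rand(500, 1)

def build_model(layer_sizes):
    # Build a network whose hidden-layer sizes come from the genome.
    inp = Input((336,))
    x = inp
    for units in layer_sizes:
        x = Dense(units, activation='relu')(x)
    out = Dense(1)(x)
    model = Model(inp, out)
    model.compile('adam', loss='mse')
    return model

def fitness(layer_sizes):
    # Costly step: every candidate is trained briefly and scored on a validation split.
    model = build_model(layer_sizes)
    history = model.fit(X, y, epochs=3, batch_size=32,
                        validation_split=0.2, verbose=0)
    return history.history['val_loss'][-1]  # lower is better

def random_genome():
    n_layers = random.randint(1, 4)
    return [random.choice([16, 32, 64, 128, 256]) for _ in range(n_layers)]

def mutate(genome):
    # Replace one randomly chosen layer size with a new random size.
    g = list(genome)
    g[random.randrange(len(g))] = random.choice([16, 32, 64, 128, 256])
    return g

# Tiny GA loop: keep the best half, refill with mutated copies of survivors.
population = [random_genome() for _ in range(6)]
for generation in range(5):
    scored = sorted(population, key=fitness)
    survivors = scored[:3]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(3)]
    print('generation', generation, 'best genome', scored[0])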

If you want to do weight optimization with genetic algorithms, you can encode the model weights as genes, and the fitness would be directly related to the loss of the network; a rough sketch of the encoding follows below.
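
A sketch of the encoding step for this second approach, assuming the compiled multi-task model from the question and hypothetical placeholder data (the genome is the flattened weight vector, and fitness comes from evaluating the loss with those weights loaded into the model):

import numpy as np

# Hypothetical placeholder data; replace with the real features/targets.
X = np.random.rand(200, 336)
y_reg = np.random.rand(200, 1)
y_class = np.random.randint(0, 2, size=(200, 1))

def weights_to_genome(model):
    # Flatten every weight/bias array into one long gene vector.
    return np.concatenate([w.ravel() for w in model.get_weights()])

def genome_to_weights(model, genome):
    # Cut the flat vector back into arrays with the model's original shapes.
    shapes = [w.shape for w in model.get_weights()]
    weights, i = [], 0
    for shape in shapes:
        size = int(np.prod(shape))
        weights.append(genome[i:i + size].reshape(shape))
        i += size
    return weights

def fitness(model, genome):
    model.set_weights(genome_to_weights(model, genome))
    # evaluate() returns [total_loss, reg_loss, class_loss]; lower total loss = fitter.
    total_loss = model.evaluate(X, {'reg': y_reg, 'class': y_class}, verbose=0)[0]
    return -total_loss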

Upvotes: 0

Vishnuvardhan Janapati

Reputation: 3278

Initially, I think it is better to
- Fix the architecture of the model,
- Know how many trainable parameters there are and their shapes,
- Create a random population of trainable parameters,
- Define the objective function to optimize,
- Implement the GA operations (reproduction, crossover, mutation, etc.),
- Reshape this population of weights and biases into the correct format,
- Then run the ML model with those weights and biases,
- Get the loss and update the population, and
- Repeat the above process for a number of epochs / until a stopping criterion is met (a rough sketch of these steps is included below).
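
A minimal end-to-end sketch of these steps, assuming the compiled multi-task model from the question and random placeholder data (the population is a set of flattened weight vectors, selection keeps the lowest-loss half, and children come from single-point crossover plus sparse Gaussian mutation):

import numpy as np

# Hypothetical placeholder data; replace with the real dataset.
X = np.random.rand(200, 336)
y = {'reg': np.random.rand(200, 1), 'class': np.random.randint(0, 2, size=(200, 1))}

# Steps 1-2: fixed architecture, record the trainable parameter shapes.
shapes = [w.shape for w in model.get_weights()]
sizes = [int(np.prod(s)) for s in shapes]
genome_length = sum(sizes)

def set_genome(genome):
    # Steps 6-7: reshape the flat vector and load it into the model.
    weights, i = [], 0
    for shape, size in zip(shapes, sizes):
        weights.append(genome[i:i + size].reshape(shape))
        i += size
    model.set_weights(weights)

def objective(genome):
    # Steps 4 and 8: total loss of the model with these weights (lower is better).
    set_genome(genome)
    return model.evaluate(X, y, verbose=0)[0]

def crossover(a, b):
    # Single-point crossover of two flat weight vectors.
    point = np.random.randint(1, genome_length)
    return np.concatenate([a[:point], b[point:]])

def mutate(genome, rate=0.01, scale=0.1):
    # Perturb a small random fraction of the genes with Gaussian noise.
    mask = np.random.rand(genome_length) < rate
    return genome + mask * np.random.normal(0.0, scale, genome_length)

# Step 3: random initial population of weight vectors.
population = [np.random.normal(0.0, 0.1, genome_length) for _ in range(10)]

# Steps 5, 8-9: GA loop with a fixed number of generations as the stopping criterion.
for generation in range(20):
    population.sort(key=objective)
    parents = population[:5]
    children = []
    for _ in range(5):
        a, b = parents[np.random.randint(5)], parents[np.random.randint(5)]
        children.append(mutate(crossover(a, b)))
    population = parents + children
    print('generation', generation, 'best loss', objective(population[0]))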

Hope it helps.

Upvotes: 1
