from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Activation, MaxPooling2D

model = Sequential()
model.add(Conv2D(256, (3, 3), input_shape=X.shape[1:]))  # X holds the training images
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
How do I use Maxout instead of 'relu' for the activation?
You can use tensorflow_addons.layers.Maxout to add a Maxout activation function:
import tensorflow_addons as tfa
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D

model = Sequential()
# Maxout takes the max over groups of channels, so the Conv2D must output
# a multiple of num_units: here 512 channels are pooled down to 256.
model.add(Conv2D(512, (3, 3), input_shape=X.shape[1:]))
model.add(tfa.layers.Maxout(256))
model.add(MaxPooling2D(pool_size=(2, 2)))
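Note that the Conv2D layer keeps its default linear activation: Maxout supplies the nonlinearity itself by taking the maximum over groups of linear feature maps, so there is no 'relu' anywhere in the stack.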
You can install tensorflow_addons with pip:
pip install tensorflow-addons
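If you would rather not add the dependency (TensorFlow Addons has since been deprecated), the same grouped-max operation can be written with a plain Lambda layer. This is a minimal sketch, assuming channels-last input whose channel count is statically known and a multiple of num_units; maxout here is a hypothetical helper name, not a library function:

import tensorflow as tf
from tensorflow.keras.layers import Lambda

def maxout(x, num_units):
    # Reshape (..., C) to (..., num_units, C // num_units) and take the
    # element-wise max within each group of channels.
    group_size = x.shape[-1] // num_units
    new_shape = tf.concat([tf.shape(x)[:-1], [num_units, group_size]], axis=0)
    return tf.reduce_max(tf.reshape(x, new_shape), axis=-1)

# Stand-in for tfa.layers.Maxout(256) after a 512-channel Conv2D:
model.add(Lambda(lambda x: maxout(x, 256)))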