Reputation: 3583
In this tutorial here, the author uses GlobalMaxPool1D()
like this:
from keras.models import Sequential
from keras.layers import Dense, Activation, Embedding, Flatten, GlobalMaxPool1D, Dropout, Conv1D
from keras.callbacks import ReduceLROnPlateau, EarlyStopping, ModelCheckpoint
from keras.losses import binary_crossentropy
from keras.optimizers import Adam
filter_length = 300
model = Sequential()
model.add(Embedding(max_words, 20, input_length=maxlen))
model.add(Dropout(0.1))
model.add(Conv1D(filter_length, 3, padding='valid', activation='relu', strides=1))
model.add(GlobalMaxPool1D())
model.add(Dense(num_classes))
model.add(Activation('sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['categorical_accuracy'])
model.summary()
callbacks = [
    ReduceLROnPlateau(),
    EarlyStopping(patience=4),
    ModelCheckpoint(filepath='model-conv1d.h5', save_best_only=True)
]
history = model.fit(x_train, y_train,
                    class_weight=class_weight,
                    epochs=20,
                    batch_size=32,
                    validation_split=0.1,
                    callbacks=callbacks)
However, after searching online, I could only find GlobalMaxPooling1D
on the Keras site here. Are they the same layer? If not, what is the difference in terms of function and usage?
Upvotes: 1
Views: 2352
Reputation: 15063
I would add that the same aliasing exists for other layers as well, e.g. Conv2D == Convolution2D and MaxPool2D == MaxPooling2D.
Upvotes: 1
Reputation: 173
They are the same thing: GlobalMaxPool1D is simply an alias for GlobalMaxPooling1D. See here: https://www.tensorflow.org/api_docs/python/tf/keras/layers/GlobalMaxPool1D
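Whichever name you use, the layer computes the same thing: for input of shape (batch, steps, channels), it takes the maximum of each channel across all time steps, producing output of shape (batch, channels). A minimal NumPy sketch of the equivalent operation (the array values here are made up for illustration):

```python
import numpy as np

# One batch of 3 time steps with 2 channels: shape (batch, steps, channels)
x = np.array([[[1., 5.],
               [3., 2.],
               [4., 0.]]])

# GlobalMaxPooling1D (a.k.a. GlobalMaxPool1D) reduces over the time axis,
# keeping the per-channel maximum; result has shape (batch, channels)
out = x.max(axis=1)
print(out)  # [[4. 5.]]
```

In the model above, this is what collapses the variable-length Conv1D output into one fixed-size vector per sample before the Dense layer.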
Upvotes: 5