Sahas

Reputation: 3186

Keras model.fit runs out of memory in Google Colab Pro

I'm trying out a simple sequential model on the dataset below, using Colab Pro with 35 GB of RAM and 225 GB of disk space.

  1. Total sentences - 59000
  2. Total words - 160000
  3. Padded seq length - 38

So train_x has shape (59000, 37) and train_y has shape (59000,).

I'm using FastText for the embedding layer. The FastText model generated a weight matrix with vocab_size 113000 rows and embedding_size 8815 columns (dimensionality).
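
For context, the embedding layer is wired up roughly like this (a sketch: the downstream layers are placeholders, and the real weight matrix is loaded from FastText rather than zeroed):

import numpy as np
from tensorflow import keras

vocab_size = 113000      # rows of the FastText weight matrix
embedding_size = 8815    # columns (dimensionality)
seq_len = 37             # padded sequence length fed to the model

# Stand-in for the matrix exported from FastText. Note that this single
# float32 array is already ~4 GB (113000 * 8815 * 4 bytes).
embedding_matrix = np.zeros((vocab_size, embedding_size), dtype="float32")

seq_model = keras.Sequential([
    keras.layers.Embedding(
        input_dim=vocab_size,
        output_dim=embedding_size,
        embeddings_initializer=keras.initializers.Constant(embedding_matrix),
        input_length=seq_len,
    ),
    # Placeholder head; the actual layers are in the model.summary() below.
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(1, activation="sigmoid"),
])
seq_model.compile(optimizer="adam", loss="binary_crossentropy")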

Here is what model.summary() shows:

[screenshot of the model.summary() output]

Compiling the model takes roughly 15 minutes, but .fit crashes because it runs out of memory.

I've brought the batch_size down to 4 (vs. the default 32), but still no luck.

epochs = 2
verbose = 0
batch_size = 4

history = seq_model.fit(train_x, train_y, epochs=epochs, verbose=verbose,
                        callbacks=[csv_logger], batch_size=batch_size)

Appreciate any ideas to make this work.

Upvotes: 0

Views: 943

Answers (1)

Pushpak Bhoge

Reputation: 69

If what I am seeing is right, your model is simply too large: it has almost 1.5 billion parameters, and the embedding layer alone accounts for roughly 1 billion of them (113,000 × 8,815). That memory is consumed by the parameters, their gradients, and the optimizer state, none of which depend on the batch size, so reducing the batch size will not help at all. For comparison, FastText vectors are usually 100-300 dimensional, so an embedding_size of 8,815 is worth double-checking.
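
A quick back-of-the-envelope estimate makes this concrete (a sketch; it assumes float32 weights and a stateful optimizer such as Adam, since the optimizer is not shown in the question):

# Assumes float32 weights and Adam; both are assumptions, not taken from the question.
vocab_size = 113_000
embedding_size = 8_815

embedding_params = vocab_size * embedding_size   # 996,095,000 — ~1 billion weights
weight_gb = embedding_params * 4 / 1e9           # float32 = 4 bytes per weight, ~4 GB

# Training also needs a gradient plus Adam's two moment buffers per weight,
# i.e. roughly four copies of every trainable parameter before activations.
training_gb = weight_gb * 4

print(f"{embedding_params:,} embedding params")
print(f"~{weight_gb:.1f} GB of weights, ~{training_gb:.0f} GB while training")

Scaled up to the full ~1.5 billion parameters, that is on the order of 24 GB before a single activation is allocated, which is why a 35 GB instance falls over regardless of batch size.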

Upvotes: 1
