Michael Jennings

Reputation: 21

Memory leak in tfjs-node code even when using tidy() and dispose()

My tfjs-node application is crashing due to running out of memory despite using tidy() and dispose() to prevent memory leaks. I've managed to reproduce the issue in the code below:

import * as tf from '@tensorflow/tfjs-node'

let x = tf.variable(tf.scalar(1))

function model(){
  for(let i = 0; i < 100; i++){
    let b = tf.tidy(() => { return tf.mul(2, 3) })
    b.dispose()
    console.log(tf.memory().numTensors)
  }
  return x
}

// calling model() directly: the log shows the count holding at 2 tensors through all 100 iterations
model()

const optimizer = tf.train.sgd(0.001);
optimizer.minimize(() => {
  // inside minimize(), the log shows the tensor count climbing each iteration, up to about 200
  return model()
})

Within model(), I loop 100 times; each iteration creates a tensor inside tidy() and then immediately calls dispose() on the returned result. model() then returns the variable x.

If I call model() directly, the number of active tensors stays at 2 through all 100 iterations, which is what I expect. However, when I call model() inside optimizer.minimize(), the dispose() calls seem to be ignored entirely: the tensor count grows each iteration, and roughly 200 tensors have accumulated by the end.

For this particular sample code, the memory leak isn't a big issue, but for my actual application the memory leak is huge and causes the application to crash almost immediately. Any advice on how I can fix this?

Upvotes: 2

Views: 255

Answers (0)
