AKKO

Reputation: 1081

How to clear memory completely of all matplotlib plots

I have a data analysis module that contains functions which call on the matplotlib.pyplot API multiple times to generate up to 30 figures in each run. These figures get immediately written to disk after they are generated, and so I need to clear them from memory.

Currently, at the end of each of my functions, I do:

import matplotlib.pyplot as plt

plt.clf()

However, I'm not sure whether this call actually frees the memory. I'm especially concerned because each time I run my module for debugging, my free memory keeps decreasing.

What do I need to do to really clear my memory each time after I have written the plots to disk?

Upvotes: 51

Views: 66280

Answers (8)

ElvishPriestley

Reputation: 55

The only way I was able to stop pyplot from bleeding memory was to move plot creation into a separate process, spawning a new process for each plot. I did this with the multiprocessing module. Pseudocode below:

from multiprocessing import Process
import matplotlib.pyplot as plt

class PyplotWrapper:
    def plot_file(self, file):
        # extract_file_data() and plot_path are placeholders from the original pseudocode
        time, file_data = extract_file_data(file)
        plt.figure(num=1, clear=True, figsize=(12, 8), dpi=400)
        plt.plot(time, file_data)
        plt.savefig(plot_path)
        plt.figure().clear()
        plt.close('all')
        plt.cla()
        plt.clf()

def plot_csv_file(file):
    plot_object = PyplotWrapper()
    plot_object.plot_file(file)

for file in csv_files:
    p = Process(target=plot_csv_file, args=(file,))
    p.start()
    p.join()

Just calling the plot_file() function in the loop, without the subprocess, ate all my memory. So I do not believe any of the clf(), cla(), or clear() calls saved memory, and setting num=1 and clear=True in the figure() call did not help me either. Since there were many samples to plot, the overhead of creating a new process for each figure was small compared to the plotting time itself.

Upvotes: 1

alec_djinn

Reputation: 10799

I use plt.close("all") and it works as expected.
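
A minimal sketch of that pattern (the Agg backend, the loop, and the in-memory buffer are illustrative assumptions, not part of the original answer):

```python
import io
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, so no GUI windows are created
import matplotlib.pyplot as plt

# Create a few figures and save each one (an in-memory buffer stands in for disk).
for i in range(3):
    fig = plt.figure()
    plt.plot([0, 1], [0, i])
    fig.savefig(io.BytesIO(), format='png')

plt.close('all')  # destroy every figure pyplot is still tracking
print(plt.get_fignums())  # -> []
```

After plt.close("all"), pyplot's figure registry is empty, so the figure objects become eligible for garbage collection.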

Upvotes: -1

Matze

Reputation: 114

There is unfortunately no solution:

See https://github.com/matplotlib/matplotlib/issues/20300, where a maintainer explains:

"This only happens if you work in a GUI backend, create new figures, but don't show them. This way of usage is not reasonable. There are working usage patterns for all relevant scenarios."
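
As a sketch of the "working usage pattern" the issue refers to (my own illustration, not from the linked thread): if figures are only written to disk and never shown, select a non-GUI backend such as Agg before importing pyplot, so no GUI resources are allocated per figure.

```python
import io
import matplotlib
matplotlib.use('Agg')  # non-GUI backend: figures are rendered to files, never shown
import matplotlib.pyplot as plt

for i in range(5):
    fig, ax = plt.subplots()
    ax.plot(range(10))
    fig.savefig(io.BytesIO(), format='png')  # stand-in for a real output path
    plt.close(fig)  # release each figure as soon as it has been written out

print(len(plt.get_fignums()))  # -> 0
```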

Upvotes: 2

D A

Reputation: 3438

This is more of a test suite than an answer to the question. Here I show that as of Dec 2021, no provided solution actually clears the memory.

We need the following libraries:

import os
import psutil 
import numpy
import matplotlib
import matplotlib.pyplot

I create a single function which should clear ALL matplotlib memory:

def MatplotlibClearMemory():
    #usedbackend = matplotlib.get_backend()
    #matplotlib.use('Cairo')
    allfignums = matplotlib.pyplot.get_fignums()
    for i in allfignums:
        fig = matplotlib.pyplot.figure(i)
        fig.clear()
        matplotlib.pyplot.close(fig)
    #matplotlib.use(usedbackend) 

I make a script that creates 100 figures then attempts to delete them all from memory:

#Use TkAgg backend because it works better for some reason:
matplotlib.use('TkAgg')

#Create fake data for our figures:
x = numpy.arange(1000)

#Get system process information for printing memory usage:
process = psutil.Process(os.getpid())

#Check memory usage before we create figures:
print('BeforeFigures: ', process.memory_info().rss)  # in bytes

#Make 100 figures, and check memory usage:
for n in range(100):
    matplotlib.pyplot.figure()
    matplotlib.pyplot.plot(x, x)
print('AfterFigures:  ', process.memory_info().rss)  # in bytes

#Clear all the figures and check memory usage:
MatplotlibClearMemory( )
print('AfterDeletion: ', process.memory_info().rss)  # in bytes

Which outputs memory remaining:

>>> BeforeFigures: 76083200
>>> AfterFigures:  556888064
>>> AfterDeletion: 335499264

Less than half the allocated memory is cleared (much less if using the standard backend). The only working solutions on this Stack Overflow page avoid placing multiple figures in memory simultaneously.

Upvotes: 5

James Huang

Reputation: 571

After a week of trials, I found my solution! Hope it can help you. My demo is attached.

import matplotlib.pyplot as plt
import numpy as np

A = np.arange(1,5)
B = A**2

cnt = 0
while True:
    cnt = cnt + 1
    print("########### test %d ###########" % cnt)

    # Here is the trick:
    # give the figure a fixed 'num' so the same figure is reused instead of
    # re-allocating a new one on the next loop, and pass clear=True to wipe it.
    # I never use plt.close() to kill the figure, because I found it doesn't work.
    # Only one figure is ever allocated, and it is released when the program quits.
    # Before: 6000 calls to plt.figure() ~ about 1.6 GB of memory leaked
    # Now: memory usage stays at a stable level
    fig = plt.figure(num=1, clear=True)
    ax = fig.add_subplot()

    # alternatively, the one-line equivalent:
    # fig, ax = plt.subplots(num=1, clear=True)

    ax.plot(A, B)
    ax.plot(B, A)

    # Add the functions you need here
    # plt.show()
    fig.savefig('%d.png' % cnt)

Upvotes: 47

Luis DG

Reputation: 558

Especially when you are running multiple processes or threads, it is much better to define your figure variable and work with it directly:

from matplotlib import pyplot as plt

f = plt.figure()
f.clear()
plt.close(f)

In any case, you should combine figure.clear() with plt.close(figure), as shown above.

UPDATE (2021/01/21)

If you are using macOS with its default backend (referred to as 'MacOSX'), this does NOT work (at least on Big Sur). The only solution I have found is to switch to one of the other well-known backends, such as TkAgg, Cairo, etc. To do so, just type:

import matplotlib
matplotlib.use('TkAgg') # Your favorite interactive or non-interactive backend

Upvotes: 34

Laurent90

Reputation: 293

Late answer, but this worked for me. I had a long sequential script generating many plots, and it would always end up eating all the RAM by the end of the run. Rather than calling plt.close(fig) after each figure is completed, I simply redefined the plt.figure function as follows, so that it happens automatically:

import matplotlib.pyplot as plt
import copy

try:
    # if the script is run multiple times, only redefine once
    plt.old_figure
except AttributeError:
    # matplotlib is imported for the first time --> redefine
    plt.old_figure = copy.deepcopy(plt.figure)
    def newfig(*args):
        # show and close everything from the previous figure first
        plt.show()
        plt.close("all")
        return plt.old_figure(*args)
    plt.figure = newfig

I am well aware that this is not a nice solution; however, it is easy and quick, and it did the trick for me! Maybe there's a way to decorate plt.figure instead of redefining it.
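
A decorator version of the same monkey-patch is possible. A sketch (my own assumptions: the Agg backend, and dropping the plt.show() call since nothing is displayed there):

```python
import functools
import matplotlib
matplotlib.use('Agg')  # non-GUI backend; figures are never shown
import matplotlib.pyplot as plt

def close_previous(figure_func):
    """Close every open figure before the wrapped call creates a new one."""
    @functools.wraps(figure_func)
    def wrapper(*args, **kwargs):
        plt.close('all')  # drop the figures from the previous step
        return figure_func(*args, **kwargs)
    return wrapper

plt.figure = close_previous(plt.figure)  # same monkey-patch, as a decorator

for i in range(5):
    fig = plt.figure()

print(len(plt.get_fignums()))  # -> 1  (only the most recent figure survives)
```

functools.wraps keeps the original function's name and docstring on the wrapper, which makes the patch slightly less surprising to debug later.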

Upvotes: -1

Pritesh Gohil

Reputation: 476

I have data analysis module that contains functions which call on Matplotlib pyplot API multiple

Can you edit the functions that call Matplotlib? I was facing the same issue and tried the following commands, but none of them worked:

plt.close(fig)
fig.clf()
gc.collect()
%reset_selective -f fig

Then one trick worked for me: instead of creating a new figure every time, I pass the same fig object into the function, and this solved my issue.

For example, use

fig = plt.figure()
for i in range(100):
    plt.plot(x,y)

instead of,

for i in range(100):
    fig = plt.figure()
    plt.plot(x,y)
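
A runnable sketch of this reuse pattern (the data, the per-iteration ax.clear() so lines don't accumulate, and the in-memory buffer standing in for a real file are my own assumptions):

```python
import io
import matplotlib
matplotlib.use('Agg')  # non-GUI backend for save-only workloads
import matplotlib.pyplot as plt

x = list(range(10))
fig, ax = plt.subplots()  # one figure, created once and reused
for i in range(100):
    ax.clear()            # wipe the previous iteration's lines
    ax.plot(x, [v * i for v in x])
    fig.savefig(io.BytesIO(), format='png')  # stand-in for a real file path

print(len(plt.get_fignums()))  # -> 1  (memory stays bounded by a single figure)
```

Because only one Figure object ever exists, memory usage stays flat regardless of how many plots are written out.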

Upvotes: 7
