aforbes

Reputation: 143

How to clean up all Selenium Firefox Processes

I've created a web scraper with Python (3.6) and the Selenium Firefox web driver. I've set up a cronjob to run this scraper every few minutes, and it all seems to be working, except that over time (a few days), the memory on my Ubuntu VPS (8GB RAM, Ubuntu 18.04.4) fills up and it crashes.

When I check htop, I can see lots (as in, hundreds) of Firefox processes like "/usr/lib/firefox -marionette" and "/usr/lib/firefox -contentproc", each taking up about 3–4 MB of memory.

I've put

browser.stop_client()
browser.close()
browser.quit()

in every function that uses the web driver, but I suspect the script sometimes leaves web drivers open when it hits an error, without closing them properly, and these Firefox processes just accumulate until my system crashes.

I'm working on finding the root cause of this, but in the meantime, is there a quick way I can kill/clean up all these processes?

e.g. a cronjob that kills all matching processes (older than 10 minutes)?
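For reference, here is an untested sketch of the kind of cleanup cronjob I mean (assuming `killall` from the psmisc package, whose `--older-than` flag accepts a duration like `10m`):

```
# Untested: every 10 minutes, kill firefox/geckodriver processes
# that have been running longer than 10 minutes
*/10 * * * * killall --quiet --older-than 10m firefox geckodriver
```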

Thanks.

Upvotes: 5

Views: 2602

Answers (1)

Naveen

Reputation: 788

I suspect the script is sometimes leaving web drivers open when it hits an error, and not closing them properly

This is most likely the issue. You can fix it with a try/except/finally block, so that the driver is always shut down even when the scraping code raises an error.

from selenium import webdriver

browser = webdriver.Firefox()
try:
    ...  # your scraping code goes here
except Exception as e:
    print(e)  # or log the error
finally:
    # quit() closes every window and shuts down geckodriver and its
    # Firefox processes; because it is in the finally block, it runs
    # even when the scraping code raises. quit() alone is enough --
    # calling close() first can itself raise if the session is already
    # dead, which would skip the quit() call.
    browser.quit()

And if you still face the same issue, you can force-kill the driver processes, as per this answer, or this answer for Ubuntu.

import os
# taskkill is Windows-only; on the Ubuntu setup in the question,
# use pkill instead
os.system("taskkill /im geckodriver.exe /f")  # Windows
os.system("pkill -f geckodriver")             # Ubuntu/Linux
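For the "kill anything older than 10 minutes" cleanup the question asks about, here is a small stand-alone sketch (my own illustration, not from the linked answers; it assumes Linux's `ps` from procps, matches on exact command names, and with the default `dry_run=True` it only reports what it would kill):

```python
import os
import signal
import subprocess

def kill_stale_processes(names=("firefox", "geckodriver"),
                         max_age_s=600, dry_run=True):
    """Return PIDs of matching processes older than max_age_s seconds;
    send them SIGKILL unless dry_run is True."""
    # etimes = elapsed seconds since the process started,
    # comm = the executable name
    ps = subprocess.run(["ps", "-eo", "pid,etimes,comm"],
                        capture_output=True, text=True, check=True)
    stale = []
    for line in ps.stdout.splitlines()[1:]:  # skip the header row
        fields = line.split(None, 2)
        if len(fields) < 3:
            continue
        pid, etimes, comm = fields
        if comm in names and int(etimes) > max_age_s:
            stale.append(int(pid))
            if not dry_run:
                os.kill(int(pid), signal.SIGKILL)
    return stale
```

Once the dry run reports the right PIDs, you can call it with `dry_run=False` from a cron-driven script until the root cause is fixed.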

Upvotes: 4
