Reputation: 139
The scraper I wrote runs perfectly on my PC, a Windows machine running Node.js v14.4.0.
But when I try to run it on a DigitalOcean Droplet (an Ubuntu machine), some of the pages fail with the following error: Page crashed!
There is not much information beyond that.
Here is the code that prints the error:
const handleClose = async (msg) => {
  console.log(msg);
  // close the page and browser cleanly before exiting
  await page.close();
  await browser.close();
  process.exit(1);
};

process.on("uncaughtException", (err, origin) => {
  const a = `Possibly uncaught exception at: ${origin}, reason: ${err}`;
  handleClose(a);
});
How do I tackle this? And what could cause it, seeing as it works perfectly on my Windows PC?
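In case it is useful, here is roughly how I could hook into Puppeteer to get more detail about the crash. This is only a sketch, relying on Puppeteer's page 'error' event (emitted when a page crashes) and the dumpio launch option (which pipes Chromium's stdout/stderr into the Node process); the URL and the scraping step are placeholders for my own code:

const puppeteer = require('puppeteer');

(async () => {
  // dumpio pipes the browser's own stdout/stderr into this process,
  // which often contains the Chromium-side crash reason.
  const browser = await puppeteer.launch({ dumpio: true });
  const page = await browser.newPage();

  // Fired when the page crashes, e.g. when it runs out of memory.
  page.on('error', (err) => {
    console.log('Page crashed:', err);
  });

  await page.goto('https://example.com'); // placeholder URL
  // ... scraping logic ...
  await browser.close();
})();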
Upvotes: 3
Views: 3365
Reputation: 139
I added all the memory-related launch flags I could find online:
const args = [
  '--no-sandbox',
  '--disable-setuid-sandbox',
  '--disable-infobars',
  '--window-size=1366,768',
  '--unlimited-storage',
  '--full-memory-crash-report',
  '--disable-dev-shm-usage',
  '--force-gpu-mem-available-mb',
  '--disable-gpu'
]
But that didn't help.
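For context, these flags are passed straight to Chromium through Puppeteer's launch call, roughly like this (a sketch; the headless setting is just what I happen to use):

const puppeteer = require('puppeteer');

(async () => {
  // The args array above is handed to the Chromium process at launch.
  const browser = await puppeteer.launch({ headless: true, args });
  // ... scraping ...
  await browser.close();
})();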
Thanks to pguardiario's comment, I simply upgraded the Droplet from 1 GB of RAM to 2 GB, and that did the trick.
I find it strange that scraping a simple website takes more than 1 GB, so I guess Puppeteer needs a lot of resources to run.
UPDATE: I had another page crash, but this time it was caused by the server using up all of its memory. So I removed all of these args from Puppeteer:
'--unlimited-storage',
'--full-memory-crash-report',
'--disable-dev-shm-usage',
'--force-gpu-mem-available-mb',
'--disable-gpu'
I was left with only the basic ones:
const args = [
  '--no-sandbox',
  '--disable-setuid-sandbox',
  '--disable-infobars',
  '--window-size=1366,768'
]
And it's now stable. So I guess these flags need to be used carefully and removed if they are not really needed.
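If it helps anyone, this is roughly how I keep an eye on memory now. It is only a sketch using Node's built-in process.memoryUsage(); it covers just the Node process itself (not the Chromium child processes), and the 30-second interval is arbitrary:

// Log the Node process's own memory use periodically so a slow
// climb shows up in the logs before the Droplet runs out of RAM.
setInterval(() => {
  const { rss, heapUsed } = process.memoryUsage();
  const toMB = (bytes) => Math.round(bytes / 1024 / 1024);
  console.log(`rss: ${toMB(rss)} MB, heapUsed: ${toMB(heapUsed)} MB`);
}, 30000);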
Upvotes: 2