Reputation: 48758
I've written a little Node app that scrapes a website for its used CSS using Puppeteer and headless Chrome.
It works great, apart from one thing: it doesn't grab @media rules.
const puppeteer = require('puppeteer');
const util = require('util');
const fs = require('fs');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Set the viewport before navigating so the coverage run reflects it.
  await page.setViewport({ width: 320, height: 640 });
  await page.coverage.startCSSCoverage();
  await page.goto('http://localhost');
  const css_coverage = await page.coverage.stopCSSCoverage();
  console.log(util.inspect(css_coverage, { showHidden: false, depth: null }));
  await browser.close();

  let total_bytes = 0;
  let used_bytes = 0;

  // Write one output file per stylesheet, containing only its used ranges.
  for (const entry of css_coverage) {
    let final_css_bytes = '';
    total_bytes += entry.text.length;
    for (const range of entry.ranges) {
      // Ranges are [start, end) offsets into the stylesheet text.
      used_bytes += range.end - range.start;
      final_css_bytes += entry.text.slice(range.start, range.end) + '\n';
    }
    const filename = entry.url.split('/').pop();
    fs.writeFile('./' + filename, final_css_bytes, error => {
      if (error) {
        console.log('Error creating file:', error);
      } else {
        console.log('File saved');
      }
    });
  }
})();
Does anyone have any idea why @media rules are not included in the final output, when the page has lots of them?
Upvotes: 0
Views: 2103
Reputation: 48758
I reported this as a bug in Chrome, and it's been moved to Chromium:
https://bugs.chromium.org/p/chromium/issues/detail?id=983887
Hopefully one day soon this feature will be added.
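Until it lands, one possible workaround is to post-process the coverage output yourself: scan the original stylesheet text for the @media block (if any) that encloses each used range, and re-wrap the extracted rule in it. Below is a naive brace-counting sketch (the helper name is mine, and it ignores braces inside strings and comments, so treat it as a starting point rather than a robust parser):

// Hypothetical helper: returns the '@media ...' prelude that encloses
// `offset` in `text`, or null. Tracks unclosed blocks with a brace scan.
function enclosingMedia(text, offset) {
  const stack = [];
  for (let i = 0; i < offset; i++) {
    if (text[i] === '{') {
      // Recover the selector/at-rule text that opened this block.
      const head = text.slice(0, i);
      const start = Math.max(
        head.lastIndexOf('{'), head.lastIndexOf('}'), head.lastIndexOf(';')
      ) + 1;
      stack.push(text.slice(start, i).trim());
    } else if (text[i] === '}') {
      stack.pop();
    }
  }
  return stack.find(block => block.startsWith('@media')) || null;
}

Inside the range loop above, each slice can then be wrapped back in its media query:

const media = enclosingMedia(entry.text, range.start);
const rule = entry.text.slice(range.start, range.end);
final_css_bytes += media ? media + ' { ' + rule + ' }\n' : rule + '\n';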
Upvotes: 0
Reputation: 73
I believe this is the behavior of Chromium itself. If you check the code coverage in Chrome DevTools, the used CSS does not contain the definitions for media queries; only the rules inside them are reported. For example, from @media (max-width: 600px) { .nav { display: none; } } coverage keeps the .nav rule but drops the enclosing @media wrapper.
The same applies to font-faces and keyframes. There is even an issue in Chromium for font-face.
To extract the CSS used on a page, you can check out something like minimalcss.
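As a rough sketch of how that looks, based on minimalcss's documented minimize() API (the URL is a placeholder for your own page):

const minimalcss = require('minimalcss');

minimalcss
  .minimize({ urls: ['http://localhost'] }) // crawls with headless Chrome
  .then(result => {
    // result.finalCss is the minimal CSS, with media queries preserved.
    console.log(result.finalCss);
  })
  .catch(error => {
    console.error('Failed to minimize CSS:', error);
  });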
Upvotes: 1