Reputation: 410
How can I run lighthouse for multiple pages/URLs without having to run the lighthouse-ci command multiple times for every single page I want to test/audit?
lighthouse-ci --performance=80 --seo=80 --accessibility=80 --best-practices=80 <url-1>
lighthouse-ci --performance=80 --seo=80 --accessibility=80 --best-practices=80 <url-2>
lighthouse-ci --performance=80 --seo=80 --accessibility=80 --best-practices=80 <url-3>
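One common workaround is a small shell loop over the URLs. This is a minimal sketch (the URLs are placeholders, and the `echo` keeps it a dry run so you can inspect the generated commands first):

```shell
# Hypothetical URL list; replace with the pages you want to audit
urls="https://example.com/ https://example.com/about https://example.com/pricing"

for url in $urls; do
  # 'echo' makes this a dry run; remove it to actually execute the audits
  echo lighthouse-ci --performance=80 --seo=80 --accessibility=80 --best-practices=80 "$url"
done
```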
Upvotes: 3
Views: 7799
Reputation: 3215
Using PowerShell, you could call Invoke-WebRequest for each endpoint if you are building a solution in Node.js/Express.js. Just make sure the Invoke-WebRequest calls are executed sequentially.
Upvotes: 0
Reputation: 495
...chiming in late here, but you can now provide a list of URLs in your .lighthouserc config: https://github.com/GoogleChrome/lighthouse-ci/blob/main/docs/configuration.md#url
Example:
module.exports = {
  ci: {
    collect: {
      url: [
        'https://www.nytimes.com',
        'https://www.nytimes.com/interactive/2023/03/12/sports/natural-selection-snowboarding-travis-rice.html'
      ],
      // ...
    },
  },
};
Upvotes: 0
Reputation: 1451
I had the exact same issue. So I built a web tool for that: https://qualitycs.dev
It crawls your sitemap to find new pages, then runs Lighthouse on a regular schedule. Issues are listed per page and globally, meaning you can directly see site-wide issues such as caching, DNS, HTTPS, etc.
I'm actively working on it, so new features are coming.
Upvotes: -1
Reputation: 9211
I've had a lot of luck with Lighthouse Parade, another CLI package. It runs a Lighthouse report on all pages discoverable from the URL you provide:
npx lighthouse-parade http://www.dfwfreeways.com/
Pro tip: run it at night or give it a limit on the number of pages to run if you're not familiar with how many pages are on a domain :)
Upvotes: 1
Reputation: 914
I came across the same problem, and while looking for a good solution I found this nifty little package: lighthouse-batch.
All I had to do was run the following, passing URLs separated by commas:
lighthouse-batch -s https://www.url1.com,https://www.url2.com,https://www.url3.com
You also get the summary of all the sites passed in a single summary.json
file as well as a detailed report for each site under the file site_url.json
Upvotes: 1
Reputation: 1
Instead of the CLI, you can write a program that reads URLs from a file. You can use this for reference: https://github.com/gowthamraj198/Lighthouse
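The read-URLs-from-a-file idea can also be sketched directly in shell, without writing a full program. In this sketch, urls.txt and its contents are hypothetical, and the `echo` keeps it a dry run:

```shell
# Create a hypothetical URL list; in practice you would maintain this file yourself
printf '%s\n' "https://example.com/" "https://example.com/pricing" > urls.txt

# Read one URL per line and build a lighthouse-ci invocation for each
while IFS= read -r url; do
  # 'echo' keeps this a dry run; drop it to run the audits for real
  echo lighthouse-ci --performance=80 --seo=80 --accessibility=80 --best-practices=80 "$url"
done < urls.txt
```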
Upvotes: 0