Reputation: 24825
How can I (as reliably as possible) calculate the time when "above the fold" content is visually complete, including external CSS and fonts being applied and any images loaded?
Apologies, there is a lot to this question; I just wanted to show that I have worked on it and where I am at, so it doesn't end up as a "how do I do this" type question and get insta-closed!
I am attempting to work out whether all resources required to render "above the fold" content have been fully downloaded in the client browser.
This is part of a bigger goal of simulating SpeedIndex purely using browser APIs (i.e. not using a screenshot timeline).
Firstly, this doesn't have to be perfect (it is an approximation), but the more accurate I can make it the better!
The way I am doing it at the moment is using a PerformanceObserver to list all resources as they are downloaded. As soon as I get a 2 second window in which no requests have completed, I assume the critical CSS has been downloaded and start looking at the images on the page.
//getRects is the function to get all of the rectangles above the fold.
var rectInterval = setTimeout(getRects, 2500);
var xx = new PerformanceObserver(function (ll, p) {
    ll.getEntries().forEach(function (en) {
        //ignore calls to my own reporting endpoint
        if (en.name.indexOf(apiEndpoint) == -1) {
            if (en.entryType == "resource") {
                //any resource completing resets the timer while waiting for a quiet time
                clearTimeout(rectInterval);
                rectInterval = setTimeout(getRects, 2000);
            }
        }
    });
});
xx.observe({
    entryTypes: ['largest-contentful-paint', 'longtask', 'resource', 'paint', 'navigation', 'mark', 'measure', 'layout-shift', 'first-input']
});
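Note the observer only reports entries that complete after it is registered, so the "previous list of resources" I mention below is picked up separately. A rough sketch of what I mean (simplified, just to illustrate):
//Sketch only: snapshot the resource entries that completed before the observer started,
//so they can be merged with the ones reported by the PerformanceObserver above.
var previousResources = performance.getEntriesByType('resource').map(function (en) {
    return { url: en.name, end: en.responseEnd };
});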
In the getRects function I grab the dimensions of all images, and of elements with background images, using getBoundingClientRect(), and then calculate whether they appear above the fold using window.innerHeight etc. These are the candidates whose download times I check (along with the previous list of resources etc.).
var doc = window.document;
var browserWidth = window.innerWidth || doc.documentElement.clientWidth;
var browserHeight = window.innerHeight || doc.documentElement.clientHeight;

function checkRectangle(el) {
    var rtrn = false;
    if (el.getBoundingClientRect) {
        var rect = el.getBoundingClientRect();
        //the first two checks make sure the element actually has height and width,
        //the last four check whether it sits within the above the fold viewport.
        if (rect.bottom <= rect.top || rect.right <= rect.left || rect.right < 0 || rect.left > browserWidth || rect.bottom < 0 || rect.top > browserHeight) {
            rtrn = false;
        } else {
            rtrn = {};
            rtrn.bot = rect.bottom;
            rtrn.top = rect.top;
            rtrn.left = rect.left;
            rtrn.right = rect.right;
        }
    }
    return rtrn;
}
//function to get the rectangles above the fold
function getRects() {
    var rects = [];
    var elements = doc.getElementsByTagName('*');
    //no "g" flag - a global regex used with exec() would keep its lastIndex between elements
    var re = /url\(.*(http.*)\)/i;
    for (var i = 0; i < elements.length; i++) {
        var el = elements[i];
        var style = getComputedStyle(el);
        if (el.tagName == "IMG") {
            var rect = checkRectangle(el);
            if (rect) {
                //The URL is stored here for later processing where I match performance timings to the element, it is not relevant other than to show why I convert the `getBoundingClientRect()` to a simple object.
                rect.url = el.src;
                rects.push(rect);
            }
        }
        //I also need to check for background images set in either CSS or with inline styles.
        //getComputedStyle returns "none" rather than an empty string when there is no background image.
        if (style['background-image'] && style['background-image'] != 'none') {
            var rect = checkRectangle(el);
            if (rect) {
                var matches = re.exec(style['background-image']);
                if (matches && matches.length > 1) {
                    rect.url = matches[1].replace('"', '');
                    rects.push(rect);
                }
            }
        }
    }
    return rects;
}
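To show what I mean by matching performance timings to the elements (mentioned in the comment above), the idea is roughly as follows; the real processing happens later on the server, so treat this as a sketch only:
//Sketch only: pair each above the fold rect with its resource timing entry so that
//responseEnd can be used as an approximation of when that image finished downloading.
function matchRectsToTimings(rects) {
    return rects.map(function (rect) {
        var entries = performance.getEntriesByName(rect.url);
        rect.loadedAt = entries.length ? entries[entries.length - 1].responseEnd : null;
        return rect;
    });
}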
That bit is fine (although any tips to narrow the search so I am not looping over everything would be great), but my problem comes on a slow-loading website. If there is more than a 2 second gap between requests (which can happen on a particularly slow connection or if the server is a long way from the user) then I won't get complete data.
My workaround was to then monitor for further network requests (yet again waiting for a 2 second delay between requests) and re-run the function to gather the above the fold content. This obviously does not work well if the site uses lazy loading on scroll as requests can keep firing throughout the page lifecycle.
As gathering the dimensions of the elements can be quite CPU intensive on a very heavy page, coupled with the need to send this data to the server for analysis, I am trying to find a more robust way of ensuring all critical content is loaded, or a way to only fire getRects once while ensuring all initial loading is complete.
Presume that any manipulation of the data can be done later on the server, provided the payload is small enough (less than 1kb, say).
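For illustration, sending the data is nothing more exotic than something like this (a sketch only, reusing the apiEndpoint from the observer code above):
//Sketch only: ship the small rects payload to the server for analysis.
function sendRects(rects) {
    var payload = JSON.stringify(rects);
    if (navigator.sendBeacon) {
        navigator.sendBeacon(apiEndpoint, payload);
    } else {
        fetch(apiEndpoint, { method: 'POST', body: payload, keepalive: true });
    }
}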
One option I have considered is looping over all the <link>, <script> etc. elements and checking they have loaded (a rough sketch of what I mean is below). The problem comes with dynamically added links as well as external resources (i.e. stylesheets linked within another stylesheet). This would probably be more robust but would become very complex.
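To make that idea concrete, the check I had in mind is something like this (a sketch only, and it already shows the weakness: it only sees elements present in the DOM when it runs):
//Sketch only: check each stylesheet and script currently in the DOM against the
//resource timing entries to see whether its download has completed.
function criticalResourcesLoaded() {
    var nodes = document.querySelectorAll('link[rel="stylesheet"][href], script[src]');
    return Array.prototype.every.call(nodes, function (node) {
        var url = node.href || node.src;
        var entries = performance.getEntriesByName(url);
        return entries.length > 0 && entries[entries.length - 1].responseEnd > 0;
    });
}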
Another option is using a MutationObserver to monitor the page and yet again waiting for quiet time. However, as far as I can tell, this would get fired more often if the page had any interactivity?
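Again, just to illustrate the idea (the 2 second quiet period mirrors the resource approach above, and this is only a sketch):
//Sketch only: treat a 2 second period with no DOM mutations as "quiet" and then run getRects.
var mutationTimer = setTimeout(getRects, 2000);
var mo = new MutationObserver(function () {
    //any DOM change resets the timer, which is why interactivity would keep postponing it
    clearTimeout(mutationTimer);
    mutationTimer = setTimeout(getRects, 2000);
});
mo.observe(document.documentElement, { childList: true, subtree: true, attributes: true });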
Am I on the right track as a way of solving this conundrum, or is there some easy formula I can use based on window.performance data (or a similar API) that lets me say "all above the fold elements are loaded and rendered"?
I hope that is clear, but if you have any questions just ask, as I know there is a lot in this question beyond a simple "how do I check all critical resources have loaded".
Upvotes: 2
Views: 1255