How to find total DOM processing and page rendering time in Chrome

How do I find the total DOM processing and page rendering time in Chrome for a particular page (running locally)? I want to see the total time for the entire page, not the times for individual resources in the page.

Use the Timeline panel in the developer tools in Chrome (called the Performance panel in newer versions). Alternatively, you can use the Profiler if you want to watch memory usage, or the Network tab to view the downloading and rendering of the page.
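If you'd rather get a single number than dig through the DevTools UI, here is a minimal console sketch using the Navigation Timing API (run it after the page's load event has finished; values are in milliseconds):

// Overall DOM-processing and page-load times from the Navigation Timing API.
var t = performance.timing;
console.log('DOM processing: ' + (t.domComplete - t.domLoading) + ' ms');
console.log('DOMContentLoaded: ' + (t.domContentLoadedEventEnd - t.navigationStart) + ' ms');
console.log('Full page load: ' + (t.loadEventEnd - t.navigationStart) + ' ms');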

Related

Need Help: Can you identify the Desktop CLS (Cumulative Layout Shift) on my URL?

URL: tinyurl.com/36utmdnc
In Google Search Console, I am seeing thousands of errors for Core Web Vitals, particularly the CLS in Desktop mode. I've made a number of changes, such as removing all ads, tweaking the CSS, HTML and other code. I've been making changes for a year actually, but I've really tried tackling this problem over the past 2-3 months, since "Page Experience" is showing a big reduction in "good URLs" and I believe it's now affecting my traffic volume from Google Search. It's been weeks now and the CLS is not changing in Search Console. I've tried validating for a couple of months now.
The example URL above (a product page) has a CLS score of 0.33 according to Google Search Console / Core Web Vitals. It seems all of the errors are on my product pages like the example above. I've run tests in PageSpeed Insights, which show a CLS of "0". I understand the reports shown in Google Search Console come from the "Chrome User Experience Report", which is different from PageSpeed Insights' "lab setting".
Here are things I've done:
Opened Chrome Developer Tools and clicked the checkbox for "Layout Shift Regions", which flash blue during a layout shift. Also checked "Core Web Vitals" to enable the overlay that adds up CLS throughout the entire browsing experience.
Opened the Network tab, throttled the speed down to very slow settings, and carefully watched for layout shifts while the page loads slowly.
Carefully read the guides on web.dev/cls/, ran Chrome's Lighthouse test, and ran tests on various sites like webvitals.dev/cls, defaced.dev/tools/layout-shift-gif-generator/, webpagetest.org/webvitals, etc.
Manually tested using different screen resolution widths/heights using Chrome Developer Tools (800px width, 1400px width, 8000px height, etc).
Asked tech-savvy friends/users to also check and help identify the CLS.
I can't find anything that could cause a large CLS of 0.33. The number of CLS errors in Search Console is staying steady, going up and down by about 100 URLs every day, but the same example URL above has been stuck there for months. So I was hoping someone with knowledge could find it or identify the underlying issue.
Thanks
I cannot see a CLS score of 0.33 for that page, and agree that testing it myself shows very little CLS.
You are correct that PageSpeed Insights is lab-based, but it also shows the field-based data at the top, and in this case it is saying it does not have enough field-based data for this URL and so is displaying the origin-level data for all the pages for your site.
Google Search Console also shows field-based data, but groups pages it thinks might be similar and gives them all the same CLS scores. This grouping is somewhat finer-grained than the whole origin, so it may, for example, group all your product pages together if it has data for some of them.
This page, for example, has a hover effect that gives a huge CLS when it's used, though I can't quite see why, as I'm finding it difficult to trigger manually in dev tools.
It is possible those pages are the ones with the actual CLS issue and they are being lumped in with your good page under the product pages group. I would investigate if you can reduce the CLS on that hover effect.
Separately, I notice you are lazy loading your images but not specifying width and height on them. This can lead to CLS if the images are not loaded by the time that area scrolls into view, for example in this test when linking to a section near the bottom of the page. It is recommended to always include image dimensions on lazy-loaded images (and in fact on all images!) to avoid this. This could be another reason for your high CLS, which is not as evident in lab-based tools that typically only load the top of the page without lazy-loaded content.
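As a rough way to spot these, here is a console sketch (not part of the original answer) that lists lazy-loaded images missing explicit dimensions:

// List lazy-loaded images with no explicit width/height attributes,
// a common source of layout shift when they scroll into view.
document.querySelectorAll('img[loading="lazy"]').forEach(function (img) {
    if (!img.hasAttribute('width') || !img.hasAttribute('height')) {
        console.warn('Missing dimensions: ' + (img.currentSrc || img.src));
    }
});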
I also recommend you read Debugging Web Vitals in the Field and the more advanced Measure and debug performance with Google Analytics 4 and BigQuery for more information on why your field-based metrics may differ from what you can observe.
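For attributing CLS to specific elements while browsing the page yourself, a minimal sketch using the Layout Instability API (in browsers that support it) is:

// Log each layout shift and the nodes that moved, so CLS can be traced
// back to specific elements.
new PerformanceObserver(function (list) {
    list.getEntries().forEach(function (entry) {
        if (!entry.hadRecentInput) {
            var nodes = entry.sources ? entry.sources.map(function (s) { return s.node; }) : [];
            console.log('Layout shift:', entry.value, nodes);
        }
    });
}).observe({ type: 'layout-shift', buffered: true });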

Is there a way to view all images loaded in the browser using Chrome Developer Tools?

I know that I am supposed to be able to see this in the "Resources" section of Chrome developer tools. Maybe I am not getting something, but I don't see any images listed in this section.
Does anyone know if / how this can be accomplished?
Open Chrome Developer Tools (F12)
Go to Network tab
Enable Filter, if it’s not enabled
Select the "Img" tag to filter for image requests
Refresh the page to see a list of all images as they are requested
DevTools -> Application tab -> Frames in the sidebar -> Images
Contains all the images.
Images are listed specific to each frame. Normally there is only one frame on the page. If the page uses the iframe tag there will be others.
Expand "Frames" and then the first item listed (which is the page); below this is the category of images, as well as scripts and stylesheets. Note that the final item is the HTML of the page itself.
Another way is to use the new resource-type filter available in the Network panel (Chrome 87).
For images just type resource-type:image to focus on the network requests that are images.
Chrome Docs: https://developers.google.com/web/updates/2020/10/devtools#network-filters
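Beyond the DevTools UI, a small console sketch using the Resource Timing API can also list the images the page has requested so far (note it won't catch everything, e.g. CSS background images report a different initiator):

// List the URLs of resources the page requested via an <img> element.
performance.getEntriesByType('resource')
    .filter(function (r) { return r.initiatorType === 'img'; })
    .forEach(function (r) { console.log(r.name); });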
If you want to download images:
https://chrome.google.com/webstore/detail/download-all-images/ifipmflagepipjokmbdecpmjbibjnakm
Use the SiteSucker app for Mac or HTTrack Website Copier for Windows to get all content from the page you want, then select or sort the data however you need ))
Hope I helped you ;)

Mysterious severe performance issue on mobile Safari for just one web page

I have a very large (as in feature-rich) responsive website. It consists of over 150 different UI pages, and so far both rendering and performance on mobile are fine (I'm using an iPhone 5 to test, and occasionally other devices).
Except for one page, which I am coding now. Here's the temporary dev URL:
http://www.jungledragon.org/apps/jd3/daylight
On Mobile Safari, this page performs extremely poorly:
- It takes several seconds to load, much slower than all other pages
- Once loaded, a touch scroll can take 5-10 secs to do anything
- Mobile Safari as a whole becomes unresponsive, or close to it
I'm trying to troubleshoot the root cause of the issue, but no luck so far. I cannot reproduce this on any desktop browser using a small viewport, not even on desktop Safari. On the desktop, I've used several web debuggers to check for any long-running processes, but found none.
Some explanation on what the page does:
It will try to detect your current location (using alerts, I discovered this takes little time)
Based on your current location and the current date, it will calculate the sun times for the day. This too is nearly instant
Based on the suntimes, it will dynamically generate a table, and then finally show it on screen
Here's what I am seeing in detail on mobile Safari:
The server response is fine, the page loads quickly and shows the site header soon
Next, the content body is blank and stays blank for several seconds (which I cannot explain)
Finally, the suntimes table renders.
This completes the page, yet as of this point, the page as well as the browser are extremely sluggish, scrolling takes forever, and Safari controls are nearly unresponsive. It looks and feels as if the browser could crash at any moment.
Based on my research so far, and given fine performance in all other pages on the site, I'm totally in the dark on what causes this.
Edit: Using BrowserStack I did some more tests:
iPhone 4S: no issues
iPhone 5S: no issues
Galaxy SII: no issues
HTC One X: no issues
iPhone 5: same issue as above
So I'm not seeing the issue on any desktop browser, nor on any mobile device except the iPhone 5 (iOS 7).
Edit2: adding more findings and explanation based on comments received:
The issue does not seem animation-related. For this I have a number of proof points. A simple proof point is the page does not do any visual rendering that is much different from any of the other 100+ pages on the site which have no performance issue.
The 2nd proof point can be explained by understanding what is going on in this specific page. What happens is this:
The system will detect the current user's time and location. For now assume that the user actually allows location sharing. Using a simple alert, I've been able to prove that location detection is not the bottleneck.
Based on the user's time and location, the daylight periods are calculated. This is done by using the Suncalc JS library (https://github.com/mourner/suncalc).
The Suncalc library returns an array of daylight periods for the given date and location. I render that array as a table with colored background rows. That is all.
Rendering a table with 12 rows and different background colors is not likely to cause such enormous issues. My theory therefore is that step 2 is the root cause. The Suncalc library has a lot of advanced math in it. I am thinking (without evidence yet) that either my mobile processor is horrible at those kinds of operations, and/or the specific calculation for some reason causes a peak in memory usage (or even a leak).
As an additional proof point: once the page is loaded on mobile, use the right arrow next to the date to navigate to "tomorrow". Again you will see the extremely bad performance. During that step, there is no network activity, no location detection, nothing, just calculations and some very simple rendering. This validates my theory that perhaps the issue lies in the calculation.
Sadly, it looks like native JavaScript profilers on that platform are non-existent. You may also want to try the JavaScript Microtime function referenced in this answer. You will need to seed your script with calls at points where you think the bottleneck might be.
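A minimal sketch of seeding the script with timing calls, assuming the SunCalc step is the suspect (it falls back to Date.now() where performance.now() is unavailable, e.g. older Mobile Safari):

// Wrap the suspect calculation in high-resolution timestamps and surface
// the result somewhere visible on the device.
var now = (window.performance && performance.now)
    ? function () { return performance.now(); }
    : function () { return Date.now(); };

var t0 = now();
var times = SunCalc.getTimes(new Date(), latitude, longitude); // your existing call
var t1 = now();
alert('SunCalc took ' + (t1 - t0).toFixed(1) + ' ms');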
Just ran this through Chrome remote debugger (https://developers.google.com/chrome-developer-tools/docs/remote-debugging) on my S3, and it looks like Modernizr's cancelZoom function (showing up in jd3_0006.js) is getting called recursively too many times or by too broad a selector. I've uploaded the profiles into dropbox: https://www.dropbox.com/s/kubxk44smm6qqkx/jungledragon_debug..zip
You can import them into Chrome's debugger on the "Profiles" tab.
I believe your performance problem centers around the use of navigator.geolocation.getCurrentPosition() in your runMap() function:
if (urlDate != null) {
    urlPos(latitude, longitude);
} else {
    if (navigator.geolocation) {
        $(".img-loading").show(100); // show a loading indicator while waiting for a fix
        navigator.geolocation.getCurrentPosition(successPos, errorPos, { maximumAge: 600000, timeout: 10000 });
    } else {
        errorPos('');
    }
}
Consider using watchPosition() instead, with a callback, which will not halt processing of the script thread. You can cancel the watchPosition() updates by using clearWatch().
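A rough sketch of that approach, assuming the same successPos/errorPos callbacks as above and clearing the watch after the first usable fix:

var watchId = navigator.geolocation.watchPosition(function (pos) {
    navigator.geolocation.clearWatch(watchId); // stop watching after the first fix
    successPos(pos);
}, errorPos, { maximumAge: 600000, timeout: 10000 });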
So I've played with this some more, and ran the "Timeline" feature on Chrome (load this file into your chrome timeline tool: https://www.dropbox.com/s/2vpl6z1ntuk3aqj/TimelineRawData-20140328T105820.json), and it looks like this might be your main problem.
Your scripts and libs (including Google Maps and jQuery) are getting evaluated AFTER parsing the HTML and running Google Analytics, because they are at the bottom of the body, not in the head. Unless you have a very good reason to do that, I would recommend moving them to the head.
There seems to be a separate problem with scrolling, but perhaps it will be resolved by this change.

Facebook Comments - Loading Graphic While Waiting

I'm using the Facebook Comments plugin for a local travel website in an area with poor internet. When the net is slow often the comments plugin takes forever or does not load at all. When it doesn't load it is just blank space instead which makes the page look odd. I want users to know that there should be content there.
Any ideas for how I could display a LOADING graphic while waiting for the Facebook Comments plugin to load? And then maybe, if it hasn't loaded after 10 or 15 seconds, it returns an error message in that space instead.
I think a possibility might lie in the #Comments value that Facebook returns. Maybe a graphic could be loaded until that value is returned?
Found a solution elsewhere - maybe it will help someone else with a similar problem:
Just replace this snippet with the current one you have where you want the count to appear.
Place the gif image in the same folder as the HTML file, or change the path in the src="" to the correct path on your server.
You should see a loading animation while the count of comments from Facebook updates.
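As a rough illustration of the general idea (assuming the standard fb-comments embed, a hypothetical #comments-loading placeholder element, and an arbitrary 15-second timeout):

var loadingEl = document.getElementById('comments-loading');
loadingEl.style.display = 'block'; // show the loading gif

// Fall back to an error message if the plugin hasn't rendered in time.
var fallback = setTimeout(function () {
    loadingEl.textContent = 'Comments could not be loaded.';
}, 15000);

window.fbAsyncInit = function () {
    FB.init({ xfbml: true, version: 'v2.0' });
    // Fires once the XFBML plugins on the page (including comments) have rendered.
    FB.Event.subscribe('xfbml.render', function () {
        clearTimeout(fallback);
        loadingEl.style.display = 'none';
    });
};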

Remove Facebook Likebox on smaller screen / mobile devices

Is there a method to not display, and more importantly not load, the Facebook Likebox on devices with a certain screen size (for example, not on smartphones) or a certain data connection (not wifi)? The Likebox slows down the website significantly.
So the display:none; method is not an option.
Currently there's no way to (reliably) query for data connection type.
May I suggest a different approach? Perhaps you could load the likebox on demand using JavaScript, the way techcrunch.com or enter.co do it. Those sites load the social buttons only when you hover over some placeholder images.
This way, you optimize not only for mobile screens but for all devices. If you still want, you could check the viewport size before loading the buttons, or use display:none inside a media query so the placeholder images are not shown and the buttons can't be loaded.
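A minimal sketch of that idea (assuming a hypothetical #fb-like-placeholder element; the 768px threshold and SDK URL are just examples):

var placeholder = document.getElementById('fb-like-placeholder');
var loaded = false;

function loadLikebox() {
    if (loaded || window.innerWidth < 768) return; // skip on small viewports
    loaded = true;
    var js = document.createElement('script'); // inject the Facebook SDK only now
    js.src = 'https://connect.facebook.net/en_US/all.js#xfbml=1';
    js.async = true;
    document.body.appendChild(js);
}

placeholder.addEventListener('mouseover', loadLikebox);
placeholder.addEventListener('touchstart', loadLikebox);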
Why not a simple echo based on a conditional?
For connection you would get the users speed perhaps using a bit of code like this: http://www.emanueleferonato.com/2006/05/31/determine-connection-speed-with-php/
Then, with an if statement, if the speed is above say 56k, output the likebox code.
For mobiles, you would do the same after getting the browser's user agent.