The performance report in Google Lighthouse made the following recommendation:
"Avoid multiple, costly round trips to any origin"
It then proceeds to list 8 origins.
So, I added the following code to the top of the <head> section of my site, where the hrefs correspond to each origin.
<link rel='preconnect' href='https://connect.facebook.net' />
<link rel='preconnect' href='https://img.secureserver.net' />
<link rel='preconnect' href='https://advertise.bingads.microsoft.com' />
<link rel='preconnect' href='https://advertiseonbing.blob.core.windows.net' />
<link rel='preconnect' href='https://www.gstatic.com' />
<link rel='preconnect' href='https://js.calltrk.com' />
<link rel='preconnect' href='https://px.ads.linkedin.com' />
<link rel='preconnect' href='https://www.facebook.com' />
<link rel='preconnect' href='https://accounts.google.com' />
I also tried them with <link rel='dns-prefetch' /> and with combinations of that and other resource hints.
However, nothing changed in the Lighthouse report when I ran it again some time later. Could someone steer me in the right direction so I can resolve this issue and get one step closer to a perfect report?
Edit: I have a guess for why this might not make a difference in the Lighthouse report: the resources are requested in the document before the preconnect has even finished. So even though the connection technically starts a little sooner and saves some milliseconds, the rest of the document is so small that it requests those resources before the dns-prefetch or preconnect completes. Does that sound logical?
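One wrinkle worth checking (an assumption on my part, not something Lighthouse reported): preconnect only helps for connections the browser can actually reuse. Resources fetched with CORS (fonts, and some scripts) use a separate connection pool, so those origins need a second hint carrying the crossorigin attribute, roughly like:

```html
<!-- Plain preconnect: reused for normal (non-CORS) fetches -->
<link rel='preconnect' href='https://www.gstatic.com' />
<!-- CORS variant: needed if this origin's resources are fetched with CORS -->
<link rel='preconnect' href='https://www.gstatic.com' crossorigin />
<!-- dns-prefetch as a more widely supported fallback -->
<link rel='dns-prefetch' href='https://www.gstatic.com' />
```

Without the matching crossorigin form, the speculative connection may simply be discarded, which would also explain seeing no change in the report.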
Suppose I have this index.html:
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <link rel="prefetch" as="image" href="./assets/footer.jpg" />
    <script
      async
      type="text/javascript"
      src="https://code.jquery.com/jquery-3.3.1.js"
    ></script>
    <link rel="stylesheet" href="./style.css" />
  </head>
  <body>
    <script src="./utils.js"></script>
  </body>
</html>
According to this doc
async scripts have Low priority and CSS has Highest priority, so why is jquery-3.3.1.js downloaded before style.css when I press Ctrl+F5 to refresh the whole page? (You can also see footer.jpg at the end, due to the Lowest priority that prefetch gives it, but why doesn't the async script work the same way?)
click me to see the result
Could anyone tell me what's going wrong? My Chrome version: 107.0.5304.107
To me, the reasonable order should be:
localhost (Highest)
style.css (Highest)
utils.js (High)
jquery-3.3.1.js (Low)
footer.jpg (Lowest)
Priority is not the order in which resources are downloaded.
Priority is the order in which they start downloading when there are many resources.
Chrome generally uses 5 download threads, so if there are fewer than 6 resources they all start downloading at once, and they generally finish in an order that depends on their size.
That may differ only on very slow (<10 kB/s) networks; try throttling the network speed in DevTools to check what happens.
@Dimava and @Daniel W., thank you very much for your kind answers. I think I have found the answer :)
According to the doc here:
Chromium loads resources in 2 phases. “Tight mode” is the initial phase and constrains loading lower-priority resources until the body is attached to the document (essentially, after all blocking scripts in the head have been executed). In tight mode, low priority resources are only loaded if there are less than 2 in-flight requests at the time that they are discovered.
Because when the HTML parser sees the async script for jQuery there are no in-flight requests, it starts downloading at once even though it has Low priority, and thus starts downloading before style.css does.
Consider another situation:
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <link rel="prefetch" as="image" href="assets/footer.jpg" />
    <script src="utils.js"></script>
    <script src="another.js"></script>
    <script
      async
      type="text/javascript"
      src="https://code.jquery.com/jquery-3.3.1.js"
    ></script>
    <link rel="stylesheet" href="style.css" />
  </head>
  <body>
  </body>
</html>
And the result is here: we can see that when the HTML parser sees the async script for jQuery, there are two in-flight requests (for utils.js and another.js), so its download is held back until the body is attached to the document, and it therefore starts downloading after style.css does.
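Following that logic, one way to avoid the surprise in the first example would be to let the preload scanner discover the stylesheet before the async script, so the Highest-priority request is issued first simply because it is found first. A minimal sketch (my own reordering, not from the original question):

```html
<head>
  <!-- Discovered first: the stylesheet request is issued before anything else -->
  <link rel="stylesheet" href="./style.css" />
  <!-- The Low-priority async script is now discovered second -->
  <script async src="https://code.jquery.com/jquery-3.3.1.js"></script>
</head>
```

This only changes which request starts first; whether the async script is actually deferred still depends on the in-flight-request rule quoted above.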
I've had a problem with Facebook.com for a few days now. Every time I load https://www.facebook.com, it takes a few seconds and then displays a white page. The source of the page is this:
<!DOCTYPE html>
<html lang="en" id="facebook" class="no_js">
<head><meta charset="utf-8" /><script>function envFlush(a){function b(c){for(var d in a)c[d]=a[d];}if(window.requireLazy){window.requireLazy(['Env'],b);}else{window.Env=window.Env||{};b(window.Env);}}envFlush({"ajaxpipe_token":"AXgRWhMTFIL_juHq","lhsh":"KAQF0BXye","khsh":"0`sj`e`rm`s-0fdu^gshdoer-0gc^eurf-3gc^eurf;1;enbtldou;fduDmdldourCxO`ld-2YLMIuuqSdptdru;qsnunuxqd;rdoe-0unjdojnx0"});</script><script>CavalryLogger=false;</script><noscript><meta http-equiv="refresh" content="0; URL=/?_fb_noscript=1" /></noscript><meta name="referrer" content="origin-when-crossorigin" id="meta_referrer" /><link type="text/css" rel="stylesheet" href="https://static.xx.fbcdn.net/rsrc.php/v2/yf/r/rTRM3thxxyG.css" data-bootloader-hash="nVu0I" data-permanent="1" crossorigin="anonymous" />
<link type="text/css" rel="stylesheet" href="https://static.xx.fbcdn.net/rsrc.php/v2/yX/r/h3ydpK_uuri.css" data-bootloader-hash="zp4CG" data-permanent="1" crossorigin="anonymous" />
<link type="text/css" rel="stylesheet" href="https://static.xx.fbcdn.net/rsrc.php/v2/ym/r/T7fKvPc0GuV.css" data-bootloader-hash="2S8io" data-permanent="1" crossorigin="anonymous" />
<link type="text/css" rel="stylesheet" href="https://static.xx.fbcdn.net/rsrc.php/v2/yu/r/Ld1C_Jcgl5j.css" data-bootloader-hash="/JEly" data-permanent="1" crossorigin="anonymous" />
<script src="https://static.xx.fbcdn.net/rsrc.php/v2/yW/r/m6m6Y7RpsEs.js" data-bootloader-hash="5WJdo" crossorigin="anonymous"></script>
<script>require("TimeSlice").guard(function() {(require("ServerJSDefine")).handleDefines([["CSSLoaderConfig",[],{"timeout":5000,"loadEventSupported":true},619]]);new (require("ServerJS"))().handle({"require":[["Bootloader","loadEarlyResources",[],[{"RmDcU":{"type":"js","src":"https:\/\/static.xx.fbcdn.net\/rsrc.php\/v2\/yL\/r\/uKsBDyzJwJh.js","crossOrigin":1},"5zC\/8":{"type":"js","src":"https:\/\/static.xx.fbcdn.net\/rsrc.php\/v2\/y0\/r\/64jGxSfxJ36.js","crossOrigin":1},"pFHnJ":{"type":"js","src":"https:\/\/static.xx.fbcdn.net\/rsrc.php\/v2\/yp\/r\/K6ojr4ngQRr.js","crossOrigin":1},"P0eje":{"type":"js","src":"https:\/\/static.xx.fbcdn.net\/rsrc.php\/v2\/y2\/r\/Wn1AFwUyuPt.js","crossOrigin":1},"q\/AOh":{"type":"js","src":"https:\/\/static.xx.fbcdn.net\/rsrc.php\/v2\/yn\/r\/r40PYeaPQsy.js","crossOrigin":1}}]]]});}, "ServerJS define", {"root":true})();</script>
I tried deleting all cookies and history, emptying the cache, etc., and that allowed me to get to the login screen. When I log in using my account it gives me the white screen. However, I created a dummy account for this problem and logged in with that new, completely fresh account, and it worked fine...
Not too sure what's going on here since it seems to be account specific, and it's not like I've misused the site... Anyone have any insight into this?
Or just answer the question with a fix and be a little more human and compassionate. The fix by the way is simple - create a business FB profile at business.facebook.com, using the same email address as the affected FB account. Once in on that, change views to your personal profile and your account should be back working normally.
You will need to submit the code to Facebook, as it's an issue accessing their server. That's a timeout; try logging in with a VPN set to a different area. Sometimes going cross-platform from Instagram to Facebook will cause a similar issue if an authentication token isn't issued in a timely manner.
If that doesn't work, report the code to Facebook so they can fix the timeout error.
The favicon isn't working on my site. If I go to google.com/favicon.ico I see the image displayed. But for some reason on my site, lucasjohnson.co.uk/favicon.ico prompts me to download a file. I have even tried replacing my own favicon with Google's, but I still have the same problem.
Edit: The .ico file was converted from a PNG using Dynamic Drive's FavIcon Generator.
I checked http://www.lucasjohnson.co.uk/favicon.ico, and the response content-type is text/plain. There are several content types that can be used for favicon, but text/plain is not one of them.
The most common ones are image/x-icon and image/vnd.microsoft.icon.
So basically, just choose one of the following content types and add it to your link tag:
<link rel="icon" type="image/vnd.microsoft.icon" href="http://www.lucasjohnson.co.uk/favicon.ico" />
<link rel="icon" type="image/png" href="http://www.lucasjohnson.co.uk/favicon.ico" />
<link rel="icon" type="image/gif" href="http://www.lucasjohnson.co.uk/favicon.ico" />
<link rel="icon" type="image/x-icon" href="http://www.lucasjohnson.co.uk/favicon.ico" />
See http://en.wikipedia.org/wiki/Favicon for more details.
BTW, regardless of this problem, I've noticed that you're not closing one of your meta tags:
<meta name="viewport" content="width=device-width">
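A note on the type attribute: browsers largely treat it as a hint, and the underlying problem here is the server sending text/plain. Fixing the response header is the more robust fix. Assuming the site runs on Apache with .htaccess overrides enabled (an assumption; the post doesn't say which server is in use), a minimal sketch:

```apache
# .htaccess: serve .ico files with a favicon-appropriate MIME type
AddType image/x-icon .ico
```

On other servers the equivalent is a MIME-type mapping from the .ico extension to image/x-icon.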
I know this is a pretty common problem, but none of the solutions I have tried (and that's a lot) have worked.
I am trying to scrape http://residencyradio.com/ using https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fresidencyradio.com%2F
The site itself is getting a complete overhaul, this will be revealed next week and I want the relevant images, title and info to appear if someone links to the site, but instead at the moment these properties are being shown as they were the very first time the site was cached on FB (nearly a year ago).
As far as I can see, I have included all the relevant meta tags etc. as they should be. I even tried implementing a Like button on the site, but to no avail. I have followed what is set out at http://ogp.me/ and can't see anything wrong.
Here is a snippet of the page from the <!DOCTYPE> to </head>:
<!DOCTYPE html>
<html lang="en" prefix="og: http://ogp.me/ns#">
<head>
<meta charset=utf-8>
<title>The Residency</title>
<!--[if IE]>
<script src="http://html5shiv.googlecode.com/svn/trunk/html5.js">
</script>
<![endif]-->
<link href="css/reset.css" rel="stylesheet" type="text/css" />
<link href="css/stylesheet.css" rel="stylesheet" type="text/css" />
<link rel="icon" type="image/png" href="images/favicon.png" />
<!--Meta Data-->
<meta name="keywords" content="The Residency, M. Budden, Neal McClelland, Michael Budden, Radio, Residency Radio,
Residency, Global, House, Electro, Progressive, Tech, Techno, DnB, Drum and Base, Dubstep, iTunes, Belfast,
Northern Ireland, UK" />
<meta name="description" content="Brought to you by Neal McClelland and M. Budden, The Residency is a weekly global underground dance show" />
<meta property="og:title" content="The Residency A global underground dance show" />
<meta property="og:type" content="musician" />
<meta property="og:url" content="https://www.facebook.com/theresidency" />
<meta property="og:image" content="https://www.residencyradio.com/images/Residency_logo_circle.png" />
<meta property="og:site_name" content="The Residency" />
<meta property="fb:admins" content="1324839758" />
Any help would be greatly appreciated, as I've been scratching my head for a few days trying to figure it out!
Thanks in advance!
This is a guess, but your HTML is not valid, and maybe because of that the Facebook scraper fails to parse and extract the data from it.
I haven't gone through all of it, but you don't seem to close all tags.
For example, the description and keywords meta tags don't end with "/>" or ">".
Edit
Screen capture of what the debugger shows when I load your html from my server:
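One more thing worth checking (my own observation, separate from the answer above): og:url is supposed to be the canonical URL of the page being scraped. In the snippet it points at the Facebook page instead, which can make the scraper resolve the object to that URL rather than to the site itself. A hedged correction:

```html
<!-- og:url should be the canonical URL of this page, not a related profile -->
<meta property="og:url" content="http://residencyradio.com/" />
```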
It's been a couple of weeks now that some sites just keep hanging.
e.g.
Facebook => static.ak.fbcdn.net
Flickr => l.yimg.com
GoogleAnalytics
I've googled and found many problems like this, and some answers that are outdated or just don't solve the problem.
I did:
I cleared cookies, ran CCleaner, and tried several other nifty methods. None solved my problem.
Only with Facebook: if I manually enter https:// instead of http:// on every URL, it works; but every time it redirects back to http://, I have to type the 's' in the address bar again to make it https://.
It is driving me nuts because I'm developing a Facebook app, and this problem is being a real pain.
What might be the reason for this hanging behaviour from these CDNs?
Update: Mon Feb 8, 2010
Well when I viewed the source with firefox, this is the header part:
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/zDO0B/hash/8jpbog60.css" />
<link type="text/css" rel="stylesheet" href="http://static.ak.fbcdn.net/rsrc.php/zA96O/hash/8jqnsh63.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/z9X8U/hash/5zy5e7ns.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/z7XWB/hash/b881ctjq.css" />
<link type="text/css" rel="stylesheet" href="http://static.ak.fbcdn.net/rsrc.php/zEMLE/hash/6n3druoq.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/zEEQQ/hash/3et16vbl.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/zF0BN/hash/4ey03a8b.css" />
#<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/zD46U/hash/4ctxkmr7.css" />
<script type="text/javascript" src="http://b.static.ak.fbcdn.net/rsrc.php/z5KPU/hash/f92tjc5l.js"></script>
When I clicked each link, all of them opened with their contents except the last one, prefixed with the # sign.
So the URL http://b.static.ak.fbcdn.net/rsrc.php/zD46U/hash/4ctxkmr7.css is not opening, that CSS file is not downloaded, and the Facebook page looks horrible with everything left-aligned.
Update: Tue Feb 9, 2010
Today the link with the # sign just keeps hanging and looping:
<link type="text/css" rel="stylesheet" href="http://static.ak.fbcdn.net/rsrc.php/zEMLE/hash/6n3druoq.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/z9X8U/hash/5zy5e7ns.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/zF0BN/hash/4ey03a8b.css" />
<link type="text/css" rel="stylesheet" href="http://static.ak.fbcdn.net/rsrc.php/z1580/hash/4l5utauj.css" />
#<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/z4851/hash/532htj7z.css" />
<link type="text/css" rel="stylesheet" href="http://static.ak.fbcdn.net/rsrc.php/z1GEW/hash/dh01t0zv.css" />
<link type="text/css" rel="stylesheet" href="http://static.ak.fbcdn.net/rsrc.php/z80UK/hash/3a6o59ih.css" />
<link type="text/css" rel="stylesheet" href="http://b.static.ak.fbcdn.net/rsrc.php/zD46U/hash/4ctxkmr7.css" />
<script type="text/javascript" src="http://b.static.ak.fbcdn.net/rsrc.php/z5KPU/hash/f92tjc5l.js"></script>
Why is that URL on http://b.static.ak.fbcdn.net acting weird? Does Akamai have something to do with this?
It could be some kind of connectivity issue between you and the CDNs. Blocking them with adblock (or the hosts file) is an effective way forward, or there's a Firefox extension for it - Ghostery.
This could also be an issue with your ISP. I have seen cases in which certain domains take arduous amounts of time to access or, in some cases, cannot be accessed at all during peak hours. It could be that the domains you're having issues with are ones your ISP is having problems with. In the past I've experienced outages of Google as well as one or two of its subdomains.
Do you have the same problem from different computers on the same network?
I also have the same problem. I've not resolved the issue completely, but I found a hack fix:
Set the CDN domains to fixed IP numbers in your local computer's hosts file.
I added these lines to my Windows XP hosts file at c:\windows\system32\drivers\etc\hosts.
Other OSes have a similar file (Linux: /etc/hosts):
# CDN networks broken for Yahoo, Google and Facebook
217.212.252.78 static.ak.fbcdn.net
217.212.252.78 profile.ak.fbcdn.net
217.212.252.78 external.ak.fbcdn.net
217.212.252.78 creative.ak.fbcdn.net
217.212.252.78 platform.ak.fbcdn.net
217.212.252.78 l.yimg.com
To choose the fastest IPs for your location, visit just-ping.com.
I've not tried it but this trick might also work for Windows users:
http://www.updatexp.com/dns-windows-xp.html
I observed this same issue on several computers on our home LAN over a prolonged period and after many months, finally found a fix. It turned out that the intrusion detection rules in my Billion ADSL router were overly aggressive in blocking sites. This discussion thread revealed the issue and solution for my case.
http://forums.whirlpool.net.au/archive/1622370
You may find that your router has similar issues or that your firewall performs a similar intrusion detection function that needs tuning.
If you have a similar router or firewall feature, you can test if this is the case by clearing the 'intrusion detection' blacklist when you observe the blocking problem to see if it unblocks the site.
If you have a Billion modem/router, I'd also recommend updating to the latest firmware, which now includes the ability to add whitelist IPs that will permanently prevent IP blocking. This may or may not be helpful for CDN sites like Facebook that use large server farms (i.e., numerous IPs per domain name). What I think Billion needs to do here is support domain-name whitelisting rather than just IP whitelisting, so that the router can determine which IP you've been allocated for a given domain.