How to prevent favicon.ico requests?

I don't have a favicon.ico, but my browser always makes a request for it.
Is it possible to prevent the browser from making a request for the favicon from my site? Maybe some meta tag in the HTML head?

I will first say that having a favicon in a Web page is a good thing (normally).
However, it is not always desired, and sometimes developers need a way to avoid the extra payload. For example, an IFRAME will request a favicon without ever showing it.
Worse yet, in Chrome and Android an IFRAME will generate 3 requests for favicons:
"GET /favicon.ico HTTP/1.1" 404 183
"GET /apple-touch-icon-precomposed.png HTTP/1.1" 404 197
"GET /apple-touch-icon.png HTTP/1.1" 404 189
The following uses a data URI and can be used to avoid these spurious favicon requests:
<link rel="shortcut icon" href="data:image/x-icon;," type="image/x-icon">
For references see here:
https://github.com/h5bp/html5-boilerplate/issues/1103
https://twitter.com/diegoperini/status/4882543836930048
UPDATE 1:
From the comments (jpic) it looks like Firefox >= 25 doesn't like the above syntax anymore. I tested on Firefox 27 and it doesn't work there, while it still works on WebKit/Chrome.
So here is the new one that should cover all recent browsers. I tested Safari, Chrome and Firefox:
<link rel="icon" href="data:;base64,=">
I left out the "shortcut" keyword from the "rel" attribute value since it's only needed for older IE, and versions of IE below 8 don't like data URIs either. Not tested on IE8.
UPDATE 2:
If you need your document to validate against HTML5 use this instead:
<link rel="icon" href="data:;base64,iVBORw0KGgo=">

Just add the following line to the <head> section of your HTML file:
<link rel="icon" href="data:,">
Features of this solution:
100% valid HTML5
very short
does not incur any quirks from IE 8 and older
does not make the browser interpret the current HTML code as favicon (which would be the case with href="#")

You can use the following HTML in your <head> element:
<link rel="shortcut icon" href="#" />
I tested this on a forced full refresh, and no favicon requests were seen in Fiddler. (tested against IE8 in compat mode as IE7 standards, and FF 3.6)
Note: this may download the html file twice, so while it works in hiding the error, it comes with a cost.

You can't. All you can do is to make that image as small as possible and set some cache invalidation headers (Expires, Cache-Control) far in the future. Here's what Yahoo! has to say about favicon.ico requests.
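For illustration, far-future caching headers in this spirit (the exact values below are just an example) keep favicon.ico in the browser cache so repeat visitors stop requesting it:
Cache-Control: public, max-age=31536000
Expires: Thu, 31 Dec 2037 23:59:59 GMT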

If you use nginx, return an empty 204 response and skip logging:
# skip favicon.ico
location = /favicon.ico {
access_log off;
return 204;
}

Put this into your HTML head:
<link rel="icon" href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAADElEQVQI12P4//8/AAX+Av7czFnnAAAAAElFTkSuQmCC">
This is a bit larger than the other answers, but does contain an actually valid PNG image (1x1 pixel white).

The easiest way to block these temporarily for testing purposes is to open Chrome's DevTools (right-click anywhere on the page and choose Inspect, or press Ctrl+Shift+J), go to the Network tab, and reload the page. This shows every request the page makes, including that annoying favicon.ico. You can then simply right-click the favicon.ico request and choose "Block request URL".
All of the above answers are for devs who control the app's source code. If you are a sysadmin who is figuring out load-balancer or proxy configuration and is annoyed by these favicon.ico shenanigans, this simple trick does a better job. This answer is for Chrome, but there should be a similar alternative you can figure out for Firefox/Opera/Tor/any other browser :)

You can use .htaccess or server directives to deny access to favicon.ico, but the server will send an access denied reply to the browser and this still slows page access.
You can stop the browser from requesting favicon.ico when a user returns to your site by getting it to stay in the browser cache.
First, provide a small favicon.ico image; it could be blank, but keep it as small as possible. I made a black-and-white one under 200 bytes. Then, using .htaccess or server directives, set the file's Expires header a month or two in the future. When the same user comes back to your site it will be loaded from the browser cache and no request will go to your server. No more 404s in the server logs, either.
If you have control over a complete Apache server, or maybe a virtual server, you can do this:
If the server document root is, say, /var/www/html, then add this to /etc/httpd/conf/httpd.conf:
Alias /favicon.ico "/var/www/html/favicon.ico"
<Directory "/var/www/html">
<Files favicon.ico>
ExpiresActive On
ExpiresDefault "access plus 1 month"
</Files>
</Directory>
Then a single favicon.ico will work for all the virtually hosted sites since you are aliasing it. It will be drawn from the browser cache for a month after the user's visit.
For .htaccess this is reported to work (not checked by me):
AddType image/x-icon .ico
ExpiresActive On
ExpiresByType image/x-icon "access plus 1 month"

A very simple solution is to put the code below in your .htaccess. I had the same issue and it solved my problem.
<IfModule mod_alias.c>
RedirectMatch 403 favicon.ico
</IfModule>
Reference: http://perishablepress.com/block-favicon-url-404-requests/

Elaborating on previous answers, this might be the shortest solution from the HTML file itself:
<link rel="shortcut icon" href="data:" />
Tested working, no error messages or failed requests on Chrome Version 94.0.4606.81

Just keep it simple with:
<link rel="shortcut icon" href="#" type="image/x-icon">
It displays nothing!!!!

In Node.js, you can send the icon hint as an HTTP Link header (note the Link header syntax: <target>; rel="..."):
res.writeHead(200, {'Content-Type': 'text/plain', 'Link': '<data:,>; rel="icon"'});
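For context, here is a minimal, self-contained sketch of that idea using Node's built-in http module (no framework assumed). Browsers are not guaranteed to honor a Link header for icons, so the in-page <link> tag remains the more reliable approach:
// minimal Node.js server advertising an empty data: URI as the icon via a Link header
const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/html',
    'Link': '<data:,>; rel="icon"'
  });
  res.end('<!DOCTYPE html><title>No favicon</title><p>Hello</p>');
}).listen(3000);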

Personally I used this in my HTML head tag:
<link rel="shortcut icon" href="#" />

I needed to prevent the request AND still have an icon displayed, e.g. in Chrome.
Quick code to try in <head>:
<link rel="icon" type="image/png" sizes="16x16" href="data:image/png;base64,
iVBORw0KGgoAAAANSUhEUgAAABAAAAAQBAMAAADt3eJSAAAAMFBMVEU0OkArMjhobHEoPUPFEBIu
O0L+AAC2FBZ2JyuNICOfGx7xAwTjCAlCNTvVDA1aLzQ3COjMAAAAVUlEQVQI12NgwAaCDSA0888G
CItjn0szWGBJTVoGSCjWs8TleQCQYV95evdxkFT8Kpe0PLDi5WfKd4LUsN5zS1sKFolt8bwAZrCa
GqNYJAgFDEpQAAAzmxafI4vZWwAAAABJRU5ErkJggg==" />

In our experience, Apache was falling over on requests for favicon.ico, and commenting out extra headers in the .htaccess file fixed it.
For example we had
Header set X-XSS-Protection "1; mode=block"
... but we had forgotten to run sudo a2enmod headers beforehand. Commenting out the extra headers being sent resolved our favicon.ico issue.
We also had several virtual hosts set up for development, and it only failed with a 500 Internal Server Error when using http://localhost and fetching /favicon.ico. If you run "curl -v http://localhost/favicon.ico" and get a warning about the host name not being in the resolver cache, or something to that effect, you might experience problems.
The fix could be as simple as not fetching the favicon at all (we tried that and it didn't help, because our root cause was different), or it may mean hunting for directives in apache2.conf or .htaccess that are causing strange 500 Internal Server Error responses.
It failed so quickly that there was nothing useful in Apache's error logs whatsoever, and we spent an entire morning changing small things here and there until we realized the problem was setting extra headers without having mod_headers loaded!
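As a sanity check (assuming a Debian/Ubuntu-style Apache install; commands are illustrative), you can confirm mod_headers is actually loaded before relying on Header directives:
# list loaded Apache modules and look for headers_module
apache2ctl -M | grep headers
# if it is missing, enable it and reload Apache
sudo a2enmod headers && sudo systemctl reload apache2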

Sometimes this error appears when the HTML contains commented-out code and the browser tries to fetch something from it. In my case I had commented-out code for a web form in Flask, and I was getting this.
After spending two hours I fixed it in the following ways:
1) I created a new Python environment, and it then threw an error on the commented HTML line; before that, the only error shown was 'GET /favicon.ico HTTP/1.1" 404'.
2) Sometimes, when I had duplicate code, such as a Python file existing with the same name, I also saw this error; try removing those too.

If you are not writing the HTML yourself and it's auto-generated by Flask or some other framework, you can always add a dummy route to the app that returns placeholder text to fix this issue.
Or ... you can just add the favicon :)
E.g. for a Python Flask application:
@app.route('/favicon.ico')
def favicon():
    return 'dummy', 200
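A variant of this (still assuming Flask) that avoids sending any body at all is to return an empty 204 No Content response:
@app.route('/favicon.ico')
def favicon_no_content():
    # 204 keeps the access log clean without serving any favicon bytes
    return '', 204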

I solved this problem by using the Content-Security-Policy HTTP response header. By using it, you can block the browser from making further media requests such as images (other types are possible as well). I added the following header to the response:
Content-Security-Policy: img-src 'none'
The problem is that it will block ALL image requests. If your HTML has any images, they won't be loaded. In my case it was very likely a bug in Firefox, because the browser was requesting favicon.ico for a response whose Content-Type was text/xml!
It also depends on the browser implementing this feature, since it is enforced on the client side.
Check https://content-security-policy.com for a complete guide on CSP.
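For reference, here is a minimal sketch of setting that header from an nginx front end (the /feeds/ location is hypothetical); the same caveat applies, since img-src 'none' blocks every image the page references, not just the favicon:
# hypothetical nginx location serving XML-only responses
location /feeds/ {
    add_header Content-Security-Policy "img-src 'none'";
}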
Cheers!

You could use
<link rel="shortcut icon" href="http://localhost/" />
That way it won't actually be requested from the server.

Related

Favicon is not visible

I can not see the favicon in any browser at https://www.example.com
I tried the following implementations:
<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico">
and
<link rel="icon" type="image/x-icon" href="/favicon.ico">
The favicon is in root-folder.
I also checked the Chrome Debugger and it gives me a "200 status ok" for favicon.ico
I tried checking it from multiple places (Work, Home, Smartphone...) - can't see it anywhere.
I also used one of these SEO-Checking sites to check if there is a favicon.
They tell me that there is one, but when they try to show me the favicon they found, it is simply not visible; there is a browser "image not found" symbol instead.
I started the implementation more than 24 hours ago, so I'm not sure if it's a caching problem.
Any ideas?
If you look closely, you are getting a 200 OK status on the favicon, but the type is set to html, not an image.
Also, if you visit https://www.appfelsine.com/favicon.ico, you get an HTML page and not the icon image.
It might have something to do with your .htaccess, perhaps? Try storing the favicon in your images folder and changing the path to see if it works.
Your 404 page returns a 200 (instead of a 404), so all those SEO sites see the 200 success and assume the favicon is loaded. Your favicon is not in the correct directory. To confirm this you can open up the page source and click the link to the favicon URL and see that it's returning a 404 page. I would recommend putting the favicon in the same folder as other images and linking to it there.
It might be that the image is corrupted. My suggestion is to convert it to PNG. Quite likely you would also get a performance gain that makes this solution worth it.
To convert it:
http://image.online-convert.com/convert-to-png
Use then:
<link rel="icon"
type="image/png"
href="/favicon.png">

Facebook Page Tab external URL

What should an HTML file contain in order to be displayed on a Facebook Page's tab? I have added the tab, but I get a blank page. I have tried everything and searched for hours; my URL is HTTPS, so I really don't know...
Could anybody paste full HTML code that works and appears in a Page's tab with an app? Thanks!
This would be the minimum; it definitely works:
<!DOCTYPE html>
<html>
<head>
</head>
<body>
hello there
</body>
</html>
As you can see, nothing fancy there. Without any test URL, we can only tell you to open the browser console and search for errors.
We found the answer: the server doesn't allow iframing, so what I tried was fine, but the browser console showed a JavaScript error:
Load denied by X-Frame-Options: https://www.... does not permit cross-origin framing.
So we contacted our hosting company to solve this.
UPDATE: Hosting support added the following directive to my .htaccess file: Header always unset X-Frame-Options, which solved my problem.
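For reference, in .htaccess that directive would look roughly like this (mod_headers must be enabled for it to have any effect):
<IfModule mod_headers.c>
    # remove the X-Frame-Options header so the page can be framed by Facebook
    Header always unset X-Frame-Options
</IfModule>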

Site not valid - but it is

So, I'm building a website called "dagbok.nu", which is Swedish for "diary now" :)
Anyway, when creating the Facebook application, it claims that the site URL is invalid as well as the app domain. For site url, I used "http://dagbok.nu" and for site domain, I used "dagbok.nu". Please don't reply (as I've seen others do on similar issues) that I should type the site url with the scheme and the domain without - that's exactly what I'm doing.
Right, so according to another question here, one could trouble shoot this functionality using FB's own URL scraper, so I did just that:
http://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fdagbok.nu
And the reply: Error Parsing URL: Error parsing input URL, no data was scraped
Right, so now I can assume that the reason for it being considered invalid is because of FB not being able to scrape the URL. But why?
According to this question, one of the reasons seems to be that FB has deemed the URL insecure or "spammy". I've acquired this domain from a previous owner so this wasn't all that impossible. But when doing the same thing as Matthew in that post - i.e. trying to post in my timeline using the domain "http://dagbok.nu", I didn't get any information. The status box expanded as if to include a thumbnail and information about the link, but it only contained a "(No title)" text and nothing more.
So now I don't know what to do. I've tried to check the DIG and NS records from multiple servers around the web, and everyone seems to resolve it correctly, and I've had friends double check the URL from the states as well. I can't understand what's wrong and I have no idea how to ask someone at FB how to resolve this. Does anyone here have a good advice for this? Thanks in advance! :)
EDIT
When changing the domain to another domain that points to the exact same web server and document_root, it works! So this is definitely a problem with the domain "dagbok.nu" and not with the code on that page.
EDIT
When using the debug function above - I see no activity in the server log what so ever. Facebook doesn't even contact the server. When using the alternate url - the one from the last edit, it pops up in the logs as it should.
EDIT
I filed a bug report with Facebook, and their first response was that they were going to follow up. Now, a month later, I got an email that said "We are prioritizing bugs based on impact to the developer community. As this bug report has not received much attention from other developers, we are closing it so as to better focus on the top issues", and then they told me to come here to Stack Overflow to try to solve my issue - but the issue is WITH THEM, and of course no one else has reported that my site doesn't work; it affects only me, and I haven't launched it yet because of this bug!
EDIT
I wanted to file a new bug report, but I can't even do that now, since they are blocking bug reports with this URL as well!
I had to edit the URL - here is the new bug report
When Facebook tries to scrape your site for information, it sends a request to your server with a specific user agent called "facebookexternalhit"...
Facebook needs to scrape your page to know how to display it around the site. Facebook scrapes your page every 24 hours to ensure the properties are up to date. The page is also scraped when an admin for the Open Graph page clicks the Like button and when the URL is entered into the Facebook URL Linter. Facebook observes cache headers on your URLs - it will look at "Expires" and "Cache-Control" in order of preference. However, even if you specify a longer time, Facebook will scrape your page every 24 hours.
The user agent of the scraper is: "facebookexternalhit/1.1(+http://www.facebook.com/externalhit_uatext.php)"
Make sure it is not blocked by your server firewall.
Look in your server log to see whether it even tried to access your site.
If you think this is a firewall issue, look at this link.
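One quick way to check (assuming curl is available; substitute your own URL) is to request the page yourself with the scraper's user agent and see whether your server answers normally:
curl -v -A "facebookexternalhit/1.1(+http://www.facebook.com/externalhit_uatext.php)" http://dagbok.nu/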
Your problem appears to be with your character encoding string. Your Apache server is currently sending the unsupported string latin1, while you've defined your meta content-type as iso-8859-1. See the W3C validator.
From what I've seen, the Facebook parser will stop immediately if it encounters either an unrecognized character encoding string or a mismatch in character encoding strings between your header and meta tags.
The problem could be originating from either your httpd.conf or php.ini files. Change these to match your meta and restart Apache. Since the problem seems to be domain-specific, I'd check httpd.conf first.
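As an illustration (assuming the page really is meant to be ISO-8859-1), the settings to reconcile might look like this; make httpd.conf and php.ini agree with your meta tag and then restart Apache:
# httpd.conf (or a vhost / .htaccess with AllowOverride enabled)
AddDefaultCharset ISO-8859-1
; php.ini
default_charset = "ISO-8859-1"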
Could your domain be blacklisted? Could you try messaging your URL to someone and see if Facebook gives you a "This message contains blocked content..." error?
If you don't provide certain minimum Facebook markup on your page, it will respond with "Error Parsing URL: Error parsing input URL, no data was scraped." I only looked at the homepage, but it appears that dagbok.nu contains no Facebook markup. I'm not sure what things must be present at minimum, but in my implementation, I assume the fb:app_id meta tag and the JavaScript SDK script must be there. You may want to take a look at http://developers.facebook.com/docs/guides/web/#plugins , particularly the Authentication section.
I discovered your question because I had this same error today for an unknown reason. I found that it was caused because the content of my og:image meta tag used an incorrect URL to the image I was trying to use. So as you add Facebook markup to your page, make sure your values are correct or you may continue to receive this message.
This doesn't seem to be a Facebook problem if you take a look at what I've discovered.
Testing it with the W3C online validation tool gives one of two results.
Tested using dagbok.nu; note that http://dagbok.nu shows no difference in test results. Remove the trailing forward slash between tests.
Test: 1
Results: 72 Errors 0 Warning
Note: Shown here is a fragment of the source Frameset DOCTYPE webpage.
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Frameset//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-frameset.dtd">
<NOSCRIPT><IMG SRC="http://svs.bystorm.se/rv?java=off"></NOSCRIPT><SCRIPT SRC="http://svs.bystorm.se/rvj"></SCRIPT>
<HTML STYLE="height:100%;">
<HEAD>
<META HTTP-EQUIV="content-type" CONTENT="text/html;charset=iso-8859-1">
Test: 2
Results: 4 Errors 1 Warning
Note: Shown here is a fragment of the source Transitional DOCTYPE webpage.
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html >
<head>
<title>Dagbok: Framsida</title>
<meta http-equiv="content-type" content="text/html; charset=iso-8859-1">
<meta name="author" content="Jonas Eklundh Communication (http://jonas.eklundh.com)">
<meta name="author-email" content="jonas#eklundh.com">
<meta name="copyright" content="Jonas Eklundh Communication #2012">
<meta name="keywords" content="Atlas,Innehållssystem,Jonas Eklundh">
<meta name="description" content="">
<meta name="creation-time" content="0,079s">
<meta name="kort" content="DGB">
Repeated tests alternate between these results when run a couple of seconds apart, indicating that a page redirect is occurring.
Security warnings are seen in Firefox and Chrome when visiting your site using these secure URL's:
https://dagbok.nu
https://www.dagbok.nu
The browser indicates the site should not be trusted because it's impersonating another site using an invalid security certificate from *.loopiasecure.com.
Recommendation: check your .htaccess file, CMS settings, page redirection, and security settings. Use the source fragments above to identify which files are being served and discover what's set incorrectly.
Once that's done, I think Facebook will be happy to then debug your webpage and provide additional recommendations.
I had the same problem and discovered it was an incorrect IPv6 address in the AAAA records for my domain. The IPv4 record was correct, so the site worked in a browser, but FB obviously checks the IPv6 records!
This issue may also happen when Cloudflare is used. This is because Cloudflare protects the page from Facebook, which is then unable to collect the data, which in turn makes Facebook think the page is invalid.
My fix was:
Turn off Cloudflare for the page.
Scrape the page using Facebook's Dev Tools: https://developers.facebook.com/tools/debug/og/object
Click the "Fetch new scrape information" button and let it run.
Re-enable Cloudflare protection for the page.
You should then be able to continue to add the page where you needed.

FB OpenGraph og:image not pulling images (possibly https?)

Facebook cannot grasp my og:image files and I have tried every usual solution. I'm beginning to think it might have something to do with https://...
I have checked http://developers.facebook.com/tools/debug and have zero warnings or errors.
It is finding the images we linked to in the "og:image", but they're showing up blank. When we click the image(s), however, they DO exist and it takes us straight to them.
It DOES show one image -- an image hosted on a non-https server.
We've tried square images, jpegs, pngs, larger sizes and smaller sizes. We've put the images right in public_html. Zero are showing up.
It's not a caching error, because when we add another og:image to the meta, FB's linter does find and read that. It DOES show a preview. The preview is blank. The only exception we're getting is for images that are not on this website.
We thought maybe there was some anti-leech protection in cPanel or the .htaccess that was preventing the images from showing up, so we checked. There was not. We even did a quick < img src="[remote file]" > on an entirely different server and the image shows up fine.
We thought maybe it was the og:type or another oddity with another meta tag. We removed all of them, one at a time and checked it. No change. Just warnings.
The same code on a different website shows up without any issue.
We thought maybe it was not pulling images because we're using the same product page(s) for multiple products (changing it based on the get value, ie, "details.php?id=xxx") but it's still pulling in one image (from a different url).
Leaving any og:image or image_src off, FB does not find any images.
I am at the end of my rope. If I said how much time myself and others have spent on this, you'd be shocked. The issue is that this is an online store. We absolutely, positively cannot NOT have images. We have to. We have ten or so other sites... This is the only one with og:image problems. It's also the only one on https, so we thought maybe that was the problem. But we can't find any precedent anywhere on the web for that.
These are the meta-tags:
<meta property="og:title" content="[The product name]" />
<meta property="og:description" content="[the product description]" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-art-black.png" />
<meta property="og:image" content="http://www.[ADIFFERENTwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/ARShopHeader.png" />
<meta property="og:image" content="http://www.[ourwebsite].com/overdriven-blues-music-tshirt-art-black.JPG" />
<meta property="og:type" content="product"/>
<meta property="og:url" content="https://www.[ourwebsite].com/apparel-details.php?i=10047" />
<meta property="og:site_name" content="[our site name]" />
<meta property="fb:admins" content="[FB-USER-ID-NUMBER]"/>
<meta name="title" content="[The product name]" />
<meta name="description" content="[The product description]" />
<link rel="image_src" href="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta name="keywords" content="[four typical keywords]">
<meta name="robots" content="noarchive">
In case you want it, here's a link to one of our product pages that we've been working on. [Link shortened to try to curb this getting into search results for our site]: http://rockn.ro/114
EDIT ----
Using the "see what facebook sees" scraper tool, we were able to see the following:
"image": [
{
"url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-details-safari.png"
},
{
"url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-art-safari.png"
},
{
"url": "http://www.[theotherNONSECUREwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png"
}
],
We tested all links it found for a single page. All were perfectly valid images.
EDIT 2 ----
We tried a test and added a subdomain to the NONSECURE website (from which images are actually visible through facebook). Subdomain was http://img.[nonsecuresite].com. We then put all images into the main subdomain folder and referenced those. It would not pull those images into FB. However, it would still pull any images that were referenced on the nonsecure main domain.
POSTED WORKAROUND ----
Thanks to Keegan, we now know that this is a bug in Facebook. To workaround, we placed a subdomain in a different NON-HTTPS website and dumped all images in it. We referenced the coordinating http://img.otherdomain.com/[like-image.jpg] image in og:image on each product page. We then had to go through FB Linter and run EVERY link to refresh the OG data. This worked, but the solution is a band-aid workaround, and if the https issue is fixed and we go back to using the natural https domain, FB will have cached the images from a different website, complicating matters. Hopefully this information helps to save someone else from losing 32 coding hours of their life.
Some properties can have extra metadata attached to them. These are specified in the same way as other metadata, with property and content, but the property name will have an extra ':' in it.
The og:image property has some optional structured properties:
og:image:url - Identical to og:image.
og:image:secure_url - An alternate URL to use if the webpage requires HTTPS.
og:image:type - A MIME type for this image.
og:image:width - The number of pixels wide.
og:image:height - The number of pixels high.
A full image example:
<meta property="og:image" content="http://example.com/ogp.jpg" />
<meta property="og:image:secure_url" content="https://secure.example.com/ogp.jpg" />
<meta property="og:image:type" content="image/jpeg" />
<meta property="og:image:width" content="400" />
<meta property="og:image:height" content="300" />
So for your HTTPS URLs you need to change the og:image property to og:image:secure_url.
Ex:
HTTPS META TAG FOR IMAGE:
<meta property="og:image:secure_url" content="https://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
HTTP META TAG FOR IMAGE:
<meta property="og:image" content="http://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
Source: http://ogp.me/#structured <-- You can visit this site for more information.
EDIT: Don't forget to ping facebook servers after updating your code - URL Linter
I ran into the same problem and reported it as a bug on the Facebook developer site. It seems pretty clear that og:image URIs using HTTP work just fine and URIs using HTTPS do not. They have now acknowledged that they are "looking into this."
Update: As of 2020, the bug is no longer visible in Facebook's ticket system. They never responded and I don't believe this behavior has changed. However, specifying an HTTPS URI in og:image:secure_url does seem to be working fine.
I don't know if it's only me, but og:image did not work for me and Facebook picked my site logo, even though the Facebook debugger showed the correct image.
Changing og:image to og:image:url worked for me. Hope this helps anybody else facing a similar issue.
tl;dr – be patient
I ended up here because I was seeing blank images served from a https site. The problem was quite a different one though:
When content is shared for the first time, the Facebook crawler will scrape and cache the metadata from the URL shared. The crawler has to see an image at least once before it can be rendered. This means that the first person who shares a piece of content won't see a rendered image
[https://developers.facebook.com/docs/sharing/best-practices/#precaching]
While testing, it took facebook around 10 minutes to finally show the rendered image. So while I was scratching my head and throwing random og tags at facebook (and suspecting the https problem mentioned here), all I had to do was wait.
As this might really stop people from sharing your links for the first time, FB suggests two ways to circumvent this behavior:
a) running the OG Debugger on all your links: the image will be cached and ready for sharing after ~10 minutes, or b) specifying og:image:width and og:image:height (a sketch is shown at the end of this answer; read more in the link above).
Still wondering though what takes them so long ...
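For reference, option (b) amounts to adding tags along these lines (the URL and dimensions here are placeholders; they should match your actual image):
<meta property="og:image" content="https://example.com/preview.jpg" />
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />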
Got here from Google but this wasn't much help for me. It turned out that there is a minimum aspect ratio of 3:1 required for the logo. Mine was almost 4:1. I used Gimp to crop it to exactly 3:1 and voila - my logo is now shown on FB.
I had similar problems. I removed the property="og:image:secure_url" and now it will scrub with just og:image. Sometimes, less is more
I had the same error and none of the previous suggestions helped, so I followed the original Open Graph protocol documentation, added the prefix attribute to my html tag, and everything became awesome.
<html prefix="og: http://ogp.me/ns#">
As I accidentally found, the transparent blank image comes with a response header indicating the possible cause of the problem.
Go to the debugger at https://developers.facebook.com/tools/debug/og/object/
Put your URL
At the bottom, Facebook shows your "image" (a transparent 1x1 GIF)
The image is linked to your original image - no point pressing it
Right-click and view the image (you'll get something like https://external-ams3-1.xx.fbcdn.net/safe_image.php?d=...&url=...)
Turn on the Net tab in Firebug/developer tools; refresh the page if needed
You'll see an x-error-detail response header with an explanation
For example, in my case it was Invalid image extension for URL: https://[mydomain]/[myfilename].jpg
The real issue in my case was related to prerender.io.
As it turns out, if an image is requested via prerender, it gets wrapped in HTML. Something like this:
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" "http://www.w3.org/TR/REC-html40/loose.dtd">
<html>
<head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head>
<body style="margin: 0px;"><img style="-webkit-user-select: none; cursor: -webkit-zoom-in; " src="https://[yourdomain].com/[yourfilename].jpg" width="1078" height="718"></body>
</html>
It's either a bug in prerender itself, or your proxy is supposed to be configured not to use prerender for *.jpg requests (even when they are requested by the Facebook bot).
It's really hard to notice this, as prerender is used only on certain user-agent headers.
In my case the problem was not providing the CA root certificate. I figured it out after using https://www.ssllabs.com/ssltest/analyze.html to analyze the SSL configuration.
I took http:// out of my og:image and replaced it with just plain old www., and then it started working fine.
You can use this tool by Facebook to reset your image scrape cache and test which URL it is pulling for the preview image.
I have a Wordpress site that uses og:image with an https URL to the image and the images show up just fine in Facebook preview links.
I have another site I was working on that uses og:image with an https URL and sometimes the images would appear and sometimes they wouldn't. I tried the suggestions on this page, using og:image:url and og:image:secure_url and neither one made any difference, the image wouldn't be used for the preview.
Both sites have valid https certificates, so it wasn't a certificate problem.
After searching some more I found out that Facebook has a MINIMUM SIZE for images. If the og:image is less than 200x200px it will not be used by Facebook. The recommended size is 600x600px for stories and 1200x630px for everything else.
I scaled up the image sizes on my second site and they started appearing on Facebook. Mystery solved.
Hope you find this useful.
I discovered another scenario that can cause this issue. I went through all the steps described in the question and the answers, still the problem remained.
I checked my images and found that some of my posts had way too large thumbnail images in og:image in the range of several thousand pixels and several megabytes.
This happened due to a recent migration from WP to Jekyll; I had optimized my images with gulp, but used the original images in og:image by mistake.
Facebook gives us the following recommendations as of today:
Use images that are at least 1200 x 630 pixels for the best display on high resolution devices. At the minimum, you should use images that are 600 x 315 pixels to display link page posts with larger images. Images can be up to 8MB in size.
So there is an upper limit of 8MB.
I ran into the same issue and then I noticed that I had a different domain for the og:url
Once I made sure that the domain was the same for og:url and og:image it worked.
Hope this helps.
Similar symptoms (Facebook et al not correctly fetching og:image and other assets over https) can occur when the site's https certificate is not fully compliant.
Your site's https cert may seem valid (green key in the browser and all), but it will not scrape correctly if it's missing an intermediate or chain certificate. This can lead to many wasted hours checking and rechecking all the various caches and meta tags.
This might not have been your problem, but it could be others' with similar symptoms (like mine). There are many ways to check your cert - the one I happened to use: https://www.sslshopper.com/ssl-checker.html
I can see that the Debugger is retrieving 4 og:image tags from your URL.
The first image is the largest and therefore takes the longest to load.
Try shrinking that first image down or changing the order so a smaller image comes first.
In addition, this problem also occurs when you add a user generated story (where you do not use og:image). For example:
POST /me/cookbook:eat?
recipe=http://www.example.com/recipes/pizza/&
image[0][url]=http://www.example.com/recipes/pizza/pizza.jpg&
image[0][user_generated]=true&
access_token=VALID_ACCESS_TOKEN
The above will only work with http and not with https. If you use https, you will get an error that says:
Attached image () failed to upload
Don't forget to refresh Facebook's cache through the Facebook Debugger, and click on "Collect new info".
Had a similar problem today, which the Sharing Debugger helped me solve. It seems that Facebook can’t (currently) understand images with XMP metadata embedded. When I replaced the images on our articles with versions without XMP metadata, and re-scraped the page (using the Sharing Debugger), the problem went away. A hex editor will help you see whether or not your image contains XMP metadata.
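If you would rather not inspect files by hand, one possible approach (assuming the ExifTool utility is installed; shown only as a sketch) is to list and strip the XMP block before re-scraping:
# list any XMP tags embedded in the image
exiftool -XMP:All image.jpg
# delete the whole XMP block (ExifTool keeps a backup as image.jpg_original)
exiftool -XMP:All= image.jpg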
OK... I realize this thread is old and overcrowded, but in case someone comes in like I did struggling to get their og:image tag to work right in Facebook, here's the trick that worked for me:
do NOT use this link:
https://developers.facebook.com/tools/debug/sharing/?q=https%3A%2F%2Fwww.google.com
to work through your problem. Or if you do, immediately scroll down to the bottom and click on Scrape VIA API.
https://developers.facebook.com/tools/explorer/?method=POST&path=%3Fscrape%3Dtrue%26id%3Dhttps%3A%2F%2Fwww.google.com&version=v5.0
There are errors displayed in the explorer tool that are NOT shown in the "debug" tool. Maddening!!! (in my case, a space in the image filename knocked my image out silently in the debug tool, but it showed the error in the explorer tool).
I came across another reason for og images not displaying on FB cards. Using the FB scraper tool to debug the og meta tags, I could confirm all the required tags were present in my WordPress page, yet I would get the following file download error:
Provided og:image, < https-link-to-jpg-image > could not be downloaded. This can happen due to several different reasons such as your server using unsupported content-encoding. The crawler accepts deflate and gzip content encodings.
I had a vague feeling that the image format had an issue; the link to the image was working, but the message seemed to indicate something amiss with the content encoding.
After much searching, I ended up looking at the PHP extensions required for a WordPress server and realised the php-exif module was not installed. The exif module handles exif metadata for uploaded images. As a result, the images used in the FB og:image tag did not have any exif metadata associated with them.
Once the exif module was enabled, WordPress allowed the exif metadata to be reset for an image (Media library -> select an image -> Edit more details -> Map exif metadata) and the image then appeared on the FB card as expected.
In my case, it seems that the crawler is just having a bug. I've tried:
Changing links to http only
Removing end white space
Switching back to http completely
Reinstalling the website
Installing a bunch of OG plugins (I use WordPress)
Suspecting the server has a weird misconfiguration that blocks the bots (because all the OG checkers are unable to fetch tags, and other requests to my sites are unstable)
None of these worked. This cost me a week. And then, suddenly, out of nowhere, it seemed to work again.
Here is my research, in case someone meets this problem again:
What makes Open Graph checkers unable to detect Open Graph data?
How to know what bots of a website, if I have no root access to the hosting they will read?
👍 What makes Open Graph checkers unable to detect Open Graph data? - Let's Encrypt Community Support
👍 Crawler is unable to fetch images, but adding a brand new, unique query string can make it work for one first time - Facebook for Developers
Also, there are more checkers besides Facebook's Object Debugger: OpenGraphCheck.com, Abhinay Rathore's Open Graph Tester, Iframely's Embed Codes, Card Validator | Twitter Developers.
I arrived here because an updated facebook meta tag image wasn't showing on facebook shares.
For anyone else in this predicament, the reason was simply that you need to ask facebook to scrape your site again.
Once you do that, it will appear as expected.
I'm using CloudFront distributions pointing to an S3 bucket to serve static images... my CloudFront origins are set to redirect HTTP to HTTPS... so maybe that has something to do with it?
Regardless...
Updating og:image from https to http resolved the issue for me; images are now being posted to Facebook posts with links to my site.
UPDATE: the above behavior continued to happen... any time I changed the og:image URL or invalidated my CloudFront cache, the image would work in the FB debugger, but it would never show up on FB.
I added a new behavior for my og:image endpoint and set the min TTL, max TTL, and default TTL to 0. Now everything is working great... not ideal, as I'd prefer it to be cached, but apparently FB can't handle the CloudFront 304 response?
I had the same issue and the cause was the minimum TLS version specified in Cloudflare:
If I set minimum TLS to 1.3 - no meta images. If I set it to 1.2 or lower - meta images appear.
It seems that social media previews don't support TLS 1.3, hence the issue. For the record, I have no og:image:secure_url and have HTTP redirected to HTTPS. The site is completely not accessible via HTTP. Only the TLS version was causing trouble.
I struggled to find an answer to this and was getting this puzzling error from LinkedIn:
We encountered an SSL connection error while trying to access the URL.
Please check that the site is using a prime size that is compatible
with Java 8, or contact Support with the content's URL.
The answer was, even though I had both TLSv1.2 and TLSv1.3 enabled in nginx, TLSv1.2 wasn't available due to my cipher list as verified by this checker. It appears facebook and linkedin both use TLSv1.2 to generate previews (as of Nov 2022).
I had to update nginx to the following according to the first answer on this post:
ssl_protocols TLSv1.3 TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ciphers "EECDH+AESGCM,EDH+AESGCM";
If your image links look like this:
"https://someurl Wed Sep 14 2022 05:59:25 GMT+0000 (Coordinated Universal Time).jpg"
Then make sure that you are using encodeURI function (JavaScript) or any similar function in other languages while setting the URL.
This will help you to create a valid URL which can be understood by og:image.
{
'og:title': "title",
'og:description': "description",
'og:image': encodeURI(image),
'og:image:secure_url': encodeURI(image),
}
From what I observed, when your website is public it just works fine, even though the image URL is HTTPS.
For me this worked:
<meta property="og:url" content="http://yoursiteurl" />
<meta property="og:image" content="link_to_first_image_if_you_want" />
<meta property="og:image" content="link_to_second_image_if_you_want" />
<meta property="og:image:type" content="image/jpeg" />
<meta property="og:image:width" content="400" />
<meta property="og:image:height" content="300" />
<meta property="og:title" content="your title" />
<meta property="og:description" content="your text about homepage"/>
After several hours of testing and trying things...
I solved this problem as simply as possible.
I noticed that they use "test pages" inside the Facebook Developers page that contain only the "og" tags and some text in the body tag that refers to these og tags.
So what did I do?
I created a second view in my application containing the same things they use.
And how do I know it is Facebook accessing my page, so that I can switch to that view? They have a unique user agent: "facebookexternalhit/1.1"
Once you update the meta tag, make sure the content (image) link is an absolute path, then go to https://developers.facebook.com/tools/debug/sharing, enter your site link, and click "Scrape Again" on the next page.

facebook application FBML: error code 405 when changing app from Iframe to FBML

I'm trying to write a simple facebook application using FBML.
When I configure my application to work as an IFrame and view the source,
I see the following:
<html>
<head>
</head>
<body>
<fb:swf
swfbgcolor="000000"
imgstyle="border-width:3px; border-color:white;"
swfsrc='http://url/file.swf'
width='340' height='270' />
</body>
</html>
When I change my application to be an FBML application I get the following error:
Application Temporarily Unavailable
Received HTTP error code 405 while loading http://xpofb.xpogames.com:5080/xpogame-servlet/Canvas?
Sorry, the application you were using is experiencing a problem. Please try again later.
Any ideas?
Try putting just plain text in the FBML so you can be sure that it is not some FBML tag that is causing the problem.
Facebook may act funny if your Canvas URL doesn't end with a slash. Try to map your controller to a directory and try again.
Also FBML apps don't support html, head, body tags, so that would be your next error message.
Welp... after a lot of research on the internet, I read that some people added rewrite rules so that /foo would become /foo.html, and that would work.
When I tried to add rewrite rules in my case, it didn't resolve the issue.
Moving from a servlet to a JSP page did resolve the issue:
The servlet was at url/Canvas.
Rewriting it to url/Canvas.html did not resolve the issue.
Creating a new JSP file at url/canvas.jsp resolved the issue.