Facebook can't pick up my og:image files and I have tried every usual solution. I'm beginning to think it might have something to do with HTTPS...
I have checked http://developers.facebook.com/tools/debug and have zero warnings or errors.
It is finding the images we linked to in og:image, but they're showing up blank. When we click the image(s), however, they DO exist and it takes us straight to them.
It DOES show one image -- an image hosted on a non-https server.
We've tried square images, JPEGs, PNGs, larger sizes and smaller sizes. We've put the images right in public_html. None of them show up.
It's not a caching error, because when we add another og:image to the meta, FB's linter does find and read that image. It DOES show a preview, but the preview is blank. The only images that do render are ones not hosted on this website.
We thought maybe there was some anti-leech protection in cPanel or the .htaccess that was preventing the images from showing up, so we checked. There was not. We even did a quick <img src="[remote file]"> on an entirely different server, and the image shows up fine.
We thought maybe it was the og:type or some oddity with another meta tag. We removed them all, one at a time, and checked after each. No change, just warnings.
The same code on a different website shows up without any issue.
We thought maybe it was not pulling images because we're using the same product page for multiple products (changing it based on the GET value, i.e., "details.php?id=xxx"), but it's still pulling in one image (from a different URL).
If we leave og:image and image_src off entirely, FB does not find any images.
I am at the end of my rope. If I said how much time I and others have spent on this, you'd be shocked. The issue is that this is an online store. We absolutely, positively must have images. We have ten or so other sites... This is the only one with og:image problems. It's also the only one on HTTPS, so we thought maybe that was the problem. But we can't find any precedent anywhere on the web for that.
These are the meta-tags:
<meta property="og:title" content="[The product name]" />
<meta property="og:description" content="[the product description]" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-art-black.png" />
<meta property="og:image" content="http://www.[ADIFFERENTwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/ARShopHeader.png" />
<meta property="og:image" content="http://www.[ourwebsite].com/overdriven-blues-music-tshirt-art-black.JPG" />
<meta property="og:type" content="product"/>
<meta property="og:url" content="https://www.[ourwebsite].com/apparel-details.php?i=10047" />
<meta property="og:site_name" content="[our site name]" />
<meta property="fb:admins" content="[FB-USER-ID-NUMBER]"/>
<meta name="title" content="[The product name]" />
<meta name="description" content="[The product description]" />
<link rel="image_src" href="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta name="keywords" content="[four typical keywords]">
<meta name="robots" content="noarchive">
In case you want it, here's a link to one of our product pages that we've been working on. [Link shortened to try to curb this getting into search results for our site]: http://rockn.ro/114
EDIT ----
Using the "see what facebook sees" scraper tool, we were able to see the following:
"image": [
{
"url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-details-safari.png"
},
{
"url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-art-safari.png"
},
{
"url": "http://www.[theotherNONSECUREwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png"
}
],
We tested all links it found for a single page. All were perfectly valid images.
EDIT 2 ----
We tried a test and added a subdomain to the NONSECURE website (the one whose images actually are visible through Facebook). The subdomain was http://img.[nonsecuresite].com. We then put all the images into that subdomain's folder and referenced them from there. FB would not pull those images in. However, it would still pull any images referenced on the nonsecure main domain.
POSTED WORKAROUND ----
Thanks to Keegan, we now know that this is a bug in Facebook. As a workaround, we created a subdomain on a different NON-HTTPS website and dumped all the images into it. On each product page we referenced the corresponding http://img.otherdomain.com/[like-image.jpg] image in og:image. We then had to run EVERY link through the FB Linter to refresh the OG data. This worked, but the solution is a band-aid workaround: if the HTTPS issue is fixed and we go back to using our natural HTTPS domain, FB will have cached the images from a different website, complicating matters. Hopefully this information helps save someone else from losing 32 coding hours of their life.
Some properties can have extra metadata attached to them. These are specified in the same way as other metadata, with property and content, but the property name will contain an extra colon (:).
The og:image property has some optional structured properties:
og:image:url - Identical to og:image.
og:image:secure_url - An alternate URL to use if the webpage requires HTTPS.
og:image:type - A MIME type for this image.
og:image:width - The number of pixels wide.
og:image:height - The number of pixels high.
A full image example:
<meta property="og:image" content="http://example.com/ogp.jpg" />
<meta property="og:image:secure_url" content="https://secure.example.com/ogp.jpg" />
<meta property="og:image:type" content="image/jpeg" />
<meta property="og:image:width" content="400" />
<meta property="og:image:height" content="300" />
So for your HTTPS URLs you need to change the og:image property to og:image:secure_url.
Ex:
HTTPS META TAG FOR IMAGE:
<meta property="og:image:secure_url" content="https://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
HTTP META TAG FOR IMAGE:
<meta property="og:image" content="http://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
Source: http://ogp.me/#structured <-- You can visit this site for more information.
EDIT: Don't forget to ping facebook servers after updating your code - URL Linter
I ran into the same problem and reported it as a bug on the Facebook developer site. It seems pretty clear that og:image URIs using HTTP work just fine and URIs using HTTPS do not. They have now acknowledged that they are "looking into this."
Update: As of 2020, the bug is no longer visible in Facebook's ticket system. They never responded, and I don't believe this behavior has changed. However, specifying an HTTPS URI in og:image:secure_url does seem to be working fine.
I don't know if it's only me, but og:image did not work for me - Facebook picked my site logo instead, even though the Facebook debugger showed the correct image.
But changing og:image to og:image:url worked for me. Hope this helps anybody else facing a similar issue.
tl;dr – be patient
I ended up here because I was seeing blank images served from an HTTPS site. The problem was quite a different one, though:
When content is shared for the first time, the Facebook crawler will scrape and cache the metadata from the URL shared. The crawler has to see an image at least once before it can be rendered. This means that the first person who shares a piece of content won't see a rendered image.
[https://developers.facebook.com/docs/sharing/best-practices/#precaching]
While testing, it took facebook around 10 minutes to finally show the rendered image. So while I was scratching my head and throwing random og tags at facebook (and suspecting the https problem mentioned here), all I had to do was wait.
As this might really stop people from sharing your links for the first time, FB suggests two ways to circumvent this behavior:
a) running the OG Debugger on all your links: the image will be cached and ready for sharing after ~10 minutes, or b) specifying og:image:width and og:image:height, as in the sketch below. (Read more in the above link)
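For option b), a minimal sketch - the URL here is a placeholder, and the dimensions are simply Facebook's recommended 1200 x 630:
<meta property="og:image" content="https://www.example.com/images/product.png" />
<!-- declaring dimensions up front lets the crawler render the image on first share -->
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />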
Still wondering though what takes them so long ...
Got here from Google, but this wasn't much help for me. It turned out that there is a maximum aspect ratio of 3:1 allowed for the logo. Mine was almost 4:1. I used Gimp to crop it to exactly 3:1 and voila - my logo is now shown on FB.
I had similar problems. I removed the og:image:secure_url property and now it will scrape with just og:image. Sometimes, less is more.
I had the same error and none of the previous suggestions helped, so I tried following the original documentation of the Open Graph protocol: I added the prefix attribute to my html tag, and everything became awesome.
<html prefix="og: http://ogp.me/ns#">
As I accidentally found, the transparent blank image comes with a response header indicating the possible cause of the problem.
1. Go to the debugger at https://developers.facebook.com/tools/debug/og/object/
2. Put in your URL.
3. At the bottom, Facebook shows your "image" (a transparent 1x1 GIF).
4. The image is linked to your original image - there's no point pressing it.
5. Right-click and view the image (you'll get something like https://external-ams3-1.xx.fbcdn.net/safe_image.php?d=...&url=...)
6. Turn on the Net tab in Firebug/developer tools, refreshing the page if needed.
7. You'll get an x-error-detail response header with the explanation.
For example, in my case it was Invalid image extension for URL: https://[mydomain]/[myfilename].jpg
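If you'd rather pull that header programmatically than through the Net tab, here is a hedged PHP sketch; the safe_image.php URL is a placeholder you would copy from the debugger:
<?php
// Fetch the response headers Facebook's safe_image.php returns for your image
// and print the x-error-detail one, if present.
$url = 'https://external-ams3-1.xx.fbcdn.net/safe_image.php?d=...&url=...'; // placeholder
foreach (get_headers($url) as $header) {
    if (stripos($header, 'x-error-detail') === 0) {
        echo $header, PHP_EOL;
    }
}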
The real issue in my case was related to prerender.io.
As it turns out, if an image is requested via prerender, it's converted to HTML, something like this:
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" "http://www.w3.org/TR/REC-html40/loose.dtd">
<html>
<head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head>
<body style="margin: 0px;"><img style="-webkit-user-select: none; cursor: -webkit-zoom-in; " src="https://[yourdomain].com/[yourfilename].jpg" width="1078" height="718"></body>
</html>
It's either a bug in prerender itself, or it's something your proxy is supposed to be configured for: *.jpg requests should not go through prerender, even when they are requested by the Facebook bot (one such proxy rule is sketched below). It's really hard to notice this, as prerender is used only for certain User-Agent headers.
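Purely as an illustration - not from the original answer - a typical nginx arrangement serves image extensions straight from disk so they never reach the prerender proxy:
# Hypothetical nginx rule: serve images directly instead of proxying them
# to prerender, even when the requester is facebookexternalhit.
location ~* \.(png|jpe?g|gif)$ {
    try_files $uri =404;
}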
In my case the problem was related to not providing the CA root certificate. I figured it out after using https://www.ssllabs.com/ssltest/analyze.html to analyze the SSL configuration.
I took the http:// out of my og:image and replaced it with just plain old www., and then it started working fine.
You can use this tool by Facebook to reset your image scrape cache and test which URL it is pulling for the demo image.
I have a Wordpress site that uses og:image with an https URL to the image and the images show up just fine in Facebook preview links.
I have another site I was working on that uses og:image with an HTTPS URL, and sometimes the images would appear and sometimes they wouldn't. I tried the suggestions on this page, using og:image:url and og:image:secure_url, and neither one made any difference; the image wouldn't be used for the preview.
Both sites have valid https certificates, so it wasn't a certificate problem.
After searching some more I found out that Facebook has a MINIMUM SIZE for images. If the og:image is less than 200x200px it will not be used by Facebook. The recommended size is 600x600px for stories and 1200x630px for everything else.
I scaled up the image sizes on my second site and they started appearing on Facebook. Mystery solved.
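If you want to catch undersized images before they go live, here is a hedged PHP sketch; the path is a placeholder, and the 200x200 floor is the minimum quoted above:
<?php
// Pre-flight check: Facebook ignores og:image files smaller than 200x200px.
list($width, $height) = getimagesize('images/product.png'); // placeholder path
if ($width < 200 || $height < 200) {
    echo "Too small for Facebook previews: {$width}x{$height}px\n";
}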
Hope you find this useful.
I discovered another scenario that can cause this issue. I went through all the steps described in the question and the answers, still the problem remained.
I checked my images and found that some of my posts had overly large thumbnail images in og:image, in the range of several thousand pixels and several megabytes.
This happened due to a recent migration from WP to Jekyll: I optimized my images with gulp but, by mistake, used the original images in og:image.
Facebook gives us the following recommendations as of today:
Use images that are at least 1200 x 630 pixels for the best display on high resolution devices. At the minimum, you should use images that are 600 x 315 pixels to display link page posts with larger images. Images can be up to 8MB in size.
So there is an upper limit of 8MB.
I ran into the same issue, and then I noticed that I had a different domain for og:url.
Once I made sure that the domain was the same for og:url and og:image, it worked.
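For illustration, a hypothetical pair of matching tags (example.com stands in for the real domain):
<meta property="og:url" content="https://www.example.com/apparel-details.php?i=10047" />
<meta property="og:image" content="https://www.example.com/images/product.png" />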
Hope this helps.
Similar symptoms (Facebook et al not correctly fetching og:image and other assets over https) can occur when the site's https certificate is not fully compliant.
Your site's https cert may seem valid (green key in the browser and all), but it will not scrape correctly if it's missing an intermediate or chain certificate. This can lead to many wasted hours checking and rechecking all the various caches and meta tags.
Might not have been your problem, but it could be for others with similar symptoms (like mine). There are many ways to check your cert - the one I happened to use: https://www.sslshopper.com/ssl-checker.html
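If the chain turns out to be incomplete and you happen to be on nginx, the usual fix is to point ssl_certificate at the full chain rather than the leaf certificate alone. A sketch assuming a typical Let's Encrypt layout (the paths are assumptions):
# Serve the leaf certificate plus intermediates, not just the leaf.
ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;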
I can see that the Debugger is retrieving 4 og:image tags from your URL.
The first image is the largest and therefore takes longest to load.
Try shrinking that first image down, or change the order to show a smaller image first.
In addition, this problem also occurs when you add a user generated story (where you do not use og:image). For example:
POST /me/cookbook:eat?
recipe=http://www.example.com/recipes/pizza/&
image[0][url]=http://www.example.com/recipes/pizza/pizza.jpg&
image[0][user_generated]=true&
access_token=VALID_ACCESS_TOKEN
The above will only work with http and not with https. If you use https, you will get an error that says:
Attached image () failed to upload
Don't forget to refresh Facebook's servers through the Facebook Debugger, and click on "Collect new info".
Had a similar problem today, which the Sharing Debugger helped me solve. It seems that Facebook can’t (currently) understand images with XMP metadata embedded. When I replaced the images on our articles with versions without XMP metadata, and re-scraped the page (using the Sharing Debugger), the problem went away. A hex editor will help you see whether or not your image contains XMP metadata.
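As a programmatic alternative to a hex editor, here is a hedged PHP sketch (the filename is a placeholder): in JPEGs, XMP lives in an APP1 segment that begins with the namespace URI below, so a simple substring scan reveals its presence.
<?php
// Scan a JPEG for the XMP namespace marker; if it's present, the file
// carries XMP metadata that Facebook's crawler may fail to understand.
$data = file_get_contents('article-image.jpg'); // placeholder filename
if (strpos($data, 'http://ns.adobe.com/xap/1.0/') !== false) {
    echo "XMP metadata found - re-export the image without it\n";
}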
OK... I realize this thread is old and overcrowded, but in case someone comes in like I did struggling to get their og:image tag to work right in Facebook, here's the trick that worked for me:
do NOT use this link:
https://developers.facebook.com/tools/debug/sharing/?q=https%3A%2F%2Fwww.google.com
to work through your problem. Or if you do, immediately scroll down to the bottom and click on Scrape VIA API.
https://developers.facebook.com/tools/explorer/?method=POST&path=%3Fscrape%3Dtrue%26id%3Dhttps%3A%2F%2Fwww.google.com&version=v5.0
There are errors displayed in the explorer tool that are NOT shown in the "debug" tool. Maddening!!! (in my case, a space in the image filename knocked my image out silently in the debug tool, but it showed the error in the explorer tool).
I came across another reason for og images not displaying on FB cards. Using the FB scraper tool to debug the og meta tags, I could confirm all the required tags were present in my WordPress page, and yet I would get the following file download error:
Provided og:image, < https-link-to-jpg-image > could not be downloaded. This can happen due to several different reasons such as your server using unsupported content-encoding. The crawler accepts deflate and gzip content encodings.
I had a vague feeling that the image format had an issue; the link to the image was working, but the message seemed to indicate something amiss with the content-encoding.
After much searching, I ended up looking at the PHP extensions that are required for a WordPress server, and realised the php-exif module was not installed. The exif module writes exif metadata to all uploaded images. As a result, the images used in the FB og:image tag did not have any exif metadata associated with them.
Once the exif module was enabled, WordPress allowed the exif metadata to be reset for an image (Media library -> select an image -> Edit more details -> Map exif metadata), and the image now appeared on the FB card as expected.
In my case, it seems that the crawler is just having a bug. I've tried:
Changing links to http only
Removing end white space
Switching back to http completely
Reinstalling the website
Installing a bunch of OG plugins (I use WordPress)
Suspecting the server has a weird misconfiguration that blocks the bots (because all the OG checkers are unable to fetch tags, and other requests to my sites are unstable)
None of these worked. This cost me a week. And suddenly, out of nowhere, it seems to work again.
Here is my research, in case someone meets this problem again:
What makes Open Graph checkers unable to detect Open Graph data?
How to know which bots will read a website, if I have no root access to the hosting?
👍 What makes Open Graph checkers unable to detect Open Graph data? - Let's Encrypt Community Support
👍 Crawler is unable to fetch images, but adding a brand new, unique query string can make it work for one first time - Facebook for Developers
Also, there are more checkers besides Facebook's Object Debugger for you to try: OpenGraphCheck.com, Abhinay Rathore's Open Graph Tester, Iframely's Embed Codes, Card Validator | Twitter Developers.
I arrived here because an updated facebook meta tag image wasn't showing on facebook shares.
For anyone else in this predicament, the reason was simply that you need to ask facebook to scrape your site again.
Once you do that, it will appear as expected.
I'm using CloudFront distributions pointing to an S3 bucket to serve static images... my CloudFront origins are set to redirect HTTP to HTTPS... so maybe that has something to do with it?
regardless...
Updating og:image from https to http resolved the issue for me, images are now being posted to facebook posts with links to my site.
UPDATE: the above behavior continued to happen... anytime I changed the og:image URL or invalidated my CloudFront cache, the image would work in the FB debugger, but it would never show up on FB.
I added a new behavior for my og:image endpoint and set min TTL, max TTL, and default TTL to 0. Now everything is working great... not ideal, as I'd prefer it to be cached, but apparently FB can't handle the CloudFront 304 response?
I had the same issue and the cause was the minimum TLS version specified in Cloudflare:
If I set minimum TLS to 1.3 - no meta images. If I set it to 1.2 or lower - meta images appear.
It seems that social media previews don't support TLS 1.3, hence the issue. For the record, I have no og:image:secure_url and have HTTP redirected to HTTPS. The site is completely not accessible via HTTP. Only the TLS version was causing trouble.
I struggled to find an answer to this and was getting this puzzling error from LinkedIn:
We encountered an SSL connection error while trying to access the URL. Please check that the site is using a prime size that is compatible with Java 8, or contact Support with the content's URL.
The answer was that even though I had both TLSv1.2 and TLSv1.3 enabled in nginx, TLSv1.2 wasn't actually available due to my cipher list, as verified by this checker. It appears Facebook and LinkedIn both use TLSv1.2 to generate previews (as of Nov 2022).
I had to update nginx to the following according to the first answer on this post:
ssl_protocols TLSv1.3 TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ciphers "EECDH+AESGCM,EDH+AESGCM";
If your image links look like this:
"https://someurl Wed Sep 14 2022 05:59:25 GMT+0000 (Coordinated Universal Time).jpg"
Then make sure that you are using the encodeURI function (JavaScript), or any similar function in other languages, while setting the URL.
This will help you to create a valid URL which can be understood by og:image.
{
  'og:title': "title",
  'og:description': "description",
  'og:image': encodeURI(image),
  'og:image:secure_url': encodeURI(image),
}
From what I observed, once your website is publicly accessible, it just works fine, even though the image URL is HTTPS.
For me this worked:
<meta property="og:url" content="http://yoursiteurl" />
<meta property="og:image" content="link_to_first_image_if_you_want" />
<meta property="og:image" content="link_to_second_image_if_you_want" />
<meta property="og:image:type" content="image/jpeg" />
<meta property="og:image:width" content="400" />
<meta property="og:image:height" content="300" />
<meta property="og:title" content="your title" />
<meta property="og:description" content="your text about homepage"/>
After several hours of testing and trying things...
I solved this problem as simply as possible.
I noticed that they use "test pages" inside the Facebook Developers page that contain only the og tags and some text in the body tag that refers to those og tags.
So what did I do?
I created a second view in my application, containing the same things they use.
And how do I know it's Facebook accessing my page, so I can switch the view? They have a unique user agent: "facebookexternalhit/1.1"
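A minimal PHP sketch of that idea; the included filename is an assumption, not from the original answer:
<?php
// Serve a stripped-down page (og tags plus some short body text)
// whenever the request comes from Facebook's crawler.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (strpos($ua, 'facebookexternalhit') !== false) {
    include 'og-view.php'; // hypothetical second view with only the og tags
    exit;
}
// ...otherwise fall through and render the normal page.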
Once you update the meta tags, make sure the content (image) link is an absolute path. Then go to https://developers.facebook.com/tools/debug/sharing, enter your site's link, and click on "Scrape Again" on the next page.
I am working on a project in which I have to search for terms on a search engine and then cluster the results by their contextual sense, so I have to treat each result as a document. Unfortunately, the data present along with each result on the results page is too little for clustering. Hence, I wanted to know where search engines get the abstract for each result that they show. If I could get that entire abstract, then I could cluster the results by treating them as separate documents.
Where does Google get the abstract?
For eg: If you search for "1000 Mile" on google, the second result shows the following abstract:
"The women's 1000 Mile Collection is based on classic designs and reflects Wolverine's long heritage of crafting quality footwear. Complementing these classics ..."
This abstract is not present in the meta tags of the page.
Where does Google find this data?
Thanks
From Does Google use the Meta Description Tag for Description of Page?
Google will choose your search results snippets from the following places (not necessarily in this order):
The page's Meta Description tag
The page's Open Directory Project (ODP) Listing
Page content relevant to the search query
If you do not want Google to use the ODP listing's description then you can tell them not to do so with the following Meta tag:
<meta name="robots" content="NOODP">
If you want to encourage Google to use your Meta Description tag then make sure it is unique to each page. Also make sure it contains an accurate description of the page's content.
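For example, a hypothetical tag for the page discussed above; the wording is adapted from the quoted snippet, not taken from the actual site:
<meta name="description" content="The women's 1000 Mile Collection is based on classic designs and reflects Wolverine's long heritage of crafting quality footwear.">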
In the absence of an ODP description and Meta Description tag, Google will use a portion of the page's text as the description. This text will contain the closest matches to the search query. I have not seen any official limit to how long this can be, but a couple of sentences seems about right.
On a related note, if you don't want a snippet to be shown with a particular page you can use the following Meta tag to prevent one from being shown:
<meta name="robots" content="nosnippet">
See this blog post for Google's tips on using the meta description tag.
According to this site, "The meta description should typically be at most 145 to 150 characters in length as these are the maximum number of characters typically displayed at Yahoo! and Google, respectively."
That site is Flash-based, and Google can index Flash content, so given that the snippet isn't in the HTML source of the page as you point out, nor is it in the cached version of the page, I'm guessing that it's somewhere in the Flash movie.
It's kind of arbitrary that the snippet mentions 'The women's 1000 Mile Collection' while the site link itself is to the parent category of 1000 mile, not just women's, so I'm guessing here that gathering snippet-friendly metadata from a Flash site is an imprecise science. That's my best guess.
In this Google Webmaster blog post, they explain how they use external text or HTML files loaded into the Flash movie, and in one of the comments Jonathan Simon says (sorry):
"We try our best to crawl Flash content but the results can sometimes be less than ideal. You are only seeing a title in the search results for your site because that's the only bit of HTML text that you have outside of your Flash content. You could add a Meta description element to offer more information in HTML. You could also add some other text that's not a part of your Flash content. Just doing this should improve the snippet you see associated with your site in the search results."