og:locale and og:locale:alternate - what are the fundamentals? They seem to be important - facebook

Google definitely uses these meta tags. I tested a page here, and once they were added the testing tool picked up the data, so the og (Open Graph) tags appear to be very important for Google search.
Therefore we need to understand them thoroughly. However, visiting ogp.me and loading their specification page produces a blank page (what's going on there, I wonder?).
Their single info webpage tells us:
The following properties are optional for any object and are generally recommended:
og:locale - The locale these tags are marked up in.
Of the format language_TERRITORY. Default is en_US.
og:locale:alternate - An array of other locales this page is available in.
They give us an example:
<meta property="og:locale" content="en_GB" />
<meta property="og:locale:alternate" content="fr_FR" />
<meta property="og:locale:alternate" content="es_ES" />
The theory on the web is that 'locale' determines the region the document applies to (is available in), and 'alternate' lists additional regions the document applies to.
However, for a great number of sites, this seems to be a disaster in waiting.
One chap on a Bing SEO forum claimed that setting 'locale' to anything other than en_US would kill your document's distribution globally; but what, then, of French, Spanish, German, Japanese and so on?
My site has translation enabled and its content is genuinely applicable to a global audience.
Am I to list the entire array of languages, and by doing so risk being accused of 'spamming regions'?
We have no in-depth explanation of these tags, yet they seem too important to guess at.
Does anybody have knowledge of the fundamentals, so that we can code our sites correctly?

These tags are for Facebook only and have no effect on Google, which has its own recommended ways of handling locale and languages:
http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
http://googlewebmastercentral.blogspot.com/2010/03/working-with-multi-regional-websites.html
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192
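For reference, Google's approach relies on rel="alternate" hreflang link elements rather than og:locale. A minimal sketch, assuming an English page with a French alternate (example.com and the paths are placeholders):
<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="fr" href="http://example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
Each language version should carry the full set of annotations, including a reference to itself.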

Related

Schema.org and alternate languages

I have Schema.org markup on my website, but I also have an alternate language: each of my pages has a French version on a separate page with proper hreflang tags.
Google's instructions don't really mention different languages, and neither does Schema.org. For example, I have an "Organization" schema set up on the homepage. Do I need to translate it on the French homepage or leave it in English, and if so, do I change the URL to point to the French homepage as well? Wouldn't this cause Google to think there are two different organizations? The same question would apply to schemas like "Product".
hreflang is not directly related to schema.org (that's why you didn't find any references on Google or schema.org).
Schema.org is a set of extensible schemas that enables webmasters to
embed structured data on their web pages for use by search engines and
other applications. https://schema.org/
VS
Hreflang specifies the language and optional geographic restrictions
for a document. Hreflang - Google Support. The hreflang attribute on each page should include a reference to itself as well as to all the pages that serve as alternates for it https://moz.com/learn/seo/hreflang-tag.
Example with two pages
The example below uses microdata (the same idea applies to JSON-LD, and to any other schema type).
Your English version
/en/about
<div itemscope itemtype="http://schema.org/LocalBusiness">
<h1><span itemprop="name">Hello World</span></h1>
<p itemprop="description">A superb collection of fine gifts and clothing</p>
</div>
hreflang:
<link rel="alternate" href="http://example.com/en/about" hreflang="en" />
<link rel="alternate" href="http://example.com/fr/about" hreflang="fr-fr" />
Your French version
/fr/about
<div itemscope itemtype="http://schema.org/LocalBusiness">
<h1><span itemprop="name">Bonjour le monde</span></h1>
<p itemprop="description">Une superbe collection de beaux cadeaux et vêtements</p>
</div>
hreflang:
<link rel="alternate" href="http://example.com/en/about" hreflang="en" />
<link rel="alternate" href="http://example.com/fr/about" hreflang="fr-fr" />
itemprop="name" above give extra semantic data about your LocalBusiness - each page use another language (Specify by Hreflang).
One of google guideline is:
Don't mark up content that is not visible to readers of the page. For
example, if the JSON-LD markup describes a performer, the HTML body
should describe that same performer. https://developers.google.com/search/docs/guides/sd-policies
There is no official Google answer on this topic, but it's better to translate the JSON-LD data as well. With WordPress or another CMS it should be easy to pull the translated data.
In any case, JSON-LD is not related to site indexing (unlike hreflang or canonical). There is no need to change a URL because of a schema. You can find reports (status/errors/rich results) about your schema in Google Search Console - docs here.
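A minimal JSON-LD sketch of the same two pages, with the text translated per page (the names, descriptions and URLs mirror the microdata example above and are placeholders):
English version (/en/about):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Hello World",
  "description": "A superb collection of fine gifts and clothing",
  "url": "http://example.com/en/about"
}
</script>
French version (/fr/about):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Bonjour le monde",
  "description": "Une superbe collection de beaux cadeaux et vêtements",
  "url": "http://example.com/fr/about"
}
</script>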
Live example (from the Nike site): both the English and the French schema can be previewed as rich cards in Google's Structured Data Testing Tool.
Following Google's structured data guidelines requires:
Relevance
Your structured data should be a true representation of the
page content.
as well as:
Location
Put the structured data on the page that it describes, unless
specified otherwise by the documentation. If you have duplicate pages
for the same content, we recommend placing the same structured data on
all page duplicates, not just on the canonical page.
Thus, if the information on your home page is duplicated on a separate French page, the structured data on that page MUST be set in French as well.
This is completely justified in terms of semantics. Google uses structured data to look up entities via the Google Knowledge Graph API, for rich search results, for voice search and for machine learning. Users searching the web in French expect, and will receive, search results in French.

Facebook forcing Chinese characters in shares

I'm working on a site, and a good portion of the URLs, but not all, are showing Chinese characters in the URL description for Facebook shares (this doesn't appear to happen with any other social media shares).
I've gone through everything I can find to help declare English as the site's language.
HTML & Open Graph:
<html lang="en" hreflang="en-us" >
<meta property="og:locale" content="en_US" />
Facebook Script call includes English:
js.src = "//connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.8&appId=161005447317900";
However, when I use either the Share button built into the page (part way down on the right) or paste the page's URL into Facebook, it converts/interprets my characters into Chinese.
Example Screenshot:
Not sure it's worth noting, but I have attempted to change my charset from UTF-8 to UTF-16, and that didn't change anything.
Here a page on the live site with the problem.
In my case the problem was being caused by a <meta charset="UTF16"> declaration in the <head>.
I had actually suspected that after finding others mentioning it as a possibility in other Stack Overflow questions. However, I didn't catch it at first because Facebook's servers seem to keep cached responses/versions, so when you change it to utf-8, for example, the link will sometimes still display the same results.
I decided to change it to utf-8, let it sit for a couple of days and then come back and check - that did the trick.
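For reference, a minimal sketch of the corrected declaration, placed in the <head> (the charset value is the only part that mattered here):
<meta charset="utf-8">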

Google Rich Snippets .... for a recipe

I'm having a bit of a problem getting rich snippets shown properly in the testing tool for my site (I understand that Google can take some time, and makes its own decisions, as to whether a result shows up on the actual site).
Here's an example Google Rich Snippets Result for one of my pages:
http://www.google.com/webmasters/tools/richsnippets?url=http%3A%2F%2Fwww.makemeacocktail.com%2Fcocktail%2F6741%2Fcosmopolitan%2F
Which is for this url:
http://www.makemeacocktail.com/cocktail/6741/cosmopolitan/
Everything looks good - but for some reason no image is shown with the example result. I'm a bit confused here. I have the correct meta tag in place:
<meta itemprop="image" content="http://images.makemeacocktail.com/cocktails/6741/cosmo_4.jpg" class="photo" />
But no image is shown in the testing tool result?
As an aside: is there anything else that needs including? It does seem to have everything that is required, but I can't seem to get the image shown properly.
Not entirely true, but it did lead me to the solution. For completeness, and in case anyone wants to know a little more:
I was using:
itemscope itemtype="http://schema.org/Recipe"
Whereas Google, in their example pages:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=173379
Uses:
itemscope itemtype="http://data-vocabulary.org/Recipe"
Note the different itemtypes. image was the correct itemprop for schema.org (not for data-vocabulary.org). If I changed my meta tag from:
<meta itemprop="image" content="http://images.makemeacocktail.com/cocktails/6741/cosmo_4.jpg" class="photo" />
to
<meta itemprop="photo" content="http://images.makemeacocktail.com/cocktails/6741/cosmo_4.jpg" class="photo" />
Google threw up errors saying that itemprop="photo" wasn't recognised in schema.org.
The actual reason my page wasn't showing a photo in the search results was that I had also used the hRecipe microformat alongside the schema.org microdata.
My markup for the page was correct, but the class="photo" that the hRecipe microformat needs was on a meta tag, whereas it turns out you actually need class="photo" on an actual 'img' tag for Google to recognise it.
This also suggests that Google reads microdata before microformats, and it gets me wondering whether having both on one page is useful at all. That is, Google was not recognising my correct schema.org image meta tag while hRecipe was present and the photo class was not applied properly. So I wonder whether Google uses the microformat information at all when microdata is present.
A couple of open ended questions, but also the answer to the initial question. Hope that helps someone else in the future.
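For reference, a minimal sketch of the arrangement that ended up working, with the photo on a visible img element carrying both the schema.org itemprop and the hRecipe photo class (this is illustrative, not the exact page source):
<div itemscope itemtype="http://schema.org/Recipe" class="hrecipe">
<h1 itemprop="name" class="fn">Cosmopolitan</h1>
<img itemprop="image" class="photo" src="http://images.makemeacocktail.com/cocktails/6741/cosmo_4.jpg" alt="Cosmopolitan cocktail" />
</div>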
The image property in the recipe markup is called photo and not image:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=173379
Try replacing itemprop="image" with itemprop="photo" and you should be all set.

Translation not working with Preview-Stories in Open Graph Timeline-View

We are working on an app that has the locales German and English. Our app creates Open Graph actions like "[User] offered [Object] on [App]".
I have translated all our strings in Facebook's translation app, my Facebook account is set to German, and our app runs with the default locale German.
When we post to the graph, the unique og:url for the action/object links to a site with the following html:
<meta content='de_DE' property='og:locale'>
<meta content='de_DE' property='og:locale:alternate'>
<meta content='en_US' property='og:locale:alternate'>
Our app is able to serve the content in different languages when called with the locale-param, e.g. "?locale=en-US" or "?locale=de-DE".
I have seen correctly translated text in the ticker, so that seems to work. Also, the preview popups that come up from the ticker messages are correctly translated.
Now the problem: the aggregation preview in the Timeline always shows the English action types. But when I click into the aggregation, where I only see the app's "All time" aggregation, the translation is correct.
https://skitch.com/florian2/gse8h/the-problem
[Picture 1] So that's my problem: "searched" should be "sucht", and "offered" should be "bietet an".
https://skitch.com/florian2/gse1n/the-problem-4
[Picture 2] Here you can see a line from my activity log that should be in German, too. The second line is from the translation tool and looks like it should be the correct translation for the log line, right?
What am I doing wrong here?
As far as I understood it, the translation strings need to be approved somehow by the community through the translation app.
As I can see from the first screenshot, the translation in the second one is wrong. It should be something like
searched {flat1}
The aggregation does not have {name} and {Flatorious} in it.
Greets
P.S.: Hugo, if you are the dev or admin of the app, you do not need to approve the translations, only if a third party is translating.

FB OpenGraph og:image not pulling images (possibly https?)

Facebook cannot fetch my og:image files and I have tried every usual solution. I'm beginning to think it might have something to do with https://...
I have checked http://developers.facebook.com/tools/debug and have zero warnings or errors.
It is finding the images we linked to in the "og:image", but they're showing up blank. When we click the image(s), however, they DO exist and it takes us straight to them.
It DOES show one image -- an image hosted on a non-https server.
We've tried square images, jpegs, pngs, larger sizes and smaller sizes. We've put the images right in public_html. Zero are showing up.
It's not a caching error, because when we add another og:image to the meta, FB's linter does find and read that. It DOES show a preview. The preview is blank. The only exception we're getting is for images that are not on this website.
We thought maybe there was some anti-leach on cpanel or the .htaccess that was preventing the images from showing up, so we checked. There was not. We even did a quick < img src="[remote file]" > on an entirely different server and the image shows up fine.
We thought maybe it was the og:type or another oddity with another meta tag. We removed all of them, one at a time and checked it. No change. Just warnings.
The same code on a different website shows up without any issue.
We thought maybe it was not pulling images because we're using the same product page(s) for multiple products (changing it based on the get value, ie, "details.php?id=xxx") but it's still pulling in one image (from a different url).
If we leave og:image and image_src off entirely, FB does not find any images at all.
I am at the end of my rope. If I said how much time myself and others have spent on this, you'd be shocked. The issue is that this is an online store. We absolutely, positively cannot NOT have images. We have to. We have ten or so other sites... This is the only one with og:image problems. It's also the only one on https, so we thought maybe that was the problem. But we can't find any precedent anywhere on the web for that.
These are the meta-tags:
<meta property="og:title" content="[The product name]" />
<meta property="og:description" content="[the product description]" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-art-black.png" />
<meta property="og:image" content="http://www.[ADIFFERENTwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/ARShopHeader.png" />
<meta property="og:image" content="http://www.[ourwebsite].com/overdriven-blues-music-tshirt-art-black.JPG" />
<meta property="og:type" content="product"/>
<meta property="og:url" content="https://www.[ourwebsite].com/apparel-details.php?i=10047" />
<meta property="og:site_name" content="[our site name]" />
<meta property="fb:admins" content="[FB-USER-ID-NUMBER]"/>
<meta name="title" content="[The product name]" />
<meta name="description" content="[The product description]" />
<link rel="image_src" href="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta name="keywords" content="[four typical keywords]">
<meta name="robots" content="noarchive">
In case you want it, here's a link to one of our product pages that we've been working on. [Link shortened to try to curb this getting into search results for our site]: http://rockn.ro/114
EDIT ----
Using the "see what facebook sees" scraper tool, we were able to see the following:
"image": [
{
"url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-details-safari.png"
},
{
"url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-art-safari.png"
},
{
"url": "http://www.[theotherNONSECUREwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png"
}
],
We tested all links it found for a single page. All were perfectly valid images.
EDIT 2 ----
We tried a test and added a subdomain to the NONSECURE website (from which images are actually visible through facebook). Subdomain was http://img.[nonsecuresite].com. We then put all images into the main subdomain folder and referenced those. It would not pull those images into FB. However, it would still pull any images that were referenced on the nonsecure main domain.
POSTED WORKAROUND ----
Thanks to Keegan, we now know that this is a bug in Facebook. To workaround, we placed a subdomain in a different NON-HTTPS website and dumped all images in it. We referenced the coordinating http://img.otherdomain.com/[like-image.jpg] image in og:image on each product page. We then had to go through FB Linter and run EVERY link to refresh the OG data. This worked, but the solution is a band-aid workaround, and if the https issue is fixed and we go back to using the natural https domain, FB will have cached the images from a different website, complicating matters. Hopefully this information helps to save someone else from losing 32 coding hours of their life.
Some properties can have extra metadata attached to them. These are specified in the same way as other metadata, with property and content, but the property name will contain an extra colon:
The og:image property has some optional structured properties:
og:image:url - Identical to og:image.
og:image:secure_url - An alternate url to use if the webpage requires HTTPS.
og:image:type - A MIME type for this image.
og:image:width - The number of pixels wide.
og:image:height - The number of pixels high.
A full image example:
<meta property="og:image" content="http://example.com/ogp.jpg" />
<meta property="og:image:secure_url" content="https://secure.example.com/ogp.jpg" />
<meta property="og:image:type" content="image/jpeg" />
<meta property="og:image:width" content="400" />
<meta property="og:image:height" content="300" />
So for your HTTPS URLs you need to use the og:image:secure_url property instead of og:image.
Ex:
HTTPS META TAG FOR IMAGE:
<meta property="og:image:secure_url" content="https://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
HTTP META TAG FOR IMAGE:
<meta property="og:image" content="http://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
Source: http://ogp.me/#structured <-- You can visit this site for more information.
EDIT: Don't forget to ping facebook servers after updating your code - URL Linter
I ran into the same problem and reported it as a bug on the Facebook developer site. It seems pretty clear that og:image URIs using HTTP work just fine and URIs using HTTPS do not. They have now acknowledged that they are "looking into this."
Update: As of 2020, the bug is no longer visible in Facebook's ticket system. They never responded and I don't believe this behavior has changed. However, specifying an HTTPS URI in og:image:secure_url does seem to work fine.
I don't know if it's only me, but for me og:image did not work and Facebook picked my site logo instead, even though the Facebook debugger showed the correct image.
But changing og:image to og:image:url worked for me. Hope this helps anybody else facing a similar issue.
tl;dr – be patient
I ended up here because I was seeing blank images served from a https site. The problem was quite a different one though:
When content is shared for the first time, the Facebook crawler will scrape and cache the metadata from the URL shared. The crawler has to see an image at least once before it can be rendered. This means that the first person who shares a piece of content won't see a rendered image
[https://developers.facebook.com/docs/sharing/best-practices/#precaching]
While testing, it took facebook around 10 minutes to finally show the rendered image. So while I was scratching my head and throwing random og tags at facebook (and suspecting the https problem mentioned here), all I had to do was wait.
As this might really stop people from sharing your links for the first time, FB suggests two ways to circumvent this behavior:
a) running the OG Debugger on all your links: the image will be cached and ready for sharing after ~10 minutes or b) specifying og:image:width and og:image:height. (Read more in the above link)
Still wondering though what takes them so long ...
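A minimal sketch of option (b) above, pre-declaring the image dimensions so the crawler can render the image on the very first share (the URL and sizes are placeholders):
<meta property="og:image" content="https://example.com/share.jpg" />
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />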
Got here from Google but this wasn't much help for me. It turned out that there is a minimum aspect ratio of 3:1 required for the logo. Mine was almost 4:1. I used Gimp to crop it to exactly 3:1 and voila - my logo is now shown on FB.
I had similar problems. I removed the og:image:secure_url property and now it scrapes with just og:image. Sometimes less is more.
I had the same error and none of the previous suggestions helped, so I followed the original Open Graph protocol documentation and added a prefix attribute to my html tag, and everything became awesome.
<html prefix="og: http://ogp.me/ns#">
As I accidentally found, the transparent blank image comes with a response header indicating the possible cause of the problem.
Go to the debugger at https://developers.facebook.com/tools/debug/og/object/
Put in your URL
At the bottom, Facebook shows your "image" (a transparent 1x1 GIF)
The image is linked to your original image - there's no point pressing it
Right-click and view the image (you'll get something like https://external-ams3-1.xx.fbcdn.net/safe_image.php?d=...&url=...)
Turn on the Net tab in Firebug/developer tools, and refresh the page if needed
You'll get an x-error-detail response header with the explanation
For example, in my case it was Invalid image extension for URL: https://[mydomain]/[myfilename].jpg
The real issue in my case was related to prerender.io.
As it turns out, if an image is requested via prerender, it's wrapped in HTML. Something like this:
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" "http://www.w3.org/TR/REC-html40/loose.dtd">
<html>
<head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head>
<body style="margin: 0px;"><img style="-webkit-user-select: none; cursor: -webkit-zoom-in; " src="https://[yourdomain].com/[yourfilename].jpg" width="1078" height="718"></body>
</html>
It's either a bug in prerender itself, or your proxy is supposed to be configured not to use prerender for *.jpg requests (even when they are requested by the Facebook bot).
It's really hard to notice this, as prerender is only used for certain User-Agent headers.
In my case the problem was not providing the CA root certificate. I figured it out after using https://www.ssllabs.com/ssltest/analyze.html to analyze the SSL configuration.
I took http:// out of my og:image and replaced it with just plain old www., and then it started working fine.
You can use this tool by Facebook to reset your image scrape cache and test which URL it is pulling for the preview image.
I have a Wordpress site that uses og:image with an https URL to the image and the images show up just fine in Facebook preview links.
I have another site I was working on that uses og:image with an https URL and sometimes the images would appear and sometimes they wouldn't. I tried the suggestions on this page, using og:image:url and og:image:secure_url and neither one made any difference, the image wouldn't be used for the preview.
Both sites have valid https certificates, so it wasn't a certificate problem.
After searching some more I found out that Facebook has a MINIMUM SIZE for images. If the og:image is less than 200x200px it will not be used by Facebook. The recommended size is 600x600px for stories and 1200x630px for everything else.
I scaled up the image sizes on my second site and they started appearing on Facebook. Mystery solved.
Hope you find this useful.
I discovered another scenario that can cause this issue. I went through all the steps described in the question and the answers, still the problem remained.
I checked my images and found that some of my posts had way too large thumbnail images in og:image in the range of several thousand pixels and several megabytes.
This happened due to the recent migration from WP to Jekyll, I optimized my images with gulp, but used the original images in og:image by mistake.
Facebook gives us the following recommendations as of today:
Use images that are at least 1200 x 630 pixels for the best display on
high resolution devices. At the minimum, you should use images that
are 600 x 315 pixels to display link page posts with larger images.
Images can be up to 8MB in size.
So there is an upper limit of 8MB.
I ran into the same issue and then noticed that I had a different domain in og:url.
Once I made sure that the domain was the same for og:url and og:image, it worked.
Hope this helps.
Similar symptoms (Facebook et al not correctly fetching og:image and other assets over https) can occur when the site's https certificate is not fully compliant.
Your site's https cert may seem valid (green key in the browser and all), but it will not scrape correctly if it's missing an intermediate or chain certificate. This can lead to many wasted hours checking and rechecking all the various caches and meta tags.
This might not have been your problem, but it could be for others with similar symptoms (like mine). There are many ways to check your cert; the one I happened to use: https://www.sslshopper.com/ssl-checker.html
I can see that the Debugger is retrieving 4 og:image tags from your URL.
The first image is the largest and therefore takes the longest to load.
Try shrinking that first image down, or change the order so a smaller image comes first.
In addition, this problem also occurs when you add a user generated story (where you do not use og:image). For example:
POST /me/cookbook:eat?
recipe=http://www.example.com/recipes/pizza/&
image[0][url]=http://www.example.com/recipes/pizza/pizza.jpg&
image[0][user_generated]=true&
access_token=VALID_ACCESS_TOKEN
The above will only work with http and not with https. If you use https, you will get an error that says:
Attached image () failed to upload
Don't forget to refresh Facebook's servers through:
Facebook Debugger
and click on "Collect new info".
Had a similar problem today, which the Sharing Debugger helped me solve. It seems that Facebook can’t (currently) understand images with XMP metadata embedded. When I replaced the images on our articles with versions without XMP metadata, and re-scraped the page (using the Sharing Debugger), the problem went away. A hex editor will help you see whether or not your image contains XMP metadata.
OK... I realize this thread is old and overcrowded, but in case someone comes in like I did struggling to get their og:image tag to work right in Facebook, here's the trick that worked for me:
do NOT use this link:
https://developers.facebook.com/tools/debug/sharing/?q=https%3A%2F%2Fwww.google.com
to work through your problem. Or if you do, immediately scroll down to the bottom and click on Scrape VIA API.
https://developers.facebook.com/tools/explorer/?method=POST&path=%3Fscrape%3Dtrue%26id%3Dhttps%3A%2F%2Fwww.google.com&version=v5.0
There are errors displayed in the explorer tool that are NOT shown in the "debug" tool. Maddening!!! (in my case, a space in the image filename knocked my image out silently in the debug tool, but it showed the error in the explorer tool).
I came across another reason for og images not displaying on FB cards. Using the FB scraper tool to debug the og meta tags, I could confirm all the required tags were present on my WordPress page, and yet I would get the following file download error:
Provided og:image, < https-link-to-jpg-image > could not be downloaded.
This can happen due to several different reasons such as your server
using unsupported content-encoding. The crawler accepts deflate and gzip content encodings.
I had a vague feeling that the image format had an issue; the link to the image was working, but the message seemed to indicate something amiss with the content encoding.
After much searching, I ended up looking at the PHP extensions required for a WordPress server and realised the php-exif module was not installed. The exif module handles exif metadata for uploaded images. As a result, the images used in the FB og:image tag did not have any exif metadata associated with them.
Once the exif module was enabled, WordPress allowed the exif metadata to be reset for an image (Media Library -> select an image -> Edit more details -> Map exif metadata), and the image then appeared on the FB card as expected.
In my case, it seems the crawler simply had a bug. I tried:
Changing links to http only
Removing end white space
Switching back to http completely
Reinstalling the website
Installing a bunch of OG plugins (I use WordPress)
Suspecting that the server had a weird misconfiguration blocking the bots (because all the OG checkers were unable to fetch tags, and other requests to my site were unstable)
None of these worked. This cost me a week. And then, suddenly, out of nowhere, it seems to work again.
Here is my research, in case someone meets this problem again:
What makes Open Graph checkers unable to detect Open Graph data?
How to know what bots of a website will read, if I have no root access to the hosting?
👍 What makes Open Graph checkers unable to detect Open Graph data? - Let's Encrypt Community Support
👍 Crawler is unable to fetch images, but adding a brand new, unique query string can make it work for one first time - Facebook for Developers
Also, there are more checkers other than the Facebook's Object Debugger for you to check: OpenGraphCheck.com, Abhinay Rathore's Open Graph Tester, Iframely's Embed Codes, Card Validator | Twitter Developers.
I arrived here because an updated facebook meta tag image wasn't showing on facebook shares.
For anyone else in this predicament, the reason was simply that you need to ask facebook to scrape your site again.
Once you do that, it will appear as expected.
I'm using CloudFront distributions pointing to an S3 bucket to serve static images. My CloudFront origins are set to redirect HTTP to HTTPS, so maybe that has something to do with it?
regardless...
Updating og:image from https to http resolved the issue for me, images are now being posted to facebook posts with links to my site.
UPDATE: the above behavior continued to happen. Any time I changed the og:image URL or invalidated my CloudFront cache, the image would work in the FB debugger, but it would never show up on FB.
I added a new behavior for my og:image endpoint and set the min TTL, max TTL and default TTL to 0, and now everything is working great. It's not ideal, as I'd prefer the image to be cached, but apparently FB can't handle the CloudFront 304 response?
I had the same issue and the cause was the minimum TLS version specified in Cloudflare:
If I set minimum TLS to 1.3 - no meta images. If I set it to 1.2 or lower - meta images appear.
It seems that social media previews don't support TLS 1.3, hence the issue. For the record, I have no og:image:secure_url and have HTTP redirected to HTTPS. The site is completely not accessible via HTTP. Only the TLS version was causing trouble.
I struggled to find an answer to this and was getting this puzzling error from LinkedIn:
We encountered an SSL connection error while trying to access the URL.
Please check that the site is using a prime size that is compatible
with Java 8, or contact Support with the content's URL.
The answer was, even though I had both TLSv1.2 and TLSv1.3 enabled in nginx, TLSv1.2 wasn't available due to my cipher list as verified by this checker. It appears facebook and linkedin both use TLSv1.2 to generate previews (as of Nov 2022).
I had to update nginx to the following according to the first answer on this post:
ssl_protocols TLSv1.3 TLSv1.2;
ssl_prefer_server_ciphers on;
ssl_ciphers "EECDH+AESGCM,EDH+AESGCM";
If your image links look like this:
"https://someurl Wed Sep 14 2022 05:59:25 GMT+0000 (Coordinated Universal Time).jpg"
Then make sure that you are using the encodeURI function (JavaScript), or a similar function in your language, when setting the URL.
This will help you create a valid URL that og:image can understand.
{
'og:title': "title",
'og:description': "description",
'og:image': encodeURI(image),
'og:image:secure_url': encodeURI(image),
}
From what I observed, when your website is public the image works fine, even though its URL is HTTPS.
For me this worked:
<meta property="og:url" content="http://yoursiteurl" />
<meta property="og:image" content="link_to_first_image_if_you_want" />
<meta property="og:image" content="link_to_second_image_if_you_want" />
<meta property="og:image:type" content="image/jpeg" />
<meta property="og:image:width" content="400" />
<meta property="og:image:height" content="300" />
<meta property="og:title" content="your title" />
<meta property="og:description" content="your text about homepage"/>
After several hours of testing and trying things...
I solved this problem as simply as possible.
I noticed that the test pages on the Facebook Developers site contain only the "og" tags and some text in the body that references those og tags.
So what did I do?
I created a second view in my application containing the same things they use.
And how do I know it is Facebook accessing my page, so that I can serve that view? Facebook uses a unique User-Agent: "facebookexternalhit/1.1".
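A minimal sketch of what such a crawler-only view might contain (the title, description, image URL and body text are placeholders):
<!DOCTYPE html>
<html prefix="og: http://ogp.me/ns#">
<head>
<meta charset="utf-8">
<meta property="og:title" content="Product name" />
<meta property="og:description" content="Product description" />
<meta property="og:image" content="http://img.example.com/product.jpg" />
<meta property="og:url" content="https://www.example.com/apparel-details.php?i=10047" />
<title>Product name</title>
</head>
<body>
<p>Product name - Product description.</p>
</body>
</html>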
Once you update the meta tag, make sure the content (image) link is an absolute path, then go to https://developers.facebook.com/tools/debug/sharing, enter your site link, and click "Scrape Again" on the next page.