What is the right xmlns used for Facebook - facebook

The Facebook documentation on this is inconsistent and confusing.
I. Here: https://developers.facebook.com/docs/reference/plugins/like/
It says:
<html xmlns:fb="http://ogp.me/ns/fb#">
(if you fill in the form, click the Get Code button, and then click the XFBML tab.)
II. On this page: https://developers.facebook.com/docs/technical-guides/opengraph/opengraph-tutorial/#plugins
It says:
xmlns:fb="https://www.facebook.com/2008/fbml"
Now, the two URLs above resolve to the same URL: http://graph.facebook.com/schema/og/
III. But then we have: https://developers.facebook.com/docs/opengraphprotocol/
xmlns:og="http://ogp.me/ns#"
xmlns:fb="https://www.facebook.com/2008/fbml"
Now, the first URL here does not even resolve to a valid XML schema.
What is the proper xmlns:og to use to enable FBLike, Comment, and FB Connect?
I know that in the world of HTML5 this is not important, but for the sake of older browsers, what should we do?
Thanks!

This is what I use at the moment, and it works perfectly:
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:og="http://ogp.me/ns#"
xmlns:fb="https://www.facebook.com/2008/fbml">
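For context, here is a minimal sketch (my own, not from the original answer) of how those declarations fit together with the og meta tags and an XFBML Like button; the og values, the example.com URLs, and the fb:like attributes are placeholders:
<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:og="http://ogp.me/ns#"
      xmlns:fb="https://www.facebook.com/2008/fbml">
<head>
<meta property="og:title" content="Example page" />
<meta property="og:type" content="website" />
<meta property="og:url" content="http://www.example.com/" />
<meta property="og:image" content="http://www.example.com/images/thumb.jpg" />
</head>
<body>
<div id="fb-root"></div>
<!-- load the Facebook JS SDK here; XFBML tags such as fb:like will then render -->
<fb:like href="http://www.example.com/" send="false" layout="button_count" show_faces="true"></fb:like>
</body>
</html>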

This is what I am doing right now. Both namespaces point to the same schema document.
<html xmlns:og="http://ogp.me/ns/fb#" xmlns:fb="http://ogp.me/ns/fb#">

Related

Facebook debugger is seeing code that isn't there

When Facebook debugger scrapes http://www.daisyworld.co.za it says 'Can't Download: Could not retrieve data from URL.' When I click 'See exactly what our scraper sees for your URL', this is what I get:
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" "http://www.w3.org/TR/REC-html40/loose.dtd">
<html>
<head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head>
<body><p>ÿþ</p></body>
</html>
But what is actually there is:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<META HTTP-EQUIV="content-language" CONTENT="En">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<META HTTP-EQUIV="Content-Style-Type" CONTENT="text/css">
None of the other pages in the domain where I implemented a Like button have any problems; they work just fine, and I basically used the same pieces of FB code for all of them, with just the particulars changed for each page. I cannot figure out what the problem is, except that it seems the debugger is looking at a cached file, but surely that isn't supposed to happen?
Maria-Helena
I just hit this issue as well and discovered that Facebook's scraper was coming in as an inbound JSON request. Since that particular route was set up to handle both JSON and HTML responses, FB was getting a big gnarly JSON blob instead of the actual web page. Not sure if this solves your exact problem, but hopefully it sparks some fresh ideas!
Try saving the file with a different encoding - going from unicode to UTF-8 did it for me.
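One quick way to check both suggestions (a sketch on my side, assuming curl is available) is to fetch the page with Facebook's user agent and compare the headers and the first bytes of the body with what a normal browser request gets:
curl -i --user-agent "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)" "http://www.daisyworld.co.za/"
If the Content-Type or the body differs from the browser version (for example a JSON response, or a UTF-16 byte-order mark rendered as ÿþ), that would point to content negotiation or file encoding rather than caching.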

Can't solve Facebook Open Graph Meta tags not being scraped for my Wordpress site

This is my first time posting a question on this site, but certainly not the first time finding answers in it.
I have used stackoverflow as a resource to fix several issues I've faced with my new blog, that is until last night, when I found this issue which I just can't fix.
When I try to share the home page of my blog, I don't get the proper image specified in the og:image tag... when I check my site via the FB debugger, it shows me this:
https://developers.facebook.com/tools/debug/og/object?q=ivanfuentes.com
Curiously enough, I do not find any issues when I check a page or a post:
https://developers.facebook.com/tools/debug/og/object?q=ivanfuentes.com%2Fvideos%2F
https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fivanfuentes.com%2Fthe-popularity-contest%2F
So, I know it's an issue generated on the home page only, but during the last 18 hours I have been unable to find it.
I have OG meta tags specified dynamically via a WordPress plugin... currently it's "Facebook AWD", but I've had several other Facebook sharing, all-in-one, and OG plugins, which give me the same results in the debugger, which makes me think I messed up somewhere else. I have no embarrassment in admitting I'm quite a newbie, so it's highly likely I messed up while trying to modify some code... probably when I added a few lines to make the site IE compliant?
Hope I gave enough information, and someone gets to help me, as this is not only about the proper image being displayed on a Facebook link, but rather about me likely having a mess in my code, and that could (WILL) mean trouble once I make any mods/updates to my site in the future.
Thanks for the time!
Your html is a complete mess and that's why the debugger is complaining.
Visiting your page and looking at the code I can see this:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<div id="fb-root"></div>
<script>
...
</script>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"><html xmlns="http://www.w3.org/1999/xhtml" xmlns:og="http://ogp.me/ns#" xmlns:fb="https://www.facebook.com/2008/fbml">
<head profile="http://gmpg.org/xfn/11">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<div id="fb-root"></div>
<script>
...
</script>
<title>Ivan Fuentes Hagar</title>
Two problems there:
The SDK code is inserted twice
In both cases the fb-root div is placed before the body
In the debugger result for this page, clicking the bottom link (Scraped URL: See exactly what our scraper sees for your URL) also shows broken HTML, but in another variation:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"><body>
<div id="fb-root"></div>
<script>
...
</script><meta http-equiv="content-type" content="text/html; charset=utf-8">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<script>
...
</script><title>Ivan Fuentes Hagar</title>
The problems here:
The body tag comes right after the html tag
There's no head at all
All of the tags that are supposed to live inside the head are inside the body
The SDK script is inserted twice
In both cases I found three occurrences of <div id="fb-root"></div>.
As you can see, you have some fixin' up to do with the HTML output of your WordPress site.
I'm not sure why the output is different for the debugger; I thought it might be due to the user agent string, but running curl --user-agent "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)" "http://ivanfuentes.com/" returns exactly the same result as the browser.
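For reference, a minimal sketch (mine, not taken from the site) of how that output should roughly be structured, with the meta tags inside the head, a single fb-root div at the top of the body, and the SDK snippet included only once:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:og="http://ogp.me/ns#" xmlns:fb="https://www.facebook.com/2008/fbml">
<head profile="http://gmpg.org/xfn/11">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<title>Ivan Fuentes Hagar</title>
<!-- og: and fb: meta tags go here -->
</head>
<body>
<div id="fb-root"></div>
<script>
// the Facebook JS SDK snippet, included once
</script>
<!-- page content -->
</body>
</html>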

Facebook behaves weird while sharing my website's urls

I have tried everything I could before posting this question here on stack overflow.
I am unable to understand why Facebook doesn't pick up any information related to posts on hellyalol, for example the title, thumbnail, or description.
This is an example http://hellyalol.com/181/my-date/
All the Open Graph tags are in place, as the source code shows, but the Facebook debugger doesn't pick up any of them.
<meta property='og:title' content='Will you be my date?'/>
<meta property='og:url' content='http://hellyalol.com/181/my-date/'/>
<meta property='og:site_name' content='Hell Ya LOL'/>
<meta property='og:type' content='article'/>
<meta property='og:image' content='http://hellyalol.com/wp-content/uploads/2011/10/fart-exhibit-150x150.jpg'/>
Another big confusion: when I change a permalink in WordPress, for example from my-date to your-date, it surprisingly works.
E.g. http://hellyalol.com/195/years-ago/ works just fine when you share it on Facebook, but the debugger still doesn't pick up any Open Graph tags :S and I had to change the permalink twice for this post before it would work with WordPress.
Can anyone help? Thanks a lot :(
Server details: I am using Lightspeed and W3 Total Cache with memcache enabled.
Make sure to not miss this in your opening html tag:
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:og="http://ogp.me/ns#"
xmlns:fb="https://www.facebook.com/2008/fbml">
This is your current html tag - as you see, some parts are missing (or wrong) there:
<html xmlns="http://www.w3.org/1999/xhtml"
dir="ltr"
lang="en-US"
xml:lang="en-US"
xmlns:og="http://opengraphprotocol.org/schema/">
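Putting the two together, the corrected opening tag would presumably look something like this (keeping your existing attributes and adding the missing namespaces):
<html xmlns="http://www.w3.org/1999/xhtml"
      dir="ltr"
      lang="en-US"
      xml:lang="en-US"
      xmlns:og="http://ogp.me/ns#"
      xmlns:fb="https://www.facebook.com/2008/fbml">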
Just found the answer. For Facebook to work with your blog, you must have www in front of the domain. I changed the domain URL from http://hellyalol.com to http://www.hellyalol.com and it's working.
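If you go that route, it is probably also worth checking (my assumption, not something confirmed in this thread) that the plugin-generated og:url uses the www form as well, e.g.:
<meta property='og:url' content='http://www.hellyalol.com/181/my-date/'/>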

Open Graph validation for HTML5

Is there any way to get facebook's crappy Open Graph meta tags to validate if my doctype is <!DOCTYPE html> (HTML5)?
Other than facebook's Open Graph meta tags, my document validates perfectly.
I really don't want to use <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML+RDFa 1.0//EN" "http://www.w3.org/MarkUp/DTD/xhtml-rdfa-1.dtd"> as that creates a whole new set of problems.
Here is an example of one of the validation errors in question...
Error Line 11, Column 47: Attribute property not allowed on element meta at this point.
<meta property="og:type" content="website" />
Any help would be appreciated... I have been searching off and on for days to no avail.
For HTML5, add this to your html element as described on ogp.me and keep your og: prefixed properties:
<!doctype html>
<html prefix="og: http://ogp.me/ns#">
<head>
<meta property="og:type" content="website" />
...
For XHTML (like the OP's question), use the name attribute instead of the property attribute. Facebook lint will throw a warning, but the meta value will still be recognized and parsed.
<meta name="og:title" content="Hello Facebook" />
Yes. To validate as HTML5, add the prefix attribute from the Open Graph docs:
<!DOCTYPE html>
<html prefix="og: http://ogp.me/ns#">
<head>
<title>Valid HTML5!</title>
<meta charset="utf-8"/>
<meta property="og:title" content="">
</head>
<body></body>
</html>
Copy and paste the above to the w3 validator to check.
It is production ready – Apple uses this method on apple.com.
The short answer is no, not at this time. All other answers are workarounds, hacks, or just plain crazy. The only long-term solution is that Facebook needs to create an alternate syntax that is valid HTML5.
To those recommending targeting Facebook by the "facebookexternalhit" User Agent, you have to remember that other companies are following Facebook's lead with these tags. For example, Google+ will fall back to the OpenGraph tags if their preferred Schema.org markup isn't present. Since most sites aren’t using Schema.org attributes (especially if they’re spending the time to use OpenGraph correctly), you can easily miss out on enhancing your snippets on sites like Google+ by following this advice.
With the ubiquity of Facebook, it really isn't a good solution to target them directly--even if their choice of implementation is problematic for developers. When looking for solutions on a site like Stack Overflow, you always have to remember that there can be unforeseen consequences to these methods.
For our main sites, we've stuck with XHTML+RDFa for validation sake, and it's worked well enough. I'm hoping that as HTML5's usage grows, the Facebook team will start accepting a valid format for this metadata.
As for why we care about validation:
We've found that validation, when possible, helps to alert us to errors in our pages by not teaching us to ignore them. Since we all use validation extensions in our browsers, we know instantly if there's a validation error (or warning) on a page, and can investigate whether it's possible to eliminate it (which 99+% of the time it is). This saves us time dealing with restrictive implementations of the specs, especially on fringe and mobile platforms nowadays. We've seen a huge reduction in odd bugs because we're aware of our pages being valid and know that what's going on in the browser doesn’t have to do with invalid markup that a particular UA might not interpret as expected.
These meta tags are only required when facebook scans the page for these tags.
<?php
// Only output the Open Graph tags when Facebook's scraper requests the page
if (stripos($_SERVER['HTTP_USER_AGENT'], 'facebookexternalhit') !== false) {
    echo '<meta property="og:type" content="xxxxxxxxxxxxx" />';
    // continue with the other open graph tags
}
?>
The said tags will only be present when facebook needs them - this method with PHP removes them completely for all other instances including W3C validation.
Many of the answers here have become outdated. Please don't sniff for user-agent headers or write the tags via JavaScript (since the processors might not evaluate the JS).
The W3C Recommendations (extensions to HTML5) called RDFa 1.1 and RDFa Lite 1.1 (see http://www.w3.org/TR/rdfa-lite/ and http://www.w3.org/TR/rdfa-primer/ ) have made the "property" attribute valid and conforming. In the meantime (since the older answers here) the validator http://validator.w3.org/check recognizes the attribute as valid. In addition, the Open Graph Protocol documentation, http://ogp.me/ , has been updated to reflect RDFa 1.1 (it uses the "prefix" attribute).
The W3C work has been done with input from OpenGraph and schema.org, among others, to resolve the kind of issue raised by this question.
In short, make sure your OG tags conform to RDFa and you are golden.
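As a concrete illustration (placeholder values, modelled on the ogp.me examples rather than taken from this thread), an RDFa-style HTML5 page along these lines should satisfy the validator:
<!DOCTYPE html>
<html prefix="og: http://ogp.me/ns#">
<head>
<meta charset="utf-8" />
<title>Example</title>
<meta property="og:title" content="Example" />
<meta property="og:type" content="website" />
<meta property="og:url" content="http://www.example.com/" />
<meta property="og:image" content="http://www.example.com/images/thumb.jpg" />
</head>
<body></body>
</html>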
More than a year has passed, and the best solution we've got is to wrap the meta tags in some sort of server-side user-agent check.
In PHP I did:
<?php if (stristr($_SERVER["HTTP_USER_AGENT"],'facebook') !== false) { ?>
<meta property="og:title" content="Title of the page" />
<meta property="og:url" content="http://www.example.com/" />
<meta property="og:type" content="website" />
<meta property="fb:admins" content="123456789" />
<meta property="og:image" content="http://www.example.com/images/thumb.jpg" />
<?php } ?>
It really works for Facebook. But I really don't like this idea!
One recent solution is to register a prefix in the html or head tag:
<html prefix="og: http://ogp.me/ns# fb: http://ogp.me/ns/fb#">
or
<head prefix="og: http://ogp.me/ns# fb: http://ogp.me/ns/fb#">
taken from here; sorry, the page is in German...
Bad solution for the meta tags. If you wrap those in Javascript then the Facebook Linter won't find them. That's the same as not putting them in at all.
Wrapping like buttons and such in script works to help validate against XHTML 1.0 but not HTML5.
In JSP:
<%
    // Only render the Open Graph tags for Facebook's scraper
    String ua = request.getHeader("user-agent").toLowerCase();
    if (ua.matches(".*facebookexternalhit.*")) {
%>
<meta property="og:image" content="images/facebook.jpg" />
...
<%
    }
%>
Or:
<c:set var="ua" value="${header['User-Agent']}" scope="page"/>
<c:if test="${ua.matches('.*facebookexternalhit.*')}">
<meta property="og:image" content="images/facebook.jpg" />
...
</c:if>
Well, Visual Studio 2011 tells me that the "property" attribute is invalid. However, the W3C seems to be a little more lenient:
http://validator.w3.org/check?uri=http%3A%2F%2Fpacificfoods.com%2F
You'll notice that I added Open Graph tags per Facebook's recommendation to that site, and it does not break the W3C validator, which I consider to be authoritative.
Consulting the official W3C HTML5 specification for the meta tag, it is clear that the use of the "property" attribute (in lieu of the "name", "http-equiv", "charset", or "itemprop" attributes) is not valid. However, their validator validates it (???). I have no explanation for this discrepancy.
I would be inclined to say don't worry about validation; I don't believe having invalid markup will hurt your search engine ranking. For example, Google's technical recommendations do not mention standards: http://www.google.com/support/webmasters/bin/answer.py?answer=35769#2 . HTML5 allows you to provide more information to search engines, which they can then use, but I can't see them ranking you down for not validating.
However, if you feel it helps you to validate, you can use
<script>document.write('<meta property="og:type" content="website" />')</script>
to have these tags present and an HTML file that will pass validators.
Although it will cut off non-Javascript users, I've used this
<script type="text/javascript">
//<![CDATA[
document.write('<fb:like href="" send="false" layout="button_count" width="100" show_faces="true" font=""></fb:like>')
//]]>
</script>
and it validated perfectly. It shows and works fine with Firefox, Opera, IE, Chrome, Safari on Windows, and with Firefox, Opera, Safari on Mac.

facebook doctype, is it being deprecated?

I've recently started working on a Facebook app and I am lost and confused trying to get any direct answer. Googling for answers doesn't seem to work too well with Facebook, as things are changing too fast and blogs are rarely updated to reflect the new APIs.
I know Facebook will deprecate FBML and I am overjoyed by that.
So my question is: will the DOCTYPE Facebook gave you to put on your site be deprecated as well?
What does doing this even do?
Assuming you are talking about this:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:fb="http://www.facebook.com/2008/fbml">
(specifically the xmlns:fb="http://www.facebook.com/2008/fbml"). They are not deprecating that. You should continue to add xmlns:fb="http://www.facebook.com/2008/fbml" in your <html> tag. However, for modernity you should be using the HTML5 doctype, so your doctype and HTML tag should look like this:
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:fb="http://www.facebook.com/2008/fbml">
What the xmlns:fb part does is specify the XML namespace for tags within the page that begin with the fb: prefix (such as <fb:name/>).
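For example (a sketch with a placeholder URL, assuming the JS SDK is set up on the page), once the namespace is declared you can drop an XFBML tag such as the Like button anywhere in the body:
<fb:like href="http://www.example.com/" layout="button_count" show_faces="true"></fb:like>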