I have two websites that fail to show an image when pasted into Facebook, so I went to the Facebook Object Debugger and compared what the scraper sees to what view source shows.
http://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fvspwebandvideo.com%2F
Both of my failing pages die on the same line. The scraper gives no error; it just stops reading at that point, so I am clueless as to what to try.
Any ideas? I wondered whether HTML entities in the title tag could have an effect.
paxtonsgrill.com fails as well, but allaroundloveland.com works. All three are WordPress sites, but I am a PHP developer, so if I can figure out what is wrong I can most likely fix it.
Thanks
The problem is with your site's charset.
If you click the See exactly what our scraper sees for your URL link on their page, you should notice their scraper breaks at <link rel="alternate" type="application/rss왩"> (after rss it's actually a + on your site). Checking your site's source code, I see you have a UTF-7 charset.
So I did a quick test and I can confirm it's a character-encoding issue. Just change it to UTF-8 and everything should work fine.
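For example, a minimal sketch of the fix in the page's <head> (assuming the charset is declared in a meta tag; if your server also sends a charset in the Content-Type HTTP header, that needs to say utf-8 as well):
<meta charset="utf-8">
<!-- or the older equivalent form that many WordPress themes use: -->
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">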
Also, if you check your allaroundloveland.com site's source code, you can see it has the right charset, whereas your other sites have the wrong one (I don't know if you added that yourself for some reason, but their scraper is breaking with that charset).
Hope this helps, and let me know if it's working for you.
Related
I run a WordPress blog. Here's the URL to our main page: http://theoriginalsocitey.com
For some reason, when links to posts on our site are shared, the featured image is not detected correctly. When I put a link to a post through the Facebook debugger, some of the usual minor errors come up that appear for all sites, including those that work perfectly. I did fix the "tags in body" error successfully, but it didn't solve the problem, so that is not the issue.
All the other OG tags display correctly; only og:image has the issue. The debugger even acts like it knows which image to use but refuses to use it: it names the correct image yet reports "the image is not large enough" when it clearly is, for every post.
Can someone please help me figure this out? I've spent days on countless attempts at fixing this, including plugin deactivation, reactivation, and fresh installations, and nothing has worked.
Here's an example of a post URL to use for debugging: http://theoriginalsociety.com/quest-castro-jon-bellion-logic-24-freestyle/
It seems like the Open Graph image is too small (150x150), therefore Facebook is using the larger image it found on the page instead. It's in your best interest to give Facebook as large an image as possible.
https://developers.facebook.com/docs/opengraph/howtos/maximizing-distribution-media-content/
og:image – This is an image associated with your media. We suggest that you use an image of at least 1200x630 pixels.
Try providing a much larger image and see if that solves the problem.
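Something like the following should do it - just a sketch, with a placeholder image URL; the og:image:width and og:image:height tags are optional but let the scraper verify the size up front:
<meta property="og:image" content="http://theoriginalsociety.com/path/to/featured-large.jpg" />
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />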
I wanted to see if anyone else had ever had an issue with Mobile Safari or Chrome causing web pages to suddenly spit out a ton of garbage.
The issue occurs when I visit the site and refresh multiple times. Suddenly none of the content renders correctly; instead it looks more like the kind of glyphs you'd see in Microsoft Word.
Has anyone ever seen this before and, if so, how did you resolve the problem?
I have seen it on iPhone and Android. On iPhone I suspect it's due to an interruption during the page load (I get kCFURLErrorCancelled in the didFailLoadWithError method), though I am still looking to confirm this issue and still looking for a solution...
You should check the character set of the HTML page. Based on the language or the special characters used in the page, you should use the appropriate charset. You can learn more about charsets here:
http://www.w3schools.com/tags/ref_charactersets.asp
http://en.wikipedia.org/wiki/Character_encodings_in_HTML
http://en.wikipedia.org/wiki/UTF-8
It is really difficult to tell anything from what you have posted. You should check the meta tags of your page. I would also suggest validating the HTML source and CSS of the page.
I have already checked out this question, and it sounds like he's describing the exact same problem as me, except for a few things:
I'm not running on https
80% of the time I try to debug, I get the message "Error parsing input URL, no data was scraped."
The scraper works perfectly on a different domain on the same server, with the same theme and almost identical content. Every time I try that domain, it scrapes perfectly, including the image.
During the 20% of the time it actually scrapes my page, I have the same issue as in the question above: it reads my thumbnail yet shows a blank image. The link leads to a working image, but nothing is displayed.
The weird part is that it worked completely fine about 10 months ago, when I updated this blog on a daily basis. The only difference is that I've switched servers recently. While that could explain it, the other domain switched as well and doesn't have this problem.
I am at a loss as to why my links either show no image at all on Facebook or give me just:
Domain Link
Domain
(no image, no description)
Very frustrating situation. Does anyone have any suggestions?
Update:
I have 6 domains...
When I moved servers recently, I found the new server wasn't prepared to compress the pages, so my blog posts looked crazy. This forced me to turn compression off in WP Super Cache on my main blog. I also did it on my second-highest-traffic blog, figuring I'd get to the other four later.
Well, now those first two blogs appear to work fine in the Facebook debugger, but the remaining four have trouble. The tricky part is that I completely removed WP Super Cache from one site and still had trouble fetching the data.
So while it seems logical that it should have been a WP Super Cache issue, continuing to get errors despite removing it leads me to believe otherwise. I'm still so baffled.
Update:
OK, I loaded Chrome and IE, and both were able to pull the data with ease. The Google snippet tool also worked great. I am going to try posting a link to my Facebook fan page via Chrome and see if it works correctly.
I did clear my FF cache and it didn't change anything, but I am still confused as to why one domain works while the other does not. Either way, if adding it in Chrome works, I'll stick with that for now.
Any other suggestions?
Caching should not cause any problem; if a browser can see your page, so can the Facebook debugger.
See if a 500 error is occurring somewhere. Try a different browser, clear the browser cache, etc. Try the Google rich-snippet tool and see if a custom search engine is scraping it fine.
PS: It would be nicer if you posted the URL.
I've added the required meta tags (as per http://developers.facebook.com/docs/insights/) on a test environment, but when I use the dashboard to claim the domain, it gives an error indicating that no admin data could be found at the root.
Any idea what I've done wrong / why it isn't working?
See test site here: http://www.test.bbc.co.uk
Thanks very much,
Aodh.
You don't have an fb:admins meta tag there; there's an fb:page_id tag, but that's deprecated, and if it worked for you before it will stop doing so very soon - it was scheduled for removal on May 2nd, 2012.
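For reference, a sketch of what the Insights dashboard looks for at the root - the numeric IDs here are placeholders for your own Facebook user ID (or app ID, if you'd rather claim the domain with an app):
<meta property="fb:admins" content="1234567890" />
<!-- or, to claim the domain via an app instead of a user: -->
<meta property="fb:app_id" content="1122334455" />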
BTW, you really should have included your meta tags in the question; it's quite difficult to see the issue from the question as asked, and it could easily have been closed as off-topic.
This might sound like a silly question, but yesterday none of our sites would load. After contacting the hosting company, they said:
In this instance, it would appear that your site is hanging waiting for a response from an external component / website (looking at your code, I see references to Facebook, Google and online apps at a glance).
We've never had this problem before and the sites have been using the social plugins for months so I wondered if anybody else experienced this problem yesterday?
Thanks
Without knowing the details, I can tell you that when you load JavaScript via <script></script> tags, the browser waits for the request to finish before moving on. So, if you load JS files in the <head></head> section, nothing will render on screen until they load successfully. 404s here will kill you, as will slow connections.
If you have inline JS, it's best to put it at the very bottom of your <body></body> section so that it does not interrupt browser rendering. Do the same for JS files in your <head></head> section if your code architecture allows it.
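A rough sketch of that layout (the file names are placeholders, and the Facebook SDK URL is the one the social plugins used at the time):
<head>
    <!-- stylesheets only up here; no blocking third-party scripts -->
    <link rel="stylesheet" href="style.css">
</head>
<body>
    <!-- page content renders before any script is requested -->
    <div id="fb-root"></div>
    <!-- external widgets load last, so a slow or unreachable service can't block rendering -->
    <script src="http://connect.facebook.net/en_US/all.js"></script>
    <script src="site.js"></script>
</body>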
Just a random stab in the dark, but I wonder if this was related in any way to the Anonymous threats against Facebook. I have experienced similar problems before with a Twitter plugin: it could take a few seconds for the plugin to ping back the tweets, and the rest of the page would hang while Twitter was doing its thing.
To ensure this doesn't happen again, you could call your social plugins on page load, if that's a viable solution. At the end of the day, do you want your page load time to depend on services outside of your control or not?
edit: #Jason McClellan's answer is spot on. I believe the combination of our answers sorts out your question. I never include a script in the head if I don't control the resource! That can hang your page indefinitely.
edit2: I don't know why I'm getting downvoted for this answer; I was just trying to relate it to an experience of mine. I'm not extremely experienced with Facebook plugins, but when I noticed the issue with my Twitter plugin, I did exactly what #Jason McClellan said in his answer: make sure all your HTML gets rendered before the scripts are even requested, then ensure the page can display correctly with or without the Twitter content, so the user experience is not dependent on an external resource. The Twitter plugin I was using had an initiation function which I had to call to fire up the script. The simplest way to call this would be with:
<body onload='init()'>
if you have another script you are loading with the page you could use something along the lines of
window.onload = function() {
    init();
};
from within your external script.
Sorry I can't give an answer specific to the Facebook plugin. Maybe someone with more experience in the Facebook plugins could elaborate in the comments :D
edit3: also, this community-based tool suggests that quite a few people were seeing a bit of Facebook downtime when you experienced it - Facebook status at DownRightNow
edit4: I don't have the "privileges" to comment on Jason's answer as I'm pretty new here... but in answer to your comment there, the stuff in the head that you describe is what loads the schema for Facebook Markup Language (FBML) - a markup language like HTML, but geared up for using the Facebook API. So you can do stuff like:
<fb:comments xid="titans_comments" canpost="true" candelete="false" returnurl="http://apps.facebook.com/myapp/titans/">
    <fb:title>Talk about the Titans</fb:title>
</fb:comments>
To whack some comments straight into your page - FBML developers guide
Your comment there also suggests you are not including a script for your page to load, which makes our previous answers less important; you need to include the FBML schema in the head before you use FBML in your page, so don't move it :D
But #Jason McClellan's answer is something that everyone should do on their projects (unless there is a reason not to), as it allows the user to see something before the browser fires off requests for the scripts. At the end of the day, we're in this game to make pretty stuff for our users!
Good luck