Facebook Graph API picture loading problem

I'm having a very unusual problem where, on a page refresh, all profile pictures that loaded on the previous refresh no longer load. Instead, each image load causes these errors in the console:
Resource interpreted as Image but transferred with MIME type text/javascript.
Failed to load resource: the server responded with a status of 400 (Bad Request)
What's especially weird is this: say I go to one page that has 3 pictures, A, B and C. If the next page has 5, including A, B and C from the last one, only D and E will load. If I then refresh, only A, B and C will load. This can be repeated forever!
Any ideas as to what on earth explains this?
PS. I'm not linking my app to start with, as I don't believe it's necessary, but I will if needed.

Loading a profile picture via the Graph API is actually an indirect route: Facebook has to hand you the actual file location, and the problem occurs when the Facebook server cannot handle that request.
I would suggest getting the static link to the user's profile picture via FQL or the Graph API first and then using that in your app. This will also boost your app's performance (displaying the profile picture via the Graph API is slow).
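For what it's worth, a rough sketch of that approach (not exact production code; depending on the API version an access token may be required, and the field names assume the picture endpoint's redirect=false behaviour):

<?php
// Ask the Graph API for the picture URL as JSON instead of letting it
// redirect, then use that static CDN URL directly in the page.
$user_id  = '1234567890';    // whichever user you are rendering
$response = file_get_contents('https://graph.facebook.com/' . $user_id . '/picture?redirect=false&type=square');
$data = json_decode($response, true);
$picture_url = $data['data']['url'];    // the static CDN URL
echo '<img src="' . htmlspecialchars($picture_url) . '" alt="profile picture" />';

Caching $picture_url on your side also means you hit graph.facebook.com once per user rather than once per image shown.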

This is a temporary Facebook issue and has already been reported by at least two other users and logged with Facebook.

I've been seeing the same issue, and this is not the first time. My solution has been to cache the URL that the Graph API redirects to. I've never had problems with that URL (it's hosted on a pretty reliable CDN).

I've just encountered this issue with alternating bad requests on profile pictures: on closer inspection of the request and response headers, I found a hint that Facebook doesn't allow P3P queries (not anymore? I thought it worked in my last app).
So, if you have somewhere a statement like
header('P3P: CP="NOI ADM DEV PSAi COM NAV OUR OTRo STP IND DEM"');
just remove it; then, perhaps after clearing the browser cache and/or your web framework's cache, it should work (if this was indeed the problem, and not coincidentally some Facebook bug).

Related

Where, exactly, is the channel URL used?

On which browsers or user agents is the channel URL actually used, and what for?
I have no intention of making my site work on Internet Explorer <= 8 (it is an HTML5 <canvas> game, and I am serving everything else as "application/xhtml+xml").
So, if the channel file is only useful on that old crap, I can gladly get rid of it...
Related (possibly): Channel URL Facebook
Because the social plugin makes a cross-domain call, it needs a way to communicate. The workaround is to include a hidden iframe in the page for that. But with this workaround, the iframe is loaded on every page load and doubles the traffic reported. This is why the channel URL was introduced. What it does is load the Facebook JS in that page, and from that moment on, the JS is available on your domain.
It will improve your loading times (caching) and will fix the reporting issue (in your reports, the channel page shows up separately). But it is not necessary for any HTML5-capable browser.
So, if you are targeting only HTML5-capable browsers, you are safe to ignore it. I am not sure about IE9; I will test it with my app by removing the channel URL and let you know.
Edit: After removing the channel URL from my app, I started getting double traffic reports from IE9. I think it is a good idea to keep the file there; it is just a simple HTML file with a single line. Better safe than sorry.
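For reference, the channel file I keep is essentially the one-liner Facebook's JS SDK docs suggest, served with far-future cache headers (the file name and cache lifetime below are just my choices):

<?php
// channel.php - the single-line channel file, served with long-lived cache
// headers so browsers and proxies don't re-fetch it on every page load.
$cache_expire = 60 * 60 * 24 * 365;
header('Pragma: public');
header('Cache-Control: max-age=' . $cache_expire);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $cache_expire) . ' GMT');
?>
<script src="//connect.facebook.net/en_US/all.js"></script>

Then point the channelUrl parameter of FB.init at that file's absolute URL.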

Domain blocked and no data scraped

I recently purchased the domain www.iacro.dk from UnoEuro and installed WordPress, planning to integrate blogging with Facebook. However, I cannot even share a link to the domain.
When I try to share any link on my timeline, it gives the error "The content you're trying to share includes a link that's been blocked for being spammy or unsafe: iacro.dk". Searching, I came across Sucuri SiteCheck which showed that McAfee TrustedSource had marked the site as having malicious content. Strange considering that I just bought it, it contains nothing but WordPress and I can't find any previous history of ownership. But I got McAfee to reclassify it and it now shows up green at SiteCheck. However, now a few days later, Facebook still blocks it. Clicking the "let us know" link in the FB block dialog got me to a "Blocked from Adding Content" form that I submitted, but this just triggered a confirmation mail stating that individual issues are not processed.
I then noticed the same behavior as here and here: When I type in any iacro.dk link on my Timeline it generates a blank preview with "(No Title)". It doesn't matter if it's the front page, a htm document or even an image - nothing is returned. So I tried the debugger which returns the very generic "Error Parsing URL: Error parsing input URL, no data was scraped.". Searching on this site, a lot of people suggest that missing "og:" tags might cause no scraping. I installed a WP plugin for that and verified tag generation, but nothing changed. And since FB can't even scrape plain htm / jpg from the domain, I assume tags can be ruled out.
Here someone suggests 301 Redirects being a problem, but I haven't set up redirection - I don't even have a .htaccess file.
So, my questions are: Is this all because of the domain being marked as "spammy"? If so, how can I get the FB ban lifted? However, I have seen examples of other "spammy" sites where the preview is being generated just fine, e.g. http://dagbok.nu described in this question. So if the blacklist is not the only problem, what else is wrong?
This is driving me nuts so thanks a lot in advance!
I don't know the details, but it is a problem Facebook has with websites hosted on shared servers, i.e. the server hosting your website also hosts a number of other websites.

URLs redirect to spyware site

We are developing an app that makes posts on behalf of our users to Facebook. Within those posts, we want to put links to external (non-Facebook) websites.
Looking at the links in the status bar of the browser (usually Chrome), the correct URL is displayed. However, Facebook seems to wrap the actually-clicked link into some extra bells-and-whistles. Usually, this works correctly.
Sometimes, however, this URL wrapping ends up sending the click to a URL like:
http: //spywaresite.info/0/go.php?sid=2
(space added to make it non-browsable!) which generates Chrome's severe warning message.
This happens very occasionally on Chrome, but very much more often in the iOS browser on the iPhone.
Does anyone have any pointers as to how to deal with this?
EDIT
For example, the URL we put in the link is
http://www.example.com/some/full/path/somewhere
but the URL that actually gets clicked is:
http://platform.ak.fbcdn.net/www/app_full_proxy.php?app=374274329267054&v=1&size=z&cksum=fc1c17ed464a92bc53caae79e5413481&src=http%3A%2F%2Fwww.example.com%2Fsome%2Ffull%2Fpath%2Fsomewhere
There seems to be some JavaScript goodness in the page that unscrambles that and usually redirects correctly.
EDIT2
The links above are put on the image and the blue text to the right of the image in the screenshot below.
Mousing over the links (or the image) in the browser shows the correct link. Right-clicking on the link and selecting "Copy Link Address" gets the fbcdn.net link above (or one like it). Actually clicking on the link seems to set off some JavaScript processing of the fbcdn.net link into the right one... but sometimes that processing fails.
I'm not 100% sure what you're asking here, but I'll tell you what I know. Are you referring to this screen on Facebook?
(or rather, the variation of that screen which doesn't allow clickthrough?)
If you manually send a user to facebook.com/l.php?u=something, they'll always see that message - it's a measure to prevent an open redirector.
If your users are submitting such links, including the l.php part, you'll need to extract the destination URL (in the 'u' parameter) - see the sketch after these points.
If you're seeing the l.php URLs come back from the API, this is probably a bug.
If links clicked on facebook.com end up on that screen, it's because Facebook has detected the link as suspicious (e.g. for URL-redirector sites, the screen will allow clickthrough but warn the user first) or malicious/spammy (it will not allow clickthrough).
In your app you won't be able to post links of the latter kind (an error will come back saying the URL is blocked), and the former may throw a captcha sometimes (if you're using the Feed dialog, this should be transparent to the app code; the user will enter the captcha and the dialog will return as normal).
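A minimal sketch of that extraction in plain PHP (the only thing you need from the wrapped link is the 'u' parameter):

<?php
// Pull the real destination out of a facebook.com/l.php?u=... link.
function extract_destination($link) {
    $query = parse_url($link, PHP_URL_QUERY);
    if ($query === null || $query === false) {
        return $link;                 // not a wrapped link, use as-is
    }
    parse_str($query, $params);       // parse_str also URL-decodes
    return isset($params['u']) ? $params['u'] : $link;
}

echo extract_destination('https://www.facebook.com/l.php?u=http%3A%2F%2Fwww.example.com%2Fpage&h=abc');
// prints: http://www.example.com/page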
If this isn't exactly what you were asking about, please clarify and I'll update my answer.
Rather than add to the question, I thought I'd put more details here.
It looks like the Facebook mention in the original title was mis-directed, so I've removed it.
We still haven't got to the bottom of the issue.
However, we used both Wireshark and Fiddler to look at the HTTP traffic between the Chrome browser (on the PC) and Facebook. Both showed that Facebook was returning the correct redirect URL. What we saw in Fiddler, however, was that our own server was issuing a redirect to the spywaresite.info site.
We are working with our ISP to figure out what is happening here.

Wordpress og:image shows up blank

I've been at this for almost 3 days straight and now I can't even think clearly anymore.
All I'm trying to do is to get my featured image thumbnail to appear when I paste the link in Facebook.
I'm using the Wordpress Facebook Open Graph protocol plugin which generates all the correct og meta properties.
My thumbnail images are 240x200px, which respects the minimum size requirements and the 3:1 aspect ratio.
I've made sure there's no trailing slash at the end of my post URLs.
When I use the Facebook Object Debugger, the only warning is in regard to my locale, but that shouldn't affect it.
Facebook appears to be pulling the right image - at least the URL is correct - but the image appears as a blank square.
I've gone through pretty much every thread I could find in forums, but all the information available is about using the correct og tags, which I believe I'm already doing.
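For what it's worth, what the plugin emits is essentially this (a simplified sketch of how such a plugin builds the tags from the featured image, not its actual code):

<?php
// Emit og:image tags for the post's featured image in the page head.
add_action('wp_head', function () {
    if (!is_singular() || !has_post_thumbnail()) {
        return;
    }
    $image = wp_get_attachment_image_src(get_post_thumbnail_id(), 'full');
    if ($image) {
        printf('<meta property="og:image" content="%s" />' . "\n", esc_url($image[0]));
        printf('<meta property="og:image:width" content="%d" />' . "\n", (int) $image[1]);
        printf('<meta property="og:image:height" content="%d" />' . "\n", (int) $image[2]);
    }
});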
Thank you very very much for any help, I'm desperate!! :)
You can troubleshoot the OpenGraph meta tags with the Debugger https://developers.facebook.com/tools/debug - this can at least show if you're using the meta tags properly and if Facebook can 'read' the image.
I finally figured out that the root of my issue was the fact that I was using an addon domain (which is really a subdomain being redirected to the top level domain) and I read on eHow (of all places :) ) that Facebook has trouble pulling data from redirected domains.
Not sure if there was another way around it, but I simply ended up creating a separate hosting account, and everything is loading properly now.
One problem you're going to run into when testing is that often, the first time your page or post gets liked, Facebook keeps whatever image it finds in your meta tags or by searching your page. So you'll keep changing your image meta tag and it still won't show the right picture. It's very annoying. One way to get around it is to change the slug of your post: now it has a different URL and, to Facebook, it's a different page. The downside is you lose all the likes that go with your original URL. Not a problem with a new site.
I ended here googling another problem. Maybe this might help someone:
Please bear in mind that the Facebook scraper works asynchronously and will need some time (around 10 minutes during my tests) to be able to display an image after seeing it for the first time.
For more information, here's a more thorough answer on a similar problem.
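If you don't want to wait for the scraper, you can usually ask Facebook to re-fetch the page yourself. A hedged sketch (this scrape=true call is what the Object Debugger does behind the scenes; newer API versions may require an access token):

<?php
// Ask Facebook to re-scrape a URL right away instead of waiting for the
// asynchronous scraper. $page_url is the post whose preview you changed.
$page_url = 'http://www.example.com/my-post/';
$ch = curl_init('https://graph.facebook.com/');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(array('id' => $page_url, 'scrape' => 'true')),
    CURLOPT_RETURNTRANSFER => true,
));
echo curl_exec($ch);   // the freshly scraped Open Graph data, as JSON
curl_close($ch);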
Indeed, as Andy Wibbels points out, the FB debugger is a really handy tool.
I faced a similar issue with a server whose og:image tag points to a secure subdomain that actually mirrors a CDN server:
<meta property="og:image" content="https://subdomain.pathToImage.jpg" />
<meta property="og:image_secure" content="https://subdomain.pathToImage.jpg" />
The FB debugging tool allows you to see the errors that FB encounters when trying to pull the image.
In my case the subdomain was not registered under the SSL certificate used for HTTPS. Hence FB was getting the following error:
Curl Error : SSL_CACERT SSL certificate problem: unable to get local issuer certificate
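You can reproduce that check locally by fetching the image with strict certificate verification turned on, which is roughly what the scraper does (the image URL below is a placeholder):

<?php
// Fetch the og:image URL over HTTPS with strict certificate checks.
$ch = curl_init('https://subdomain.example.com/path/to/image.jpg');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYPEER => true,   // fail if the certificate chain is incomplete
    CURLOPT_SSL_VERIFYHOST => 2,      // fail if the certificate does not match the host
));
if (curl_exec($ch) === false) {
    echo 'cURL error: ' . curl_error($ch) . PHP_EOL;   // e.g. the SSL_CACERT error above
}
curl_close($ch);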

Creating a new Facebook application, no signed request is being sent through. Old app works

I have just created a new Facebook application to use in a page tab. When I check for the signed_request, it is empty.
So I checked another app that I had made in the past. It is on the same server, and the only difference in the setup is the directory name.
http://example.com/app1/ works
and
http://example.com/app2/ doesn't
So I tried swapping the tab URLs in the apps over and app1 still works with app2s URL.
This led me to believe there must be a problem with the way I have set up the application.
So I went through all the settings and made sure they were the same. They all are, except there are 2 that are not available on the app I just set up: "Encrypted Access Token" and "Requests 2.0 efficient", neither of which seems to relate to the problem.
I seem to remember in the past there was an option to pass through the signed request to canvas page, but I can no longer find it.
Has signed request been deprecated? I couldn't see any mention of that in the docs.
Any help/comment appreciated. Otherwise I am going to have to go back and re-use old apps with new content in them, and I only have about 15 more before I run out.
Cheers
Alex
No, it wasn't deprecated.
There are 2 issues that could cause this:
Your page tab URL must have a trailing slash or point to a specific file, e.g.:
http://myapp.com/app/
http://myapp.com/app/index.php
Your server for the second app is redirecting the request, which causes it to lose the POST variables.
You should check whether there are any redirects, which usually come from mod_rewrite (.htaccess and such).
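To see quickly which of the two it is, here is a minimal sketch (plain PHP, assuming your app secret is available as $app_secret) that reports whether a signed_request arrived and decodes it if so:

<?php
// If this prints 'No signed_request was POSTed', the POST data is being lost,
// most likely by a redirect; otherwise you get the decoded payload.
function parse_signed_request($signed_request, $app_secret) {
    list($encoded_sig, $payload) = explode('.', $signed_request, 2);

    $decode = function ($input) {
        return base64_decode(strtr($input, '-_', '+/'));   // base64url decode
    };

    $sig  = $decode($encoded_sig);
    $data = json_decode($decode($payload), true);

    // The signature is an HMAC-SHA256 of the raw payload with the app secret.
    $expected = hash_hmac('sha256', $payload, $app_secret, true);
    return ($expected === $sig) ? $data : null;
}

if (empty($_POST['signed_request'])) {
    echo 'No signed_request was POSTed - check for redirects.';
} else {
    var_dump(parse_signed_request($_POST['signed_request'], $app_secret));
}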
This seems to have been resolved now, I can no longer replicate it with new applications.