Thumbnails won't show from one specific server, but the identical page's thumbnails work from another server - Facebook

I've placed two identical files on two different servers to test how the thumbnails work when sharing on FB. One domain works, the other doesn't. The FB debugger gives a 503 and the scraper says "Document returned no data".
Works:
https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fwww.muikkumedia.fi%2Ffacebook%2F
Doesn't work (other provider):
https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fartblog.pa-la.fi%2Ffacebook%2F
As there is a dash (-) in the other domain's (and server's) name, I was thinking it might be the "dash bug", but the debugger can pull info from the root domain, so that doesn't seem to be the problem either. I'd include the link here too, but I don't have enough reputation on this site.
Anyway, the server provider's answer is: "I'd suggest you contact Facebook's support on this issue, as everything looks OK on our end with the info they provide."
Any ideas?
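One quick way to see what the scraper is being served is something like this rough Python sketch; the facebookexternalhit user agent is the one Facebook's crawler identifies itself with, but the server may of course behave differently for Facebook's own IP ranges:

    # Rough diagnostic sketch: request the failing page the way Facebook's
    # scraper does and see whether the server itself returns a 503.
    import urllib.request
    import urllib.error

    url = "http://artblog.pa-la.fi/facebook/"  # the failing page from above
    req = urllib.request.Request(url, headers={"User-Agent": "facebookexternalhit/1.1"})

    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print("status:", resp.status)
            print("content-type:", resp.headers.get("Content-Type"))
            print(resp.read(200))  # first bytes of the body, enough to see if HTML comes back
    except urllib.error.HTTPError as e:
        # A 503 here would mean the server (or its firewall) rejects the scraper itself.
        print("HTTP error:", e.code, e.reason)
    except urllib.error.URLError as e:
        print("connection problem:", e.reason)
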

Changed the provider; it works now.

Related

Facebook doesn't pull image preview, title, etc. when sharing a link from one domain

I'm having an issue that has been boggling my mind for a few days now.
Sharing the exact same content does not work on www.raptorsrepublic.com but works fine on forums.raptorsrepublic.com (or any other domain).
I simply can't share (i.e., generate previews, titles, etc.) links from www.raptorsrepublic.com, but when I post the exact same content from a different domain (as a test, forums.raptorsrepublic.com), it works.
Even the graph debug tool crashes on any link from www.raptorsrepublic.com, but works fine for forums.raptorsrepublic.com. Here's an example:
http://forums.raptorsrepublic.com/y.html
http://www.raptorsrepublic.com/y.html
Exact same content, but one works (forums.raptorsrepublic.com) and one doesn't (www.raptorsrepublic.com); the latter even crashes Facebook's share debug tool.
Can you please help me get this working so I can share links from the domain www.raptorsrepublic.com?
Note that the domain is not blacklisted, as I can actually post the link fine.
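Assuming both URLs really serve identical files, a rough sketch like this can confirm whether they also respond identically to the scraper's user agent (Python, stdlib only; the URLs are the two from above):

    # Fetch the same path from both domains as Facebook's crawler and compare.
    import urllib.request

    URLS = [
        "http://forums.raptorsrepublic.com/y.html",  # the working domain
        "http://www.raptorsrepublic.com/y.html",     # the failing domain
    ]

    for url in URLS:
        req = urllib.request.Request(url, headers={"User-Agent": "facebookexternalhit/1.1"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                print(url, "->", resp.status, resp.headers.get("Content-Type"),
                      len(resp.read()), "bytes")
        except Exception as e:
            # Any difference here (status code, size, error) hints that the two
            # domains are not serving identical responses to the scraper after all.
            print(url, "->", repr(e))
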

Unable to use port number in Facebook's "site url" setting

I have been creating a really small Facebook app, and I have just realized that the settings page in FB's dev section has changed. The options remained, though, so I went ahead and filled in the most important things (the required ones). Then I got to these:
App domain
Site URL
Now, for local testing I always fill in localhost, and that is OK. However, before, I could enter the Site URL like this:
http://localhost:8872/myapp/
Now this doesn't work anymore; it simply refuses to take it and shows me: "Something went wrong. We're working on getting it fixed as soon as we can." Hm.
If I change the site url to this:
http://localhost/myapp/
It is saved without problems. It makes me wonder: did Facebook change something and will it stay that way, is it indeed a bug on their side that will be fixed, or am I doing something wrong? Please note that it works fine without the port! However, I really need the port there. Thanks!

How to prevent Google from indexing redirect URL I do not own

A domain name that I do not own is redirecting to my domain. I don't know who owns it or why it is redirecting to my domain.
This domain, however, is showing up in Google's search results. A whois lookup also returns this message:
"Domain:http://[baddomain].com webserver returns 307 Temporary Redirect"
Since I do not own this domain, I cannot set a 301 redirect or disable it. When clicking the bad domain in Google, it shows the content of my website, but baddomain.com stays visible in the URL bar.
My question is: How can I stop Google from indexing and showing this bad domain in the search results and only show my website instead?
Thanks.
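One rough way to confirm what the bad domain actually does (a real redirect versus framing or proxying your content) is to fetch it without following redirects and look at the status and Location header. A Python sketch, with "baddomain.com" standing in for the real name as in the question:

    # http.client does not follow redirects, so the raw status/Location is visible.
    import http.client

    conn = http.client.HTTPConnection("baddomain.com", timeout=10)  # placeholder hostname
    conn.request("GET", "/", headers={"User-Agent": "Mozilla/5.0"})
    resp = conn.getresponse()

    print("status:", resp.status, resp.reason)
    print("location:", resp.getheader("Location"))  # set only for a real HTTP redirect

    # No Location header plus a 200 status would suggest the site is framing or
    # proxying your content rather than redirecting, which changes your options.
    conn.close()
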
Some thoughts:
You cannot directly stop Google from indexing other sites, but what you could do is add the canonical tag to your pages so Google can see that the original content is located on your domain and not on "baddomain".
For example, check out: https://support.google.com/webmasters/answer/139394?hl=en
Other actions can be taken SEO-wise if 'baddomain' is outscoring you in the search rankings, because then it sounds like your site could use some optimizing.
The better your site and domain rank in the SERPs, the less likely it is that people will see the scraped content and 'baddomain'.
You could, however, also look at the referrer of the request, and if it is 'baddomain' you should be able to redirect to your own domain, change the content, etc., because the code is being run from your own server.
But that might be more trouble than it's worth, as you'd need to investigate how 'baddomain' is doing things and code accordingly (probably an iframe or similar from what you describe, but that can still be circumvented using scripts).
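As a rough illustration of that referrer idea, here is a sketch assuming (purely as an example) a Python/Flask site; the host names are placeholders, and the same logic can be written in any server-side stack:

    from flask import Flask, request, redirect

    app = Flask(__name__)
    CANONICAL_HOST = "www.example.com"   # your own domain (placeholder)
    BAD_HOSTS = {"baddomain.com", "www.baddomain.com"}  # placeholders

    @app.before_request
    def bounce_bad_domain():
        referrer = request.headers.get("Referer", "")
        host = request.headers.get("Host", "")
        # If the request arrives via the unwanted domain (framed, proxied, or linked),
        # send the visitor to the same path on your own domain instead.
        if host in BAD_HOSTS or any(bad in referrer for bad in BAD_HOSTS):
            return redirect(f"https://{CANONICAL_HOST}{request.full_path}", code=301)
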
Depending on what country you and 'baddomain' are located in, there are also legal options, namely DMCA complaints. This, however, can also be quite a task, and it's often not worth it because a new domain will just pop up.

Domain blocked and no data scraped

I recently purchased the domain www.iacro.dk from UnoEuro and installed WordPress, planning to integrate blogging with Facebook. However, I cannot even share a link to the domain.
When I try to share any link on my timeline, it gives the error "The content you're trying to share includes a link that's been blocked for being spammy or unsafe: iacro.dk". Searching around, I came across Sucuri SiteCheck, which showed that McAfee TrustedSource had marked the site as having malicious content. Strange, considering that I just bought it, it contains nothing but WordPress, and I can't find any previous history of ownership. But I got McAfee to reclassify it, and it now shows up green at SiteCheck. However, a few days later, Facebook still blocks it. Clicking the "let us know" link in the FB block dialog got me to a "Blocked from Adding Content" form that I submitted, but this just triggered a confirmation email stating that individual issues are not processed.
I then noticed the same behavior as here and here: when I type any iacro.dk link on my Timeline, it generates a blank preview with "(No Title)". It doesn't matter if it's the front page, an htm document or even an image; nothing is returned. So I tried the debugger, which returns the very generic "Error Parsing URL: Error parsing input URL, no data was scraped.". Searching on this site, a lot of people suggest that missing "og:" tags might cause no scraping. I installed a WP plugin for that and verified tag generation, but nothing changed. And since FB can't even scrape plain htm / jpg from the domain, I assume tags can be ruled out.
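For completeness, a rough sketch of how to double-check which og: tags the scraper is actually served (Python stdlib; the user agent is the one Facebook's crawler sends, and the URL is the domain from the question):

    import urllib.request
    from html.parser import HTMLParser

    class OGCollector(HTMLParser):
        """Collect every <meta property="og:..."> tag found in the page."""
        def __init__(self):
            super().__init__()
            self.tags = {}
        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                a = dict(attrs)
                prop = a.get("property", "")
                if prop.startswith("og:"):
                    self.tags[prop] = a.get("content")

    req = urllib.request.Request("http://www.iacro.dk/",
                                 headers={"User-Agent": "facebookexternalhit/1.1"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    parser = OGCollector()
    parser.feed(html)
    print(parser.tags or "no og: tags found in the response")
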
Here someone suggests that 301 redirects may be a problem, but I haven't set up any redirection; I don't even have a .htaccess file.
So, my questions are: Is this all because of the domain being marked as "spammy"? If so, how can I get the FB ban lifted? However, I have seen examples of other "spammy" sites where the preview is being generated just fine, e.g. http://dagbok.nu described in this question. So if the blacklist is not the only problem, what else is wrong?
This is driving me nuts so thanks a lot in advance!
I don't know the details, but it is a problem that Facebook has with websites hosted on shared servers, i.e. the server hosting your website also hosts a number of other websites.

Site URL blocked as "Spammy", but also no data was scraped in debug

I made a website for a client of mine who owns a small business. About three months ago, her site URL was blocked by Facebook for being "spammy". We launched a pretty impressive "Go Here And Report It As Safe" campaign, but alas, it's still not unblocked.
We made a new domain that mirrored the blocked one. This worked for about an hour. Then lo and behold! It got blocked too.
I was very curious, so I decided to try out the "Object Debugger". When I did, I got this message:
"Error Parsing URL: Error parsing input URL, no data was scraped."
I tried it again a few hours later, and it scraped just fine! Not only did it scrape and show up in debug perfectly, but it also didn't ping as blocked when I posted to my wall! It was amazing.
Sadly, I made an edit to the header file (just took away a meta tag), and now it won't scrape again. And it's blocked again.
The URL in question is enchantedcareers.com.
I feel like maybe the site isn't being blocked as spammy, but rather there may be some kind of coding problem? Has anyone else had an OG bug flag a URL as blocked via the link shim?
EDIT: Again, it let me post the link, with full preview and everything. I posted it, and about one minute later the post was removed, and it was back to being "spammy".
The URL Debugger only scrapes my URL sporadically (with no page edits being made whatsoever).
I can't find a pattern.
no data was scraped: 9:06
successful scrape: 9:34
no data was scraped: 9:44
successful scrape: 10:04
no data was scraped: 10:08
Edit #2: This is just completely crazy. Our new domain, enchantedcareers.net, which is nothing but my host's default quickstart.html page, is also blocked from being posted. When I try to post the .net domain, it flags both the .net and the .com domain as blocked.
THE .COM DOMAIN ISN'T EVEN TIED TO THE .NET NAME. This domain is straight out of the box. Why is it bringing that domain up when I try to post a new one?
I'm just so confused.
It won't scrape the .net name, either.
Could this be a server thing...?
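One rough way to look for a pattern is to poll the page with the scraper's user agent and log what comes back; intermittent 5xx responses or timeouts would point at the server rather than at Facebook. A Python sketch (the interval and duration are arbitrary choices):

    import time
    import urllib.request
    import urllib.error

    URL = "http://enchantedcareers.com/"   # the domain in question
    UA = {"User-Agent": "facebookexternalhit/1.1"}

    for _ in range(12):                    # roughly an hour at 5-minute intervals
        stamp = time.strftime("%H:%M:%S")
        try:
            with urllib.request.urlopen(urllib.request.Request(URL, headers=UA),
                                        timeout=15) as resp:
                print(stamp, "OK", resp.status)
        except urllib.error.HTTPError as e:
            print(stamp, "HTTP error", e.code)
        except Exception as e:
            print(stamp, "failed:", repr(e))
        time.sleep(300)
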
Your URI comes up clean on a URIBL check, and you're on DreamHost, which is typically reliable, so I wouldn't expect your IP address to show up in a DNSBL check. I don't see any glaring errors in your page code of the kind that typically cause the Facebook parser to choke.
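For what it's worth, a DNSBL lookup can be scripted along these lines (a rough sketch; Spamhaus ZEN is just an example zone, and results can vary depending on the resolver you query through):

    import socket

    def dnsbl_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
        # A host is listed if the reversed IP, prefixed to the blacklist zone, resolves.
        query = ".".join(reversed(ip.split("."))) + "." + zone
        try:
            socket.gethostbyname(query)   # resolves only when the IP is listed
            return True
        except socket.gaierror:
            return False

    server_ip = socket.gethostbyname("enchantedcareers.com")  # domain from the question
    print(server_ip, "listed:", dnsbl_listed(server_ip))
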
There is one "suspicious" script on your page according to this report. Try removing it, clear your cache, and see if Facebook will parse your URL.