Prefixing a GitHub Pages URL with www returns 404

I have a GitHub Pages site set up, and I noticed that when I try to open it with 'www' prepended, I get a 404. I don't understand what might be causing it.
This is the case for any GitHub Pages site; I am using a public page as an example here:
https://square.github.io/ works flawlessly, whereas https://www.square.github.io/ returns a security warning followed by a 404.
I am curious to understand what might be causing this problem.
PS: I don't have a very good understanding of publishing websites or HTML, so this may well be a very silly question on my part.
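The likely cause can be seen concretely: GitHub Pages serves a wildcard certificate for *.github.io, and a wildcard matches only one label, so square.github.io is covered but www.square.github.io is not; GitHub's servers also don't answer for the www-prefixed host, hence the 404. A minimal diagnostic sketch in Python (standard library only) that reproduces the security-warning half of this:

    import socket
    import ssl

    # Connect to the www-prefixed host and let certificate verification run.
    # The *.github.io wildcard does not cover www.square.github.io (two
    # labels), so the handshake fails the same way the browser warning does.
    host = "www.square.github.io"
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, 443), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                print("handshake OK, certificate subject:", tls.getpeercert()["subject"])
    except ssl.SSLCertVerificationError as exc:
        print("certificate does not match the hostname:", exc)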

Related

Linking a GitHub page to a Google domain

I've perused SO for a bit, but I must be doing something wrong because I still can't get it to work.
I'm trying to link a domain name I got on Google to the index.html in my GitHub repository.
I followed the steps here: https://dev.to/trentyang/how-to-setup-google-domain-for-github-pages-1p58
but I still keep getting an error.
When running the script, the response I get is:
I'm not sure what this ghs.googlehosted.com is, when it's supposed to be solarmew.github.io, and why there's an extra third one too.
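ghs.googlehosted.com is the alias target Google uses for sites hosted on its own services, which Google Domains sometimes pre-configures; for GitHub Pages the CNAME should point at the Pages host instead. A quick way to check what the domain is actually aliased to, sketched with the third-party dnspython package (the domain below is a placeholder for whatever was bought on Google):

    import dns.resolver  # third-party: pip install dnspython

    # "www.example.com" stands in for the Google domain from the question.
    # For GitHub Pages this should print solarmew.github.io., not
    # ghs.googlehosted.com. (Google's own hosting target).
    for rr in dns.resolver.resolve("www.example.com", "CNAME"):
        print(rr.target)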

One of my Pages redirects me to AdminLTE GitHub

Is anyone here familiar with AdminLTE? I'm not sure what is wrong, but one of my pages redirects me to https://github.com/ColorlibHQ/AdminLTE. It was working for a while, but now it's giving me this bug even though I did not change anything in the code.

Facebook Lint/Debugger 403 and 503 response codes (WordPress site)

Humbly asking for any assistance people have time to give me on this one. Let me start by saying that I am aware there are previous questions about this on this site and elsewhere on the web; I have read a lot of them, and they are either unanswered or unresolved, had a particular cause that doesn't apply to me, or suggest things I have already done.
Over the past few days, Facebook has suddenly stopped scraping my website's posts successfully, so when I paste a link into Facebook it pulls nothing through: no thumbnail or description. I run the links through the FB lint/debugger, and it alternates between 403 and 503 response codes, but mainly 403. Previous links that Facebook has cached/successfully scraped still display with thumbnails and descriptions, but they still present as a 403 or 503 response.
My site is http://21stcenturyburlesque.com
One of the new URLs I have been testing is: http://21stcenturyburlesque.com/the-burlesque-top-50-2013/
- I have checked with the server/host people. Nothing has changed; everything is fine.
- I have tried with the default WordPress theme. No change.
- I have read threads about BulletProof Security causing issues, although why it suddenly would, I don't know. It was deactivated on my site anyway, but I went through the removal process to remove the .htaccess file with the BPS code in it. I have then run the debugger without an .htaccess file present, and with a very basic .htaccess present. No change.
- Hotlinking protection is disabled in my cPanel.
- I have experimented with adding/removing www. and a trailing / when I paste the link into the linter, as someone suggested. No change.
- I use the Facebook OGP WordPress plugin. I spoke to the creator and he says the plugin is working as it should and that I should contact my host/server. See the first bullet.
- I tried creating a new FB app and using the new app ID with the OGP plugin. No change.
- I checked the cPanel error log. This came up three times tonight:
[Fri Nov 01 21:47:53 2013] [error] [client 193.242.149.35] File does not exist: /home/**/public_html/403.shtml
There are a few other things I ruled out but I've been at this for so long I can't remember all of them, so if someone suggests something else I've tried then I apologise for not mentioning it here in advance.
If anyone can suggest anything else, I would really appreciate it. I manage to fix most technical problems I come up against, but this has stumped me and my much more experienced colleague and it is really affecting my clickthrough rates and site traffic. If it comes down to adding things to my htaccess file, I would appreciate guidance on what to add/remove. Many thanks in advance.
I had the same problem. It drove me crazy for hours (maybe days). In your FB app settings, make sure that the top Facebook URL has http://
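One more diagnostic worth adding here: security plugins and host-level firewalls sometimes block Facebook's crawler specifically, which shows up exactly as a 403 in the debugger while the site works fine in a browser. A small stdlib-only Python sketch that fetches a post the way the crawler does (the URL is the one from the question; the User-Agent string is the one Facebook's crawler actually sends):

    import urllib.error
    import urllib.request

    url = "http://21stcenturyburlesque.com/the-burlesque-top-50-2013/"
    req = urllib.request.Request(url, headers={
        # The User-Agent the Facebook crawler identifies itself with.
        "User-Agent": "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)",
    })
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(resp.status, resp.headers.get("Content-Type"))
    except urllib.error.HTTPError as exc:
        # A 403 here, when a normal browser UA gets a 200, means the server
        # or a security plugin is blocking the crawler's User-Agent.
        print("server returned", exc.code)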

"Error parsing input URL, no data was scraped" only with new pages on my site

The problem I have is that I own a website where other people can post content, creating new pages on my domain. The problem that occurred today is that all the new post pages created today are malfunctioning: sharing does not load the thumbnail picture, the title, and so on. The weird thing is that all the posts (new pages) created before today are working fine.
What would cause an error to appear out of nowhere?
I also cannot debug any of the URLs of my website, as they all return the same error: Error parsing input URL, no data was scraped.
The website I'm having problems with is here: http://www.vabameedia.ee/vm/184/h%C3%A4da-ei-anna-h%C3%A4beneda.html
This is one of the pages where it says there is no error on the page, but Facebook still can't reach it: http://www.vabameedia.ee/vm/178/craig-parks-%C3%BChek%C3%A4eline-krossisoitja.html
For people experiencing the same problem but for different causes: I discovered a few interesting things about how Facebook "scrapes" pages by checking my server's logs while doing some trials.
First of all: if you have never tried to share a page with FB, FB has never tried to scrape it, and it will not try to do so if you only put the URL in the Debug tool.
That's the first reason you can get the error: it just states that FB has no information on the page, and you must "force" it to scrape the page.
The first time you try to share a page, FB scrapes it (it requests the first 40 KB of the page from your server and analyses the Open Graph tags).
What can happen is that you do not see the image: Facebook Share Dialog does not display thumbnails on first load.
The reason is that FB is still scraping your page and caching the image behind the scenes. The next time, in fact, you get the image as well.
How to solve it? Pre-caching: https://developers.facebook.com/docs/sharing/best-practices#precaching
or simply add
<meta property="og:image:width" content="450"/>
<meta property="og:image:height" content="298"/>
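The pre-caching step from the best-practices link boils down to a single Graph API call that forces a fresh scrape, so the image is already cached the first time anyone shares the page. A minimal sketch (stdlib only; the post URL is hypothetical, and an app access token of the form APP_ID|APP_SECRET is assumed):

    import urllib.parse
    import urllib.request

    page_url = "http://www.example.com/my-new-post/"  # hypothetical new post
    token = "APP_ID|APP_SECRET"                       # assumed app access token

    # POSTing with scrape=true asks Facebook to (re)scrape the page now.
    data = urllib.parse.urlencode({
        "id": page_url,
        "scrape": "true",
        "access_token": token,
    }).encode()
    with urllib.request.urlopen("https://graph.facebook.com/", data=data) as resp:
        print(resp.read().decode())  # the freshly scraped Open Graph properties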
I was pulling my hair out trying to fix this issue. Hours and hours of troubleshooting to no avail. After speaking with one of our programmers about an unrelated topic, I thought of something to try as a long shot.
Much to my surprise, it worked!!!
This is the reason behind the problem and my solution for it:
When you draft a post in WordPress, it generates a link based on your article's title (unless you manually change it). The title of my article included special characters; however, the auto-generated link didn't display these special characters, only hyphens to replace the spaces. Should be fine, right? Wrong! Somewhere, embedded in metadata and code in the WordPress platform, those special characters remain, and they break the way Facebook pulls info from the article being linked to. This is a problem because certain special characters invalidate hyperlinks.
For example:
Article Title: R[eloaded]
Auto-generated hyperlink DISPLAYED in WordPress "Permalink" field: http://www.example.com/reloaded
Actual WordPress Auto-generated hyperlink: http://www.example.com/r[eloaded]
Those brackets will invalidate the link and Facebook will be unable to pull any information (ie pictures) from it.
Solution:
(1) Manually change the WordPress hyperlink address to something that doesn't include any special characters (this will not change the title of your article).
(2) Click "Update" to change the post to include the new hyperlink.
(3) Click "Purge from Cache" in the WordPress window
(4) Refresh your Facebook browser window
(5) Paste the new hyperlink for your article
(6) Enjoy your Facebook post with a preview image and information
Sidenote: Don't pull your hair out over Facebook, it's not worth it. =)
If you're using WordPress, edit the post in question to change the permalink (just alter it slightly), then update the post. Using the new permalink in the Facebook OG debugger should now work.
It's a weird fix, but I think it takes care of a problem caused by special characters being used in the title of a post, which is then used to make the permalink.
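For completeness, the fix can be automated: reduce the title to plain hyphen-separated alphanumerics before it becomes the permalink slug. A small Python sketch of the idea (the function name is my own, not anything WordPress ships):

    import re

    def sanitize_slug(title: str) -> str:
        """Reduce a post title to hyphen-separated alphanumerics only."""
        slug = re.sub(r"[^a-z0-9]+", "-", title.lower())  # brackets, spaces, etc. -> hyphens
        return slug.strip("-")

    print(sanitize_slug("R[eloaded]"))  # -> "r-eloaded", a link Facebook can parse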
It's all a DNS issue; I was having the same problem and resolved it by updating the domain's name servers to the actual name servers.
In my case, my domain was pointed to ns1.websterz.net and ns2.websterz.net, and on that server I had a DNS redirect to my other server (where the website is hosted). I just updated the domain's name servers to the actual name servers of the server my website is hosted on. This was an account-migration case; I had forgotten to update the name servers to those of the new server.
Everything works fine now.
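To catch a forgotten migration like this one, it helps to verify which name servers a domain actually uses; again a sketch with the third-party dnspython package and a placeholder domain:

    import dns.resolver  # third-party: pip install dnspython

    # "example.com" stands in for the migrated domain; the output should list
    # the new host's name servers, not the old ns1/ns2.websterz.net pair.
    for rr in dns.resolver.resolve("example.com", "NS"):
        print(rr.target)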

Domain blocked and no data scraped

I recently purchased the domain www.iacro.dk from UnoEuro and installed WordPress, planning to integrate blogging with Facebook. However, I cannot even share a link to the domain.
When I try to share any link on my timeline, it gives the error "The content you're trying to share includes a link that's been blocked for being spammy or unsafe: iacro.dk". Searching, I came across Sucuri SiteCheck, which showed that McAfee TrustedSource had marked the site as having malicious content. Strange, considering that I had just bought it, it contains nothing but WordPress, and I can't find any previous history of ownership. I got McAfee to reclassify it, and it now shows up green at SiteCheck. However, a few days later, Facebook still blocks it. Clicking the "let us know" link in the FB block dialog got me to a "Blocked from Adding Content" form that I submitted, but this just triggered a confirmation mail stating that individual issues are not processed.
I then noticed the same behavior as described elsewhere: when I type any iacro.dk link on my timeline, it generates a blank preview with "(No Title)". It doesn't matter if it's the front page, an .htm document, or even an image; nothing is returned. So I tried the debugger, which returns the very generic "Error Parsing URL: Error parsing input URL, no data was scraped.". Searching on this site, a lot of people suggest that missing "og:" tags might prevent scraping. I installed a WP plugin for that and verified tag generation, but nothing changed. And since FB can't even scrape a plain .htm or .jpg file from the domain, I assume tags can be ruled out.
Someone elsewhere suggests 301 redirects being a problem, but I haven't set up redirection; I don't even have a .htaccess file.
So, my questions are: Is this all because of the domain being marked as "spammy"? If so, how can I get the FB ban lifted? However, I have seen examples of other "spammy" sites where the preview is being generated just fine, e.g. http://dagbok.nu described in this question. So if the blacklist is not the only problem, what else is wrong?
This is driving me nuts so thanks a lot in advance!
I don't know the details, but it is a problem Facebook has with websites hosted on shared servers, i.e. where the server hosting your website also hosts a number of other websites.
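Independent of the shared-hosting point, it is worth confirming what the pages actually serve, since the question already verified og: tag generation inside WordPress. A stdlib-only Python sketch that fetches a page and prints the Open Graph tags it really contains (the URL is the front page from the question):

    import urllib.request
    from html.parser import HTMLParser

    class OGTagPrinter(HTMLParser):
        """Print every <meta property="og:..."> tag found in the page."""
        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                d = dict(attrs)
                if d.get("property", "").startswith("og:"):
                    print(d["property"], "=", d.get("content"))

    url = "http://www.iacro.dk/"  # front page from the question
    with urllib.request.urlopen(url, timeout=10) as resp:
        OGTagPrinter().feed(resp.read().decode("utf-8", errors="replace"))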