Strange robots.txt update problems

I have a strange robots.txt issue.
I update my robots.txt file regularly without problems, but a few days ago this stopped working. Now, when I upload the new robots.txt over FTP it appears to update, but if I view it from a browser I only see the old version. Even Google hasn't picked up the change after several days. In Google's Search Console I see this:
On the left is what I see in my browser and what Google sees. On the right is the real file, as it appears over FTP. The last 5 lines are different.
If it matters, I use Google's DNS servers. When I switch the DNS on my PC connection to my Internet Service Provider's DNS, I see the new robots.txt, but Google still does not.

I had the same issue; my robots.txt wouldn't show correctly to Google for days. It can take a while for Google to crawl your site and re-check your robots.txt - sometimes it's done right away, sometimes it can take up to a few days. I would suggest waiting a while and checking again. Maybe even try from a computer on a different network (or a mobile network).
I also recommend submitting your file to Google (if you haven't already) for faster crawling. Check out this link - https://support.google.com/webmasters/answer/6078399?hl=en

The problem turned out to be the Cloudflare cache. Flushing it solved the problem.
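For anyone debugging a similar mismatch, a quick way to see whether a CDN or proxy is serving a stale copy is to fetch robots.txt directly and inspect the caching headers. This is only a minimal sketch; the URL is a placeholder, and the CF-Cache-Status header is specific to Cloudflare:

    import urllib.request

    # Placeholder URL - substitute your own domain.
    URL = "https://www.example.com/robots.txt"

    req = urllib.request.Request(URL, headers={"Cache-Control": "no-cache"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        # CF-Cache-Status: HIT means Cloudflare served a cached copy;
        # Age and Last-Modified hint at how old that copy is.
        for header in ("CF-Cache-Status", "Age", "Last-Modified", "Cache-Control"):
            print(header, "=", resp.headers.get(header))

    print("--- last 5 lines as served ---")
    print("\n".join(body.splitlines()[-5:]))

If the copy being served is stale while the file on the server is current, the cache (Cloudflare in this case) is the place to look.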


Can’t submit file for Facebook App Approval

I'll set my outrage with the way this process works (to whom can I speak?) aside for the moment: we are attempting to provide FB with a link to our ~200 MB app for approval. We have been rejected 3 times because they are incapable of extracting our zip file (they request a zip for some unknown reason; it has minimal size impact).
Some detail: we are linking to the zip on our Dropbox. We have removed all punctuation from our app title (Pandamonium!.app becomes Pandamonium.app). We have eliminated spaces from our source folder. I thought all these could be causing a problem with iOS-sim.
I'm not sure what is left to do, but I am hoping someone can present a clear set of instructions (NOT THEIR INSTRUCTIONS, WHICH I HAVE READ) that they have followed, particularly if you have hit similar snags, or ANY ideas for a resolution. All they send me is useless screenshots of their simulator failing to open the app, which I have simulated and opened successfully every day with iOS-sim for the last week.
After a great deal of trial and error I found that using Facebook's command-line instructions was what was causing the issue. You should just compress your .app file in an ordinary fashion (right click and compress - I used a Windows computer just to make sure everything was copacetic after reading about bizarre Mac .cpgz compression issues).
Regardless, in summary, I can now see why no one else has had an issue with this: no one reads their instructions and instead just creates their .zip files in the ordinary way; unsurprisingly, you're better off using your common sense rather than listening to others.
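For anyone who would rather script that "ordinary" compression step than right-click, here is a minimal sketch. The bundle name is taken from the post above; the script just walks the .app folder and zips everything under its own top-level directory, the same way a plain right-click compress does:

    import os
    import zipfile

    APP_DIR = "Pandamonium.app"   # bundle name from the post; adjust as needed
    ZIP_PATH = "Pandamonium.zip"

    def zip_app_bundle(app_dir, zip_path):
        """Zip the .app directory the plain way: every file, stored under
        the bundle's own top-level folder, no special flags or tools."""
        app_dir = os.path.abspath(app_dir)
        parent = os.path.dirname(app_dir)
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for root, _dirs, files in os.walk(app_dir):
                for name in files:
                    full = os.path.join(root, name)
                    # Keep "Pandamonium.app/..." as the path inside the archive.
                    zf.write(full, os.path.relpath(full, parent))

    zip_app_bundle(APP_DIR, ZIP_PATH)

Note this follows symlinks rather than preserving them, much like a Windows right-click zip, which was apparently fine for the reviewers in this case.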
Aside: ironically, after being told my use case was fine and the only issue was not being able to unzip, Facebook (India) has now told me they couldn't find my login button (which is gigantic, in multiple places, and clearly described in my instructions). This process is an absolute joke. I wish anyone going through this hell good luck.

51 Degrees Mobi keeps doing unwanted redirects. How to stop?

I just recently installed the 51 Degrees Mobi NuGet packages in Visual Studio 2015. I have come across an annoying feature that I am not able to figure out how to turn off: when I browse to my site with a mobile device, it automatically redirects the device to /Mobile/Default.aspx.
I don't want this. I want the device to go to the requested URL. Per the documentation found here, it says:
This element determines how mobile devices should be redirected. If it is omitted, redirection will be disabled.
This is simply not working as described. I have removed that element from the 51Degrees.config file, along with its sectionGroup, yet the unwanted redirection continues. How do I actually disable this annoying feature?
Thanks
OK, I'm a little less frustrated now and can think a bit more clearly. I read in another post about a cookie being set with a redirect value. After I cleared the cookies on the clients, the problem went away.

Webmaster Tools and Google differ on "robots.txt"

I'm getting a tad tired of Webmaster Tools and its inability to agree with itself.
I'm seeing a warning that there are 'Severe health issues' with my domain.
Checking this, Google seems CONVINCED that my robots.txt file is blocking my sitemap, even though Webmaster Tools itself says that my robots.txt file is AOK (and it is).
Looking at my sitemap I'm getting an error: "URL restricted by robots.txt".
Ok then.
I've tried deleting the sitemap from Google and re-adding it, with no luck. I've tried doing the same with the robots.txt file, no luck. It's convinced that my robots.txt file is blocking the sitemap.
The funniest thing is that the www. version of the same domain reports that everything is fine. It's the same website.
So is Webmaster Tools just fundamentally broken, and I should ignore it, or is it flagging something I should be delving deeper into? Is the domain likely to be punished if Google itself thinks it can't see the sitemap (even though it can)?
Domains live and die by Google's hands, so I feel a tad frustrated that it's so inconsistent.
Edited to add:
I've just tried an experiment: completely removing my perfectly functional Allow: / robots.txt file seemed to get it to work. The file's cache life was an hour, and Google has checked it several times over the past few days. Definitely bugged.
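If you want to sanity-check locally whether a given robots.txt really blocks a sitemap URL, independent of what Webmaster Tools claims, Python's urllib.robotparser can parse the live file. A minimal sketch with placeholder URLs:

    from urllib.robotparser import RobotFileParser

    # Placeholder URLs - substitute your own domain and sitemap path.
    ROBOTS_URL = "https://www.example.com/robots.txt"
    SITEMAP_URL = "https://www.example.com/sitemap.xml"

    rp = RobotFileParser()
    rp.set_url(ROBOTS_URL)
    rp.read()  # fetches and parses the live robots.txt

    # Googlebot is the user agent that matters for the Search Console warning.
    print("Googlebot may fetch the sitemap:", rp.can_fetch("Googlebot", SITEMAP_URL))

If this prints True while Webmaster Tools still reports "URL restricted by robots.txt", the problem is on Google's side (a stale cached robots.txt or the bug described above), not in your file.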

Google Webmaster Tools content keywords come from robots.txt

A strange thing happened today. When I visited Google Webmaster Tools I found that there are only three content keywords (instead of the many that actually appear on my website), and they are:
1. agent
2. disallow
3. user
And when I click on any one of the words, I get "found on the most popular page /robots.txt". I haven't done anything to my website or Webmaster Tools for a couple of days now, so I have no idea what could cause this.
Has anyone had the same problem and found a solution? What can I do in a situation like this?
This sounds like a temporary bug in Webmaster Tools. Wait a few days (or a week) and check again.
It is usually safe to ignore what Google Webmaster Tools says about content keywords; those three words are just the directives that appear in a robots.txt file (User-agent, Disallow), which is why the report says they were found on /robots.txt.
Share your robots.txt file (I assume there is nothing wrong there).

Joomla 3.0 SEF URLs sending to random wrong articles

My site eighttwentydesign is running Joomla 3.0. I have SEF URLs on, and have done for some time without issue. But today when you go to the site and click on anything, say Portfolio, you get the home page under the portfolio's URL; if you add a trailing slash at the end, the right article (Portfolio) shows. Additionally, if you click on, say, "Web Design", it sends you to the Portfolio page. I might add this menu is a menu within Joomla - I am not adding internal links manually.
Doesn't work: http://www.eighttwentydesign.com/portfolio
Does work: http://www.eighttwentydesign.com/portfolio/
I have checked the .htaccess, and even reverted it to the original, with no luck. I have checked Global Config but I can't see anything which may cause this. It was working nicely yesterday. I haven't touched any PHP source or anything else in the past few weeks; the only notable thing I did yesterday was enable the cache - have others experienced problems after doing this? I have since disabled it under Global Config, to no avail.
Exact Joomla Version is 3.0.2 with very few plugins
I do have daily backups, but I would rather find a solution and figure out how to prevent this from happening again, rather than just putting on a band-aid.
I've searched for a good couple of hours, and aside from just not being able to fix it, it appears no one else is experiencing this, so I am starting to think it may be a bug.
Just as I was about to post this I discovered my solution.
If your SEF URLs are displaying the wrong content, you can solve it by disabling the cache plugin. You can do this with the following steps (a quick way to confirm the fix afterwards is sketched at the end of this answer):
Login to Joomla backend
Navigate to Extensions > Plugins
Go to "System Cache"
Disable system cache
I hope this helps someone in the future as I really struggled to find any answers on this.
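As a quick way to confirm the fix, you can fetch both URL forms from the question and check that each now returns its own page rather than a cached copy of the home page. A rough sketch, using the URLs posted above and comparing the page titles:

    import urllib.request

    # URLs taken from the question above.
    URLS = [
        "http://www.eighttwentydesign.com/portfolio",
        "http://www.eighttwentydesign.com/portfolio/",
    ]

    for url in URLS:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
            # The <title> tag is a cheap way to tell the portfolio page
            # from a cached copy of the home page.
            start = html.find("<title>")
            end = html.find("</title>")
            title = html[start + 7:end] if start != -1 and end != -1 else "(no title found)"
            print(url, "->", title.strip())

If both forms print the expected page title after the cache plugin is disabled, the stale-cache behaviour is gone.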