I'm getting a tad tired of Webmaster Tools and its inability to agree with itself.
I'm seeing a warning that there are 'Severe health issues' with my domain.
Checking this, Google seems CONVINCED that my robots.txt file is blocking my sitemap, even though Webmaster Tools itself says that my robots.txt file is A-OK (and it is).
Looking at my sitemap I'm getting an error: "URL restricted by robots.txt".
Ok then.
I've tried deleting the sitemap from Google and re-adding it, no luck. I've tried doing the same with the robots.txt file, no luck. It's convinced that my robots file is blocking the sitemap.
The funniest thing is that the www. version of the same domain is reporting that everything is fine. It's the same website.
So is Webmaster Tools just fundamentally broken, and I should ignore it, or is it flagging something I should be delving deeper into? Is the domain likely to be punished if Google itself thinks it can't see the sitemap (even though it can)?
Domains live and die by Google's hands, so I feel a tad frustrated that it's so temperamental.
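For what it's worth, a quick local sanity check with Python's standard-library robots.txt parser agrees with me (a sketch; example.com stands in for my actual domain and sitemap URL):

    import urllib.robotparser

    # Parse the live robots.txt the same way a well-behaved crawler would
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder domain
    rp.read()

    # True means nothing in robots.txt blocks Googlebot from fetching the sitemap
    print(rp.can_fetch("Googlebot", "https://example.com/sitemap.xml"))

On my domain this prints True, which matches what the robots.txt tester in Webmaster Tools itself reports.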
Edited to add:
I've just tried an experiment. Completely removing my perfectly functional Allow: / robots.txt file seemed to get it to work. The file's cache lifetime was an hour, and Google has checked it several times over the past few days. Definitely bugged.
Related
I have a strange robots.txt issue.
I had been updating my robots file regularly without problems until a few days ago. Now, when I update the file via FTP it appears to update, but if I view it from a browser I only see the old version. Even Google doesn't pick up the change after several days. From Google's Search Console I see this:
On the left is what I see in my browser and what Google sees. On the right is the real file, as it appears over FTP. The last five lines are different.
If it matters, I use Google's DNS servers. When I change the DNS on my PC's connection to my Internet Service Provider's DNS, I see the new robots file, but Google still does not.
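One way to confirm whether the two resolvers are really handing out different answers (a sketch using the third-party dnspython package; example.com and the second resolver IP are placeholders - substitute your domain and your ISP's resolver):

    import dns.resolver  # pip install dnspython

    # Google's public DNS vs. a placeholder for the ISP resolver
    for server in ("8.8.8.8", "203.0.113.53"):
        res = dns.resolver.Resolver()
        res.nameservers = [server]
        answers = res.resolve("example.com", "A")
        # Different A records would mean the resolvers point at different hosts,
        # e.g. the origin vs. a proxy/CDN edge that may hold a stale cached copy
        print(server, "->", sorted(rdata.to_text() for rdata in answers))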
I had the same issue: my robots.txt wouldn't show correctly to Google for days. It can take a while for Google to crawl your site and re-check your robots.txt - sometimes it's done right away, sometimes it can take a few days. I'd suggest waiting a while and checking again. Maybe even try from a computer on a different network (or a mobile network).
I also recommend submitting your updated file to Google (if you haven't already) for faster crawling. Check out this link: https://support.google.com/webmasters/answer/6078399?hl=en
The problem was with the Cloudflare cache. Flushing it solved the problem.
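For anyone who hits the same thing: a look at the response headers makes a CDN cache easy to spot (a sketch; example.com is a placeholder, and Cloudflare in particular adds a CF-Cache-Status header with values like HIT or MISS):

    import urllib.request

    req = urllib.request.Request("https://example.com/robots.txt",
                                 headers={"Cache-Control": "no-cache"})
    with urllib.request.urlopen(req) as resp:
        # An Age header or CF-Cache-Status: HIT means a cached copy was served,
        # not the file that actually sits on the origin server
        for name in ("Date", "Age", "Cache-Control", "CF-Cache-Status", "Last-Modified"):
            print(name + ":", resp.headers.get(name))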
A strange thing happened today. When I visited Google Webmaster Tools I found that only three content keywords are listed, instead of the many that are actually on my website:
1. agent
2. disallow
3. user
And when I click on any of the words, I get "found on the most popular page /robots.txt". I haven't done anything to my website or Webmaster Tools for a couple of days now, so I have no idea what could cause that.
Has anyone had the same problem and found a solution? What can I do in a situation like this?
This sounds like a temporary bug in Webmaster Tools. Wait a few days (or a week) and check again.
It is usually safe to ignore what Google Webmaster Tools says about content keywords.
Share your robots.txt file (I assume there is nothing wrong there)
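For context, those three keywords are exactly the vocabulary of a robots.txt file itself - even a perfectly healthy one is built from little else (a minimal example; the sitemap URL is a placeholder):

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml

So the keyword list only means Google is momentarily treating /robots.txt as your most prominent page, not that the file itself is broken.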
My site eighttwentydesign is running Joomla 3.0. I have SEF URLs on, and have had for some time without issue. But today, when you go to the site and click on anything, say Portfolio, you get the home page under the portfolio's URL; if you add a trailing slash at the end, the right article (Portfolio) shows. Additionally, if you click on, say, "Web Design", it sends you to the Portfolio page. I might add this menu is a menu within Joomla - I'm not adding internal links manually.
Doesn't work: http://www.eighttwentydesign.com/portfolio
Does work: http://www.eighttwentydesign.com/portfolio/
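A quick way to see the mismatch from a script (a sketch; it just compares the <title> each variant serves, on the assumption that each article has its own title):

    import urllib.request

    def page_title(url):
        # Fetch the page and pull out its <title> as a crude identity check
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
        start = html.find("<title>")
        end = html.find("</title>", start)
        if start == -1 or end == -1:
            return "(no title found)"
        return html[start + len("<title>"):end].strip()

    for url in ("http://www.eighttwentydesign.com/portfolio",
                "http://www.eighttwentydesign.com/portfolio/"):
        print(url, "->", page_title(url))

The slash-less URL printing the home page's title while the slashed one prints the article's is exactly the symptom described above.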
I have checked the .htaccess, and actually reverted it to the original with no luck. I have checked Global Configuration but can't see anything which may cause this. It was working nicely yesterday. I haven't touched any PHP source or anything in the past few weeks; the only notable thing I did yesterday was enabling the cache - have others experienced problems after doing this? I have since disabled it under Global Configuration, to no avail.
The exact Joomla version is 3.0.2, with very few plugins.
I do have daily backups, but I would rather find a solution and work out how to prevent this than just put on a band-aid.
I've searched for a good couple of hours, and aside from not being able to fix it, it appears no one else is experiencing this, so I'm starting to think it may be a bug.
Just as I was about to post this I discovered my solution.
If your SEF URLs are displaying the wrong content, you can solve it by disabling the Cache plugin. You can do this with the following steps:
Login to Joomla backend
Navigate to Extensions > Plugins
Go to "System Cache"
Disable system cache
I hope this helps someone in the future as I really struggled to find any answers on this.
As part of our site deployment process we typically have proofreaders go through content on the site at various stages. For editing that happens before content reaches the site there are plenty of tools for tracking edits, but once the content is up on the site we don't have a good method for this. I was thinking there might be a plugin or program that would let our editors load the site and mark up the content in-page, as they would a tracked-changes Word document.
It seems like the kind of thing that ought to exist, but so far I have not found any good options for Chrome or Firefox. Has anyone made use of a tool like this in the past that they might recommend?
There was a page with the link http://www.xyzabc.com/exampleone. Then I updated the link to http://www.xyzabc.com/example-one, and this link is working fine now. But in the Webmaster Tools crawl errors the old "exampleone" link is still showing a 404 error, even though I clear the error list. I made the update 20 days ago and Webmaster Tools is still showing the error. Can someone help us work out where we went wrong?
Thanks in advance.
Most likely your site hasn't been re-indexed by Google since you made the change. You can use the "Mark as Fixed" feature in Webmaster Tools (Diagnostics/Crawl Errors), which will remove the error from the list (but it will reappear on the next reindexing if the underlying issue remains).
Pretty much the only way to speed up reindexing by Googlebot is the Fetch as Googlebot / Submit to Google Index feature (which is only available if your site is already added and verified in Webmaster Tools).
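If you want to watch what Googlebot will see in the meantime, a status check on the old URL is enough (a sketch; note that a 301 redirect from the old URL to the new one would both clear the error faster and preserve any inbound links, though a plain 404 on a removed page is normal and won't hurt the rest of the site):

    import urllib.error
    import urllib.request

    OLD_URL = "http://www.xyzabc.com/exampleone"  # the removed URL from the question

    try:
        with urllib.request.urlopen(OLD_URL) as resp:
            # A 200 (possibly after following a redirect) means the crawl error
            # should disappear once Google recrawls the URL
            print(OLD_URL, "->", resp.status, "final URL:", resp.geturl())
    except urllib.error.HTTPError as err:
        # A bare 404 will keep reappearing in Crawl Errors until you redirect
        # the old URL or Google eventually drops it
        print(OLD_URL, "->", err.code)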