Google Sitemap: Errors - google-search-console

I have just submitted my sitemap to Google and it is reporting the errors below.
The errors are:
Rating is missing required best and/or worst values
Value in property "reviewCount" must be positive
Am I missing something in my sitemap.xml file?
How do I fix these errors?
If I don't fix them, will it affect my website's ranking?
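These two errors typically come from the structured data (rich snippets) on the pages themselves rather than from sitemap.xml. As a minimal sketch of markup that satisfies both rules (the product name and all values here are placeholders, not taken from the site): a JSON-LD rating block needs explicit bestRating/worstRating values and a reviewCount greater than zero.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.2",
    "bestRating": "5",
    "worstRating": "1",
    "reviewCount": "17"
  }
}
</script>
```

If the site uses microdata or RDFa markup instead, the same two requirements apply there as well.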

Related

Can anyone help? Why does Google PageSpeed say "Client site redirect", and why does my domain have a Google-selected canonical of "http://ww82.gooogleapi.com"?

Search Console Error
Google Page Speed Error
My domain www.tanviralamhira.com is not being indexed, and I cannot even run a Google PageSpeed check; the problems are shown in the screenshot. Can anybody help me figure out how to solve this?
I have tried several times to check and request indexing, but it failed.
Had the same issue!
The problem was in the string
script src='https://ajax.gooogleapi.com/ajax/libs/jquery/1.7.35/jquery.min.js
Remove this string, or change the link to point at the real jQuery library, and check whether the redirect disappears. Interestingly, the link looks legitimate and so doesn't raise any suspicion at all; nevertheless, it is the culprit.
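For comparison, a legitimate jQuery include from Google's CDN would look like the line below (note the domain is googleapis.com, not gooogleapi.com; version 1.7.2 is just an example of a real 1.7.x release):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
```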
Good luck!

TYPO3 News Extension Pagination doesn't work

I use the news extension in TYPO3 and so far it's almost fine.
The only problem I have is the pagination. If I click e.g. on pagination page 2, the link gets an extra string like &tx_news_pi1[%40widget_0][currentPage]=2, but the output is always the same as on the first page. Do I have to change something in the settings to get this working, or does anyone have an idea where the mistake could be?
I searched GitHub and Google but couldn't find any helpful information on this problem, which is kind of unbelievable...
Another thing: I would like to add AJAX pagination once the basic version is working. I saw the topic in the TYPO3 docs (https://docs.typo3.org/typo3cms/extensions/news/AdministratorManual/Templates/AjaxBasedPagination/Index.html) and followed the instructions: I installed typoscript_rendering, inserted the list plugin as a content element, changed the template path, and included the JS file. But when I set the template path to indexAjax.html, I get the following error:
Undeclared arguments passed to ViewHelper GeorgRinger\News\ViewHelpers\Widget\Ajax\PaginateAdditionalParamsViewHelper: page
When I look in AjaxPagination.js (from the link above), the variable ajaxUrl is always undefined; I think it tries to grab the data whose rendering causes the error message above. Hopefully somebody can point me to the right solution.

Googlebot guesses URLs: how to avoid/handle this crawling

Googlebot is crawling our site. Based on our URL structure, it is guessing new possible URLs.
Our structure is of the kind /x/y/z/param1.value. Googlebot exchanges the values of x, y, z, and value with tons of different keywords.
The problem is that each call triggers a very expensive operation, and it returns positive results only in very rare cases.
I tried to set a URL parameter in the crawling section of Webmaster Tools (param1. -> no crawling), but this does not seem to work, probably because of our inline URL format (would it be better to use the HTML GET format, ?param1=...?).
Since Disallow: */param1.* does not seem to be an allowed robots.txt entry, is there another way to disallow Google from crawling these pages?
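For what it's worth, Googlebot does support * and $ wildcards in robots.txt as a Google extension to the original standard, so a pattern along these lines may be worth verifying in the Search Console robots.txt tester (the path prefix here is an assumption; adapt it to the real URL structure):

```
User-agent: Googlebot
Disallow: /*param1.
```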
As another solution, I thought of detecting Googlebot and returning it a special page.
But I have heard that Google penalizes this (cloaking).
Currently we always return an HTTP 200 status code and a human-readable page which says: "No targets for your filter criteria found". Would it help to return another status code?
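Returning 404 instead of 200 for empty result sets is one option worth considering here; as a minimal sketch of the status-code logic (the helper name and messages are illustrative, not the site's actual code):

```python
def filter_response(results):
    """Build an (http_status, body) pair for a filter URL.

    A 404 for an empty result set tells crawlers the URL is a dead
    end, whereas a 200 "no targets found" page keeps it indexable.
    """
    if not results:
        return 404, "No targets for your filter criteria found"
    return 200, "\n".join(results)
```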
Note: this is probably not a general answer!
Joachim was right; it turned out that Googlebot is not guessing URLs.
Doing a bit of research, I found out that half a year ago I added a new DIV to my site containing those special URLs (which I had unfortunately forgotten). A week ago Googlebot started crawling it.
My solution: I deleted the DIV, and I also return a 404 status code for those URLs. I think that, sooner or later, Googlebot will stop crawling the URLs after revisiting my site.
Thanks for the help!

Integration of facebook opengraph to wordpress?

First let me explain my predicament.
I am currently developing a WordPress site which uses Facebook apps for my post likes/comments, and one feature of the site is a tab/page where the posts are sorted by number of comments and Facebook likes.
I have used multiple resources to try to achieve this, but in the end I still haven't gotten the feature I wanted. I used http://www.flexostudio.com/flexo-facebook-manager.html and http://www.metronet.no/sorting-posts-by-facebook-likes-in-wordpress/ as my references.
Here is the problem I am facing: I have already implemented the tutorials I saw, but it still doesn't work, so I tried to trace the root of the problem. I'm not sure if this is the cause, but one issue I found is that when I use FB's debug tool, I get these warnings:
Open Graph Warnings That Should Be Fixed
Inferred Property: The 'og:url' property should be explicitly provided, even if a value can be inferred from other tags. Inferred Property: The 'og:title' property should be explicitly provided, even if a value can be inferred from other tags.
I'm not sure if the problem I'm facing is due to a misconfiguration of the Facebook app or something else. Has someone experienced the same problem? Or does anyone have any other resources that could help me achieve post sorting based on FB likes and comments?
Any help would be greatly appreciated.
Regards,
RJ
Why not just use an already-built plugin for this, e.g. http://wordpress.org/extend/plugins/wordpress-connect/?
If you insist on rolling your own, see how it is done there: http://plugins.svn.wordpress.org/wordpress-connect/tags/2.0.3/src/WordpressConnect.php
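As for the two "Inferred Property" warnings from the debug tool, they usually clear once og:url and og:title are declared explicitly in the page head; a sketch with placeholder values (adapt the URL and title per post, e.g. via the wp_head hook):

```html
<meta property="og:url" content="https://example.com/my-post/" />
<meta property="og:title" content="My Post Title" />
```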

Facebook like button not working: "incorrect meta data", but the data is correct

I'm currently using the FB URL linter and can see that everything is there correctly; however, when a user likes the page, key information is missing.
The linter is also stating there is missing content, yet it shows that all the tags are present as well. Is there anything wrong with my code?
This is the result from the Facebook URL linter:
http://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fdev.murchison-hume.com%2FProducts%2FSuperlative-Liquid-Hand-Soap%2FCoriander-Superlative-Liquid-Hand-Soap-Refill
It has conflicting messages stating that things don't exist, when they obviously do, as they are outlined right below the error messages...
Any help greatly appreciated.
Recheck your og:type contents. The first error is telling you that you don't own the object type. For custom objects, you usually need to pass in
<meta property="og:type" content="MY_APP_NAMESPACE:product"/>
If that fails: og:type errors usually come up when you fail to set up your objects correctly in your Open Graph settings (at developers.facebook.com).
I suspect that fixing the first error will make the others disappear.