Different Google Index Information - google-search-console

About three months ago, I launched my own website. On the first day, I verified the site in Google Webmaster Tools, linked it to my Google Analytics account, and submitted a sitemap index file pointing to five sitemap files.
But to this day, I get conflicting Google index status figures:
In Webmaster Tools:
Menu: Crawl -> Sitemaps: 123,861 URLs submitted, 64,313 URLs indexed
Menu: Google Index -> Index Status: 65,375 URLs indexed
When I search google.de for "site:www.mysite.de", I get about 103,000 results.
When I check my website with push2check.net, it reports 110,000 URLs in Google's index.
What is wrong here? I understand that it's impossible for Google to deliver an exact figure because of its distributed processing, and that the result also depends on where you are searching from, and so on. But the gap between 65,000 and 110,000 is huge. What's the reason?
Thanks in advance!
Toby

When you search google.de for "site:www.mysite.de", Google lists only the pages of your own site that it has indexed.
push2check.net, on the other hand, reports every result in which a link to your website appears.
The two tools count different things, which is why the two figures differ.

Related

English Literature site is not indexing new pages in Google

This is my site link:
www.englishact.com
This is the current status of the sitemap:
Google is showing no errors in the sitemap or on any other pages, but the indexed page count has been 0 for about three months. I have also uploaded new sitemaps, which behave the same way: nothing gets indexed.
NB:
I am using a paid 1and1 hosting package, and Google has accepted AdSense for this site. What can I do now? Any suggestions?
Your website is indexed on Google; I just searched for site:www.englishact.com and got many results.
Check whether the links in your XML sitemap are valid or redirect to another URL; a quick way to test this is sketched below.
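A minimal sketch of such a check in Python (the sitemap location is an assumed placeholder, not the poster's confirmed setup): fetch the sitemap, then request each URL without following redirects, so 301/302 responses become visible.

    import urllib.request
    import urllib.error
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "http://www.englishact.com/sitemap.xml"  # assumed location
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # make redirects raise instead of being followed

    opener = urllib.request.build_opener(NoRedirect)

    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)

    for loc in tree.findall(".//sm:loc", NS):
        url = loc.text.strip()
        try:
            status = opener.open(url).status  # 200 is what you want to see
        except urllib.error.HTTPError as e:
            status = e.code  # 301/302 here means the sitemap entry redirects
        print(status, url)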
You also have to resolve the duplication in your URLs: the site can be reached both with and without www, and the homepage has two URLs, http://englishact.com/ and http://www.englishact.com/index.php.
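The fix itself belongs in the server configuration (a 301 redirect), but the decision the server has to make is simple. Here is a sketch of that logic, using the hostnames from this answer; the helper function is hypothetical, not the site's actual code.

    from urllib.parse import urlsplit, urlunsplit

    CANONICAL_HOST = "www.englishact.com"

    def canonical_url(url):
        """Map every duplicate form of a URL to its single canonical form."""
        parts = urlsplit(url)
        host = CANONICAL_HOST if parts.netloc == "englishact.com" else parts.netloc
        path = "/" if parts.path in ("", "/index.php") else parts.path
        return urlunsplit((parts.scheme, host, path, parts.query, parts.fragment))

    # The server should answer any non-canonical request with a 301 to this URL.
    assert canonical_url("http://englishact.com/") == "http://www.englishact.com/"
    assert canonical_url("http://www.englishact.com/index.php") == "http://www.englishact.com/"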
After fixing these errors, your website should be healthy and Google will understand its structure.

How to track direct URL referrer

Most hosts come with analytics software, or you can use Google Analytics, which tells you how a person got to your site, for example via a link on yelp.com or a facebook.com page.
But it is impossible for the software to know what marketing method brought a person to your site if they typed the URL directly (analytics software reports these visits as "direct").
I need a creative idea for refining this broad "direct" category.
One way would be to use flyers with QR codes linking to the website. Instead of pointing at the website itself, I would direct users to clickemart.ca/flyer1_referrer, which in turn redirects to clickemart.ca; a different flyer, distributed in a different location, would carry a QR code pointing to clickemart.ca/flyer2_referrer.
So my question is: is this possible? Would I be able to figure out which flyer, 1 or 2, was more effective based on visits to the redirect URLs? If so, can you give me a brief idea of how to implement it?
I know a lot of you will suggest adding a "source of referral" form field on the site, but in my experience that field is never filled in, or is filled with useless values (typically whatever is closest to the mouse pointer, the topmost option, or "other").
Any help or guidance is really appreciated!
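For reference, the redirect-and-count idea from the question can be sketched in a few lines. This is a hypothetical Flask app using the paths from the question; it is not what the poster ultimately used (see the accepted approach below).

    from flask import Flask, redirect

    app = Flask(__name__)
    visits = {"flyer1": 0, "flyer2": 0}  # in-memory tally; use a database in practice

    @app.route("/<flyer>_referrer")
    def flyer_referrer(flyer):
        if flyer in visits:
            visits[flyer] += 1  # each QR scan lands here before the real site
        return redirect("http://clickemart.ca/", code=302)

    if __name__ == "__main__":
        app.run()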
Solved it; here is how:
1. Connect your site to Google Analytics (www.google.ca/analytics)
2. Create a campaign URL (support.google.com/analytics/answer/1033867?hl=en)
3. OPTIONAL: shorten the URL (www.bitly.com)
4. Create a QR code pointing to the URL (www.qrstuff.com/)
5. Scan the QR code and watch it appear as a campaign referral in Google Analytics
Here is an explanation of the different steps:
Once you create an account on Google Analytics, you can connect the site by copying a PHP or JavaScript snippet onto every page; in my case (Magento), the tracking code can simply be added through the configuration settings.
The campaign URL adds information to your website's URL as query parameters, much as variables like noredirect="..." in a stackoverflow.com link carry information for the server to process; the campaign parameters therefore act as a tag identifying the source of the referral (a sketch of such a URL follows these steps).
Shortening the URL is recommended because the QR code becomes less dense, which reduces the chance of errors while scanning; the shortened URL resolves directly to the campaign URL you provided, so the process is seamless.
QRStuff is a good place to download the QR code image at a high resolution
So when you scan, this is what happens:
CODE SCANNED >> PHONE OPENS SHORTENED URL >> REDIRECT TO CAMPAIGN URL >> GOOGLE ANALYTICS RECORDS THE CAMPAIGN REFERRAL >> YOU CAN SEE IT BY LOGGING INTO GOOGLE ANALYTICS
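To make step 2 concrete, here is a sketch of the kind of URL the campaign builder produces. The utm_* parameter names are the standard Google Analytics campaign parameters; the values are made up to match the flyer example.

    from urllib.parse import urlencode

    base = "http://clickemart.ca/"
    params = {
        "utm_source": "flyer1",          # which flyer the visitor scanned
        "utm_medium": "qr_code",         # the marketing medium
        "utm_campaign": "spring_flyers", # made-up campaign name
    }
    campaign_url = base + "?" + urlencode(params)
    print(campaign_url)
    # http://clickemart.ca/?utm_source=flyer1&utm_medium=qr_code&utm_campaign=spring_flyers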

Octopress, github pages, CNAME domain and google website search

My blog was successfully transferred to Octopress and GitHub Pages. My problem, though, is that the website's search uses Google search, and the results of a search, as you can see, point to the old (WordPress) links. These links have since changed structure, following the default Octopress permalink structure.
I don't understand why this is happening. Is it possible that Google still has the old links stored in its database (my blog was on the first page for some searches, but gathered just 3,000 hits/month, not much by internet standards) and that this will change with time, or is it something I can change myself?
Thanks.
1. You can wait for Google to crawl and re-index your pages, or you can use the URL Removal Request tool to expedite removal of old pages from the index.
http://www.google.com/support/webmasters/bin/answer.py?answer=61062
According to that page, the removal process "usually takes 3-5 business days."
Consider submitting a Sitemap, and resubmit it in Webmaster Tools after the migration:
http://www.google.com/support/webmasters/bin/answer.py?answer=40318
More information about Sitemaps (a sketch of pinging Google with an updated sitemap follows these links):
http://www.google.com/support/webmasters/bin/answer.py?answer=34575
http://www.google.com/support/webmasters/bin/topic.py?topic=8467
http://www.google.com/support/webmasters/bin/topic.py?topic=8477
https://www.google.com/webmasters/tools/docs/en/protocol.html
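Once the sitemap is in place, you can also ping Google directly whenever it changes, using the sitemap ping endpoint documented in the pages above. A minimal sketch; the sitemap URL is a placeholder for your own.

    import urllib.parse
    import urllib.request

    sitemap = "http://example.com/sitemap.xml"  # placeholder: your sitemap URL
    ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")
    with urllib.request.urlopen(ping) as resp:
        print(resp.status)  # 200 means the ping was received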
2. Perhaps your company might consider the Google Mini? You could set up the Mini to crawl the site every night or even 'continuously'.
http://www.google.com/enterprise/mini/
According to the US pricing page, the Mini currently starts at $1995 for a 50,000-document license with a year of support.
Here is the Google Mini discussion group:
http://groups.google.com/group/Google-Mini
http://www.google.com/enterprise/hosted_vs_appliance.html
(Click: "show all descriptions")
http://www.google.com/support/mini/
(Google Mini detailed FAQ)

Rich Snippet not showing in Google Search result page

About a month ago we implemented Rich Snippets on the product detail pages for our e-commerce site (example).
We used the http://schema.org/ syntax for the structured data, as it seems to be the route Google is taking going forward.
The data appears to be correct in the Rich Snippet Testing Tool and the data has started to appear in Google Webmaster Tools.
However, the data has yet to appear on the SERPs.
We have followed Google's rich data guide to the letter and still see no results. Is this just a case of waiting?
Here is an additional piece of information that makes it all the more puzzling: we initially went with a Microformats implementation, and within 24 hours the data started showing up on the SERPs. However, we moved away from that because the schema.org approach seemed a better bet.
I suppose it is one of the reasons explained in my Wiki post at
http://wiki.goodrelations-vocabulary.org/FFAQ#Why_is_Google_not_showing_rich_snippets_for_my_pages.3F
While that one refers to GoodRelations markup, the situation should be the same for schema.org.
Martin
Quote:
If you have added GoodRelations (manually or via a shop extension module) to your shop and still do not get rich snippets in Google search results, this can have one of the following reasons:
- Google has not yet re-crawled your page or pages. Google dedicates just a limited amount of crawling time to a site, depending on its global relevance. It may be that Google has simply not yet re-indexed your page. Wait 2 - 8 weeks ;-)
- The markup is invalid. Try the Google Validator. If that shows a rich snippet in the preview, you may just have to wait 4 - 12 weeks until Google notices and white-lists your pages. If it does not show a rich snippet, you either do not have valid GoodRelations markup in the page, you are missing properties that Google requires (e.g. gr:validThrough for prices), the price of the item has expired, or you use markup for which Google does not show rich snippets. Currently, Google shows snippets only for products and offers.
- Google cannot see that your page changed. Your XML sitemap (http://example.com/sitemap.xml or similar) does not contain a lastmod attribute, or the lastmod attribute was not updated after you added GoodRelations/schema.org. This attribute is important for crawlers to notice which pages need to be reindexed.
- Low ranking of your item pages. Your item pages have a low ranking, and what you see in your Google results are category pages or other pages summarizing multiple items. GoodRelations shop extensions add markup only to the "deep" item pages, because those are best for rich snippets. Use the title / product name of one of your products and restrict the Google search to your site with the additional statement site:www.example.com.
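One point from the quote is easy to act on immediately: the lastmod attribute. Here is a minimal sketch of generating a sitemap entry that carries it; the URL and date are examples only, not from the site in question.

    import xml.etree.ElementTree as ET

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = "http://example.com/products/widget"
    ET.SubElement(url, "lastmod").text = "2012-06-01"  # bump this after changing the markup
    print(ET.tostring(urlset, encoding="unicode"))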

Removing crawling by search engines from my login page

I have a login page (login.aspx) that currently shows up in Google search results.
I have created a robots.txt file with the following:
User-agent: *
Disallow: /login.aspx
My question is: how long will it take before my login.aspx page is no longer indexed by Google? Is there anything else necessary to tell Google not to index my login page?
It could take up to 90 days before the page is removed from Google's database, but realistically a week or two. You could also ask Google to remove that page in Webmaster Tools, but that works on the same timetable as the crawler.
You might also want to log in to Google Webmaster Tools and use the "Remove URL" feature under Site Configuration -> Crawler access, and also increase the crawl rate under Site Configuration -> Settings. This might help accelerate the removal of the URL.
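To double-check that the robots.txt rule from the question actually matches the page, you can test it locally with Python's standard urllib.robotparser; the hostname here is a placeholder. Note that Disallow only stops future crawling; getting an already-indexed URL out of the results is the removal step described above.

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /login.aspx",
    ])

    print(rp.can_fetch("*", "http://www.example.com/login.aspx"))  # False: blocked
    print(rp.can_fetch("*", "http://www.example.com/index.aspx"))  # True: still crawlable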