Google Search impressions stopped updating

Search impressions grew from 0 to 1,350 within a week, then suddenly stopped. I have also requested indexing for the main website URL, but it has never been crawled by Googlebot. What could be the issue? I have added three properties: http://site-url.com, https://site-url.com, and https://www.site-url.com, but none of them is updating search impressions.
Both http://site-url.com and https://site-url.com have been indexed by Google's crawler. However, https://www.site-url.com has not been indexed, even though I requested indexing a month ago and several more times since.
All sitemaps have been added and are accessible in Search Console.
What could be the cause of the issue?
I ran robots.txt tests and none of the crawlers is blocked; see the image below.
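To double-check the robots.txt result programmatically, Python's standard-library parser can evaluate the same rules the Search Console tester does. The robots.txt content below is a placeholder with an empty Disallow rule, not the poster's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with the live file served at
# https://www.site-url.com/robots.txt
robots_txt = """User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow rule blocks nothing, so Googlebot may fetch any path
print(parser.can_fetch("Googlebot", "https://www.site-url.com/"))  # True
```

If this prints False for any Googlebot path, the blocking rule (not indexing delays) is the first thing to fix.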

Related

GA landing pages (not set), but URI is known AND organic traffic down, direct/none up?

I need help troubleshooting two main issues with our Google Analytics data. Both started occurring around May 5, 2020. I've worked through the troubleshooting recommendations in a few blog posts but have had no luck. Can anyone point me in the right direction?
Organic traffic has dropped considerably on our /blog/ pages while direct/none traffic has increased. When I check Google Search Console's organic search data, I see numbers that reflect the combined organic + direct/none traffic in Google Analytics.
When I look at the landing-page report, there is a huge increase in (not set) landing pages on our blog: I saw 14,000% and 26,000% increases, while our overall landing-page traffic is down by 15%. Weirdly, the URI is known, but the landing page is (not set)...?
Please check out this video to see the data in GA - http://m.bixel1.net/jxe9ei
One potential cause is that we have a homepage redirect for anyone using Chrome. The redirect goes from / to /c/ and is hard-coded. We've been testing this since the beginning of the year, and on March 26, 2020 we switched the test to serve 100% of Chrome visitors. Could this be causing our traffic issues?
The redirect certainly creates anomalies (consider that when landing on the page, both / and /c/ are tracked).
I noticed, for example when accessing a blog page from Google, that the pageview has no referrer, while the events sent 30 seconds later do have one (google.com).
Check your Google Tag Manager configuration for any unusual setting on the referral field, or anything else that could interfere with the referrer.
In any case, this missing referrer is almost certainly why you are seeing (not set).

Different Google Index Information

About three months ago, I launched my own website. On the first day, I verified the website in Google Webmaster Tools, linked it to my Google Analytics account, and submitted a sitemap index file pointing to five sitemap files.
But I am still getting inconsistent Google index status information:
In Webmaster Tools:
Menu: Crawl -> Sitemaps: 123,861 URLs submitted, 64,313 URLs indexed
Menu: Google Index -> Index Status: 65,375 URLs indexed
When I type “site:www.mysite.de” into google.de, I get about 103,000 results.
When I check my website with push2check.net, it reports 110,000 URLs in Google's index.
What is wrong there? I understand that it's impossible for Google to deliver exact numbers because of its distributed processing, and that the result also depends on the location you are searching from, and so on. But the gap between 65,000 and 110,000 is huge. What's the reason?
Thanks in advance!
Toby
Searching google.de for “site:www.mysite.de” shows an estimate of the pages Google displays as indexed for your site. push2check.net, on the other hand, reports an estimate based on all Google results that contain links to your website. The two tools measure different things, and both figures are rough estimates, so the results differ.
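For a baseline figure independent of either tool, you can count the URLs you actually submitted by parsing your own sitemap files. A minimal sketch with the standard library; the sample XML is illustrative, not the poster's real sitemap:

```python
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace on every element
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_urls(sitemap_xml: str) -> int:
    """Count the <loc> entries in a single sitemap file."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url/sm:loc", NS))

# Illustrative sample; in practice, read each of your five sitemap files
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.mysite.de/</loc></url>
  <url><loc>https://www.mysite.de/page-1</loc></url>
</urlset>"""

print(count_urls(sample))  # 2
```

Summing this count across all sitemap files should reproduce the "URLs submitted" figure; any gap between that and the indexed counts is then clearly on Google's side.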

Google Doubleclick blank ads in homepage

I am the owner of the website legiaodosherois.com.br, and I am having issues after starting my first paid campaign with Google DoubleClick for Publishers. All my ad blocks are set up with AdSense fallback, and everything worked well until two days ago, when all ad blocks started coming up empty on the homepage only, while running fine on every other page. I tried the debug console by appending ?google_force_console=1 to my homepage URL, but merely adding this query string makes the ads show up. So, to reproduce the error, you must first access http://www.legiaodosherois.com.br/?google_force_console=1 and then access http://www.legiaodosherois.com.br/
I couldn't see any problems in the debug data, and Google says the ad units are tagged correctly. What can I do?
UPDATE: I have just found that some specific pages, such as http://www.legiaodosherois.com.br/2013/vazam-videos-com-as-primeiras-cenas-de-guardioes-da-galaxia.html, aren't displaying any ads either!
Thanks in advance and sorry for my bad English,
Vinicius
It happened because some of my posts received a false-positive flag from the AdSense bot, which prevented those pages from showing ads. Since those posts were also displayed on the homepage, the homepage was barred from showing ads as well. After cleaning them up, everything started working fine!

Octopress, github pages, CNAME domain and google website search

My blog was successfully transferred to Octopress and GitHub Pages. My problem, though, is that the website's search uses Google search, and the search results, as you can see, still point to the old (WordPress) links. These links have since changed structure, following the default Octopress URL structure.
I don't understand why this is happening. Is it possible that Google has stored the old links in its database (my blog was on the first page for some searches, but gathered just 3,000 hits/month, not much by internet standards) and this will change with time, or is it something I can change myself?
Thanks.
1. You can wait for Google to crawl and re-index your pages, or you can use the URL Removal Request tool to expedite removal of the old pages from the index:
http://www.google.com/support/webmasters/bin/answer.py?answer=61062
According to that page, the removal process "usually takes 3-5 business days."
Also consider submitting a sitemap:
http://www.google.com/support/webmasters/bin/answer.py?answer=40318
More information about sitemaps:
http://www.google.com/support/webmasters/bin/answer.py?answer=34575
http://www.google.com/support/webmasters/bin/topic.py?topic=8467
http://www.google.com/support/webmasters/bin/topic.py?topic=8477
https://www.google.com/webmasters/tools/docs/en/protocol.html
2. Perhaps your company might consider the Google Mini? You could set up the Mini to crawl the site every night or even 'continuously':
http://www.google.com/enterprise/mini/
According to the US pricing page, the Mini currently starts at $1,995 for a 50,000-document license with a year of support.
Here is the Google Mini discussion group:
http://groups.google.com/group/Google-Mini
http://www.google.com/enterprise/hosted_vs_appliance.html (click "show all descriptions")
http://www.google.com/support/mini/ (Google Mini detailed FAQ)
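A sitemap for the new Octopress URL structure can be generated with a few lines of standard-library Python. This is only a sketch; the URLs are placeholders for whatever pages the Octopress build actually produces:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
    return tostring(urlset, encoding="unicode")

# Placeholder URLs -- substitute the real Octopress permalinks
print(build_sitemap([
    "https://example.com/",
    "https://example.com/blog/2013/10/my-post/",
]))
```

Submitting such a sitemap in Webmaster Tools points Google at the new URL structure so the stale WordPress links get replaced faster.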

How to tell google that a specific page of my website disappeared and won't come back?

I have a website where 50% of the pages have a limited lifetime.
To give an idea, about 4,000 pages appear each week and the same number disappears.
By "appearing" and "disappearing", I mean that the appearing pages are completely new, and the disappearing pages are removed from the website forever. There is no "this new page replaces this old page".
I naively returned a 410 code on every URL whose page had disappeared.
That is, the URL http://mywebsite/this-page-was-present-until-yesterday.php returned 200 OK until yesterday, and now returns 410 Gone.
I didn't use a redirect, because I want to tell the user that the URL he accessed isn't wrong, but that it has expired.
The problem is: Google won't acknowledge this information. It keeps crawling the pages, and Webmaster Tools alerts me as if the pages were broken 404s. This significantly affects my "reputation".
Did I do something wrong? How should I proceed?
It's always a very good idea to make your own error page; this can save you a lot of visits through broken links. You can configure one with the ErrorDocument directive in .htaccess.
Google's Webmaster Tools also lets you remove specific pages from the index; you can find this under "Crawler access".
You could also try adding a noindex header.
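Combining the three pieces above -- the 410 status, a custom error body, and a noindex header -- can be sketched with Python's standard library. The EXPIRED set and the page body are hypothetical stand-ins for the site's real expiry logic:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical set of paths whose pages are gone forever
EXPIRED = {"/this-page-was-present-until-yesterday.php"}

class GoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in EXPIRED:
            # Custom error body so visitors see "expired", not "broken"
            body = b"<h1>Gone</h1><p>This page has expired and will not return.</p>"
            self.send_response(410)
            self.send_header("Content-Type", "text/html")
            # Ask search engines to drop the URL from their index
            self.send_header("X-Robots-Tag", "noindex")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(200)
            self.send_header("Content-Length", "0")
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for this sketch
        pass
```

The same idea applies to whatever server actually runs the site: serve 410 with a friendly body, and add the noindex hint in a header rather than in markup, since a gone page has no markup to crawl.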