Google Analytics: cross-domain tracking + 301 redirect

I'm currently working on multiple websites from the same company, each one connected to the others with a list of links in the top header.
Visit tracking is done with Google Analytics and everything seems to be working fine. The problem is that they are now unhappy with all the __utm* parameters that get appended to the end of the URL to obtain cross-domain tracking.
To me the best solution seems to be this:
Each URL which links to another of the sites looks like 'www.somename.com/en', where 'en' is the desired language.
After clicking, the new page opens with a URL like 'www.somename.com/en?__utma=xxxxxxxx&__utmb=...'.
If I remove the language from the links, changing the href to 'www.somename.com', then when the page loads
the site issues a 301 redirect from 'www.somename.com/?__utma=xxxxxxxx&__utmb=...' to 'www.somename.com/en', where 'en' is the default language, which gives the site owner exactly what they want.
Since I don't have access to the Google Analytics account, I would like to ask whether this might be the right solution or whether we may be losing cross-domain tracking.
The __utmz cookie seems to contain the right referrer, but I'm not sure whether that alone is enough to confirm it is working.
I then checked the other parameters here http://helpful.knobs-dials.com/index.php/Utma,_utmb,_utmz_cookies and it seems to me that the other cookie values we get after landing on the new domain don't have to be related in any way to the ones on the previous page (the site with the links).
What else should I check to be sure that everything is still working correctly?
Thanks,

You will lose cross-domain tracking (that is, even if you might salvage the traffic source, the visitor session will be interrupted when changing domains). One of the parameters added by the linker functions is a hash value (__utmk) calculated from the other __utm… parameters. If the hash is missing or does not match the parameters, cross-domain tracking will be broken. You need to transfer the parameters to a JavaScript-enabled page on the other domain so that the GA cookies can be updated - after that you can do 301 redirects at will.
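If the redirect itself is in your hands, one way to act on this is to carry the incoming __utm… parameters through to the language page, so the GA JavaScript on /en can still update its cookies before anything else happens. A minimal sketch, assuming the / to /en redirect is issued from application code, written in ASP.NET-style C# purely for illustration (the question doesn't say what the sites run on):

    using System;
    using System.Web;

    // Illustrative Global.asax-style hook: 301 from "/" to the default-language
    // page while preserving any Google Analytics linker parameters
    // (__utma, __utmb, ...) so the tracking code on /en can still read them.
    public class LanguageRedirect : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            if (Request.Url.AbsolutePath == "/")
            {
                string query = Request.Url.Query;          // includes the leading "?" if present
                Response.RedirectPermanent("/en" + query); // keep the __utm… parameters intact
            }
        }
    }

Only once a page running the GA tracking code has processed those parameters is it safe to strip them with a further redirect.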
If you want to avoid the utm parameters you can:
switch to Universal Analytics - it requires only a single parameter to be sent; however, you can't switch a GA account to Universal Analytics, you would have to start from scratch (UPDATE: this is no longer true, you can and indeed should upgrade existing properties)
try to get into the beta for the Universal Analytics Measurement Protocol (which would even allow for JavaScript-less tracking - however, you'd still need to pass a single id from domain to domain; see the sketch after this list)
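For reference, a Measurement Protocol hit is just an HTTP request. The sketch below (C#, to match the other code on this page) is purely illustrative: 'UA-XXXXX-Y' and the client id are placeholders, and the client id (cid) is the single value that would have to be carried from domain to domain to keep the visit stitched together.

    using System;
    using System.Collections.Specialized;
    using System.Net;

    // Hypothetical server-side pageview hit sent via the Measurement Protocol.
    class MeasurementProtocolDemo
    {
        static void Main()
        {
            var payload = new NameValueCollection
            {
                { "v",   "1" },                                    // protocol version
                { "tid", "UA-XXXXX-Y" },                           // property id (placeholder)
                { "cid", "35009a79-1a05-49d7-b876-2b884d0f825b" }, // client id (placeholder; keep stable across domains)
                { "t",   "pageview" },                             // hit type
                { "dh",  "www.somename.com" },                     // document host
                { "dp",  "/en" }                                   // document path
            };

            using (var client = new WebClient())
            {
                client.UploadValues("https://www.google-analytics.com/collect", payload);
            }
        }
    }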
So there is no really good solution for you. IMO it is a lot better to have some strange parameters in the URL than to do a reload/redirect just to get rid of them.

Related

Redirects and metadata

I wondered if someone could answer this question.
When putting 301 redirects in place from an old website to a new website, would the metadata from the old website still show on Google? If so, what is the best way to resolve this?
How will our meta description, Google preview and the like be impacted by the redirect? Meaning, will the current ones still show up once the redirect is in place, or will it be the meta description and Google preview of the URL being pointed to?
I guess that question applies to pretty much all of the current site settings/errors. Will we still be ranked on these, and is it therefore in our interest to fix any errors on the old site, or should all the focus be on the destination domain, i.e. will any errors or settings on the referring domain no longer matter?

MVC Page routing replaces part of subdomain when subdomain text is also part of the route

I've got a strange problem with routing MVC paths to aspx pages. It all works fine except for some rare scenarios. Actually, not that rare, as it's happened twice this month.
So we've got old aspx pages but we need to have friendlier URLs. That's the background; I can't avoid it for reasons I won't go into.
So I have a page ~/MySubFolder/Plans.aspx
We need the URL to be ~/Things/Plans
So I've added a page route in the route config:
routes.MapPageRoute("Tickets", "Things/Plans", "~/MySubFolder/Plans.aspx");
This all works fine in most circumstances.
The app is a SaaS product and we determine the tenant in context based on the URL they use. So each tenant gets a subdomain on our app, like http://clienta.ourapp.com.
So this is the problem.
We had a client sign up and they picked their subdomain to be http://plans.ourapp.com.
The client does not have any problems except when they try to access the path ~/Things/Plans. When they do that we get an error. It's one of our own exceptions, and it happens because on every request we determine who the tenant is by looking at the subdomain.
For some reason, when we examine the domain name, routing has stripped out the 'plans' part of the subdomain and we see http://.ourapp.com instead of http://plans.ourapp.com.
So this is obviously caused by the fact that the word 'plans' is the subdomain and 'Plans' is also the end of the route Things/Plans.
We need to somehow stop this happening. Maybe the route is not set up properly, or maybe it's just a bug, but it would be great to figure out exactly why this is happening so we can fix it.
Thanks
So it turns out this had nothing to do with routing the URLs. Somewhere else in the code, where we evaluate the current tenant's URL, we were replacing part of the URL based on another part of the URL, which was the problem in some scenarios. No wonder no one had an answer for this.
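For anyone who runs into the same symptom: the pattern described above (a string Replace on the host driven by another part of the URL) reproduces it exactly. A hypothetical reconstruction plus a safer alternative, in C# - none of this is the actual code from the question:

    using System;

    public static class TenantResolver
    {
        // Hypothetical buggy pattern: stripping text out of the host based on the
        // last path segment. When the subdomain is "plans" and the route ends in
        // "Plans", the tenant name disappears: "plans.ourapp.com" -> ".ourapp.com".
        public static string ResolveBuggy(Uri url)
        {
            string lastSegment = url.Segments[url.Segments.Length - 1].Trim('/');
            return url.Host.Replace(lastSegment.ToLowerInvariant(), "");
        }

        // Safer: derive the tenant from the host alone and take the left-most label.
        public static string Resolve(Uri url)
        {
            return url.Host.Split('.')[0];   // "plans.ourapp.com" -> "plans"
        }
    }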

How to prevent Google from indexing redirect URL I do not own

A domain name that I do not own is redirecting to my domain. I don't know who owns it or why it is redirecting to my domain.
This domain, however, is showing up in Google's search results. When doing a whois it also returns this message:
"Domain:http://[baddomain].com webserver returns 307 Temporary Redirect"
Since I do not own this domain, I cannot set up a 301 redirect or disable it. When clicking the bad domain in Google, it shows the content of my website but baddomain.com stays visible in the URL bar.
My question is: how can I stop Google from indexing and showing this bad domain in the search results, and only show my website instead?
Thanks.
Some thoughts:
You cannot directly stop Google from indexing other sites, but what you could do is add the canonical tag to your pages so Google can see that the original content is located on your domain and not on the 'bad domain'.
For example, check out: https://support.google.com/webmasters/answer/139394?hl=en
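For instance, if the canonical copy of a page lives at http://www.yourdomain.com/some-page/ (a placeholder URL), that page's <head> would include <link rel="canonical" href="http://www.yourdomain.com/some-page/" />, which tells Google which URL to treat as the original when it finds the same content elsewhere.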
Other actions can be taken SEO-wise if the 'baddomain' is outscoring you in the search rankings, because then it sounds like your site could use some optimizing.
The better your site and domain rank in the SERPs, the less likely it is that people will see the scraped content and the 'baddomain'.
You could, however, also look at the referrer of the request, and if it is the 'bad domain' you should be able to redirect to your own domain, change the content, etc., because the code is being run on your own server.
But that might be more trouble than it's worth, as you'd need to investigate how the 'baddomain' is doing things and code accordingly (probably an iframe or similar from what you describe, but even that can be circumvented using scripts).
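If you do want to experiment with that, a minimal sketch of the idea (ASP.NET-style C# purely for illustration - the question doesn't say what the site runs on, 'baddomain.com' and 'www.mydomain.com' are placeholders, and the framing request may not carry a Referer header at all):

    using System;
    using System.Web;

    // Global.asax-style hook: if a request appears to originate from the bad
    // domain, send the visitor to the same path on our own canonical host.
    public class ReferrerGuard : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            Uri referrer = Request.UrlReferrer;
            if (referrer != null &&
                referrer.Host.EndsWith("baddomain.com", StringComparison.OrdinalIgnoreCase))
            {
                Response.RedirectPermanent("https://www.mydomain.com" + Request.RawUrl);
            }
        }
    }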
Depending on what country you and the 'baddomain' are located in, there may also be legal options, such as DMCA complaints. This, however, can also be quite a task, and it's often not worth it because a new domain will just pop up.

Where is a redirect coming from?

I am making a website where a person could be redirected to a form page from several different pages within the site, and depending on where they were redirected from, the form would be pre-filled in a certain way to make it quick for them. This is all on mobile, so data usage has to be kept in mind.
That information is usually contained in the HTTP Referer header field.
You can get this data from the headers sent by the browser (the referrer URL) - usually these are exposed as "server variables".
However, I would recommend steering clear of this method, as the Referer header is not always sent and it can introduce a few other problems. I would recommend using the session or a cookie to keep track of the last page the user has visited.
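To make both options concrete, a small sketch in ASP.NET-style C# (the question doesn't name a stack, so the page and session key names are placeholders):

    using System;
    using System.Web.UI;

    // Code-behind for the form page: prefer a value we stored ourselves in the
    // session, and only fall back to the Referer header, which browsers may omit.
    public partial class FormPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            string origin = Session["lastPage"] as string;     // set by the linking pages

            if (string.IsNullOrEmpty(origin) && Request.UrlReferrer != null)
            {
                origin = Request.UrlReferrer.AbsolutePath;     // e.g. "/pricing"
            }

            // use 'origin' to decide which fields to prefill
        }
    }

Each page that links to the form would set Session["lastPage"] = Request.Url.AbsolutePath; before handing the user off, which avoids relying on the Referer header at all.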

How best to set up 301 redirects from an old site that has many duplicate entries indexed on Google?

I am currently working with a client to redevelop their website. One of the final things I need to do before launch is to make sure that the old website's pages are correctly redirected to the URL structure of the new website.
Unfortunately, when I check Google to see how their current site is indexed, this relatively small website appears to have over 1500 pages indexed.
When I look at the indexed links on Google, many appear to be duplicates of the same page, but because of the terrible URI structure used on the old website, Google treats them as different pages.
For example, the 'Map' page is indexed at least twice on Google, under the following 2 URLs:
www.website.com/frame_page-map.html?mp_session=iris7k85851j05q55piqci31u3&mp_session=iris7k85851j05q55piqci31u3?page_code=map&mp_session=iris7k85851j05q55piqci31u3&mp_session=iris7k85851j05q55piqci31u3
www.website.com/frame_page-map.html?mp_session=sel6m8j5cu8lulep4dqa32sne7&mp_session=sel6m8j5cu8lulep4dqa32sne7?page_code=map&mp_session=sel6m8j5cu8lulep4dqa32sne7&mp_session=sel6m8j5cu8lulep4dqa32sne7
Only the session name is different in the URL (and I have no idea why it is repeated four times in a single URL, either).
For reference, the replacement URL for this page is:
www.website.com/contact/map
My question is: how do I set up redirects for these multiple records on Google? Do I simply set up the redirect for the old URL minus all of the query string parameters (i.e. www.website.com/frame_page-map.html), or is there a better method?
Thanks for any help you might be able to offer!
It depends on what your goals are. If you don't care about the query strings, then set up a 301 (permanent redirect) from the base page - frame_page-map.html, ignoring whatever parameters are attached - to the new URL. To prevent Google from indexing query string variations as separate pages, use the canonical tag and have it reference the parent page. This isn't guaranteed to work, but Google takes your canonical into consideration when indexing.
If you care about the query string values, then you will have to set up a redirect for each one. Most servers also let you drop the incoming query string when issuing a redirect (in Apache's mod_rewrite, for example, a trailing '?' on the target or the QSD flag), so you don't have to write a regex that matches every session id.
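To make the path-only option concrete, here is a hedged sketch in ASP.NET-style C# (matching the MVC snippet earlier on this page; if the old site runs on Apache or nginx instead, the equivalent is a rewrite/redirect rule) that maps old paths to new URLs and simply ignores whatever query string is attached:

    using System;
    using System.Collections.Generic;
    using System.Web;

    // Global.asax-style hook: 301 old pages to their new URLs, keyed on the path
    // alone, so every session-id variation of the same page collapses onto one target.
    public class LegacyRedirects : HttpApplication
    {
        private static readonly Dictionary<string, string> Map =
            new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
            {
                { "/frame_page-map.html", "/contact/map" }
                // ...one entry per old page
            };

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            string newUrl;
            if (Map.TryGetValue(Request.Url.AbsolutePath, out newUrl))
            {
                Response.RedirectPermanent(newUrl);   // the old query string is simply dropped
            }
        }
    }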