How to prevent Google from indexing a redirect URL I do not own

A domain name that I do not own is redirecting to my domain. I don't know who owns it or why it is redirecting to my domain.
This domain, however, is showing up in Google's search results. A whois lookup also returns this message:
"Domain:http://[baddomain].com webserver returns 307 Temporary Redirect"
Since I do not own this domain, I cannot set a 301 redirect or disable it. When clicking the bad domain in Google, it shows the content of my website, but baddomain.com stays visible in the URL bar.
My question is: how can I stop Google from indexing and showing this bad domain in the search results, and show only my website instead?
Thanks.

Some thoughts:
You cannot directly stop Google from indexing other sites, but what you can do is add a canonical link tag to your pages (e.g. <link rel="canonical" href="http://www.yourdomain.com/your-page/" /> in the <head>, pointing at the page on your own domain) so Google can see that the original content is located on your domain and not on the bad domain.
For details, see: https://support.google.com/webmasters/answer/139394?hl=en
Other actions can be taken SEO-wise if the bad domain is outranking you in the search results, because in that case it sounds like your site could use some optimizing.
The better your site and domain rank in the SERPs, the less likely it is that people will see the scraped content and 'baddomain'.
You could also look at the referrer of the request, and if it is the bad domain you should be able to redirect to your own domain, change the content, etc., because the code is ultimately being run from your own server.
That might be more trouble than it's worth, though, as you'd need to investigate how the bad domain is doing things and code accordingly (probably an iframe or something similar from what you describe, but even that can be circumvented using scripts).
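For example, if the bad domain is showing your pages inside a frame or mirroring them under its own host name, a small client-side check can bounce visitors back to your real host. This is only a sketch under that assumption; 'www.example.com' is a placeholder for your actual domain:

// A sketch, not a drop-in fix: assumes the bad domain frames or mirrors your pages.
// Replace 'www.example.com' with your real canonical host.
(function () {
  var canonicalHost = 'www.example.com';
  // If the page is being shown inside someone else's frame, break out of it.
  if (window.top !== window.self) {
    window.top.location.href = 'http://' + canonicalHost + window.location.pathname;
    return;
  }
  // If the page is somehow being served under a host you don't own, jump to your own domain.
  if (window.location.hostname !== canonicalHost) {
    window.location.replace('http://' + canonicalHost + window.location.pathname + window.location.search);
  }
})();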
Depending on which countries you and the bad domain's owner are located in, there are also legal options, such as DMCA complaints. This can be quite a task, though, and it's often not worth it because a new domain will just pop up.

Related

MVC Page routing replaces part of subdomain when subdomain text is also part of the route

I've got a strange problem with routing MVC paths to .aspx pages. It all works fine except for some rare scenarios. Actually, not that rare, as it has happened twice this month.
We've got old .aspx pages, but we need to have friendlier URLs. That's the background; it can't be avoided for reasons I won't go into.
So I have a page ~/MySubFolder/Plans.aspx
We need the URL to be ~/Things/Plans
so I have a page route in the route config:
routes.MapPageRoute("Tickets", "Things/Plans", "~/MySubFolder/Plans.aspx");
This all works fine in most circumstances.
The app is a SaaS product, and we determine the tenant in context based on the URL they use. So each tenant gets a subdomain on our app, like http://clienta.ourapp.com.
So this is the problem.
We had a client sign up and they picked their subdomain to be http://plans.ourapp.com
The client does not have any problems except when they try to access our path ~/Things/Plans. When they do that, we get an error. It's one of our own exceptions, and it happens because on every request we determine who the tenant is by looking at the subdomain.
For some reason, when we examine the domain name, routing has stripped out the 'plans' part of the subdomain, and it is http://.ourapp.com instead of http://plans.ourapp.com.
This is obviously caused by the fact that the word 'plans' is the subdomain and 'Plans' is also the end of the route Things/Plans.
We need to somehow avoid this happening. Maybe the route is not set up properly, or maybe it's just a bug, but it would be great to figure out exactly why this is happening so we can fix it.
Thanks
It turns out this had nothing to do with routing the URLs. Somewhere else in the code, where we evaluate the current tenant's URL, we were for some reason replacing part of the URL based on another part of the URL, which was the problem in some scenarios. No wonder no one had an answer for this.

Google analytics: cross domain tracking + 301 redirect

I'm currently working on multiple websites from the same company, each one connected to the others by a list of links in the top header.
Visit tracking is done with Google Analytics, and everything seems to be working fine. Unfortunately, they now seem to be unhappy with all the utm* parameters that get appended to the URL to achieve cross-domain tracking.
For me the best solution seems to be this:
Each URL that links to another one of the sites looks like 'www.somename.com/en', where 'en' is the desired language.
After clicking, the new page opens with a URL like 'www.somename.com/en?_utma=xxxxxxxx&_utmb=...'
If I remove the language from the links, changing the href to 'www.somename.com', then when the page loads
the site makes a 301 redirect from 'www.somename.com/?_utma=xxxxxxxx&_utmb=...' to 'www.somename.com/en', where 'en' is the default language, which gives the site owner exactly what they want (the parameters disappear from the visible URL).
Since I don't have access to the Google Analytics account, I would like to ask whether this might be the right solution, or whether we may be losing the cross-domain tracking.
The __utmz cookie seems to contain the right referrer, but I'm not sure whether that alone is enough to check that it is working.
I then checked the other parameters here: http://helpful.knobs-dials.com/index.php/Utma,_utmb,_utmz_cookies and it seems to me that the other cookie values we get after landing on the new domain don't have to be related in any way to the ones on the previous page (the site with the links).
What else should I check to be sure that everything is still working fine?
Thanks,
You will lose cross-domain tracking (that is, even if you might salvage the traffic source, the visitor session will be interrupted when changing domains). One of the parameters added by the linker functions is a hash value (utmk) calculated from the various utm parameters. If the hash is missing or does not match the parameters, cross-domain tracking will be broken. You need to transfer the parameters to a JavaScript-enabled page on the other domain so that the GA cookies can be updated; after that you can do 301 redirects at will.
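For context, this is roughly what the classic (ga.js) cross-domain set-up looks like. It is only a sketch: 'UA-XXXXX-Y', the domain name and the 'link-to-site-b' element ID are placeholders, not values from the question.

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);   // placeholder property ID
_gaq.push(['_setDomainName', 'site-a.com']);
_gaq.push(['_setAllowLinker', true]);       // accept utm values arriving from the sister domain
_gaq.push(['_trackPageview']);

// Decorate the outbound link on click so the _utm* parameters (including the
// utmk hash) reach a page on the other domain that runs this same snippet.
document.getElementById('link-to-site-b').onclick = function () {
  _gaq.push(['_link', this.href]);
  return false;
};

The target page has to execute the GA JavaScript before any redirect strips the parameters, which is why a server-side 301 straight to a clean URL breaks the hand-off.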
If you want to avoid the utm parameters you can:
switch to Universal Analytics - it only requires a single parameter to be sent; however, you can't switch a GA account to Universal Analytics, so you would have to start from scratch (UPDATE: this is no longer true; you can and indeed should update existing properties) - see the sketch after this list
try to get into the beta for the Universal Analytics Measurement Protocol (which would even allow for JavaScript-less tracking - however, you'd still need to send a single ID from domain to domain)
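A minimal Universal Analytics (analytics.js) cross-domain sketch, again with 'UA-XXXXX-Y' and 'site-b.com' as placeholders; with this set-up, outbound links carry a single _ga parameter instead of the whole _utm* set:

ga('create', 'UA-XXXXX-Y', 'auto', {'allowLinker': true});
ga('require', 'linker');
ga('linker:autoLink', ['site-b.com']);   // links to site-b.com get the _ga linker parameter added automatically
ga('send', 'pageview');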
So there is no really good solution for you. IMO it is a lot better to have some strange parameters in the URL than to do a reload/redirect just to get rid of them.

Redirects in Ektron 8.6.1

Has anyone played with the new redirect feature in Ektron 8.6?
We tested it (in 8.6.0) before upgrading and were happy with it. But when it came time to do the upgrade, Ektron had released 8.6.1, so we upgraded directly to that.
Now we are having trouble with the redirect feature. (Yes, we should have tested everything again in 8.6.1 before upgrading)
Now if we try to add a redirect rule for an existing page in the CMS, it does not work.
But if we create a redirect rule for a page that does not exist, then try to hit that address, the redirect works fine.
We need the redirects to work for existing pages in the CMS.
To clarify what "working" and "not working" means...
If I have an existing page in the CMS with a manual alias of "/erc/lucien.apsx", I can create an entry in the redirect table like this...
Adding this entry generates no errors, but when I visit the page, all I see is the regular old page I created. NOT the Google site it should be redirecting to. I do not get any 404 errors.
But if I create a redirect entry for a page that does not already exist, like this...
It works perfectly. If I try to visit the /erc/fake.apsx address, I end up on the Google site, as expected.
(FYI, we create a "fake" page in the CMS for external content so we can attach metadata to it and make it searchable in taxonomies, but then provide a link to the "real" page. I want to use redirects here so users don't have to do this extra click)
I suspect it might be cache related -- the original URL gets cached as an alias, then subsequent requests to that URL are redirected to the quicklink without the need for a db look up. When you add the redirect, it’s probably not clearing the old item from the cache. I'd try an IIS reset after you add the URL redirect and see if that clears up the issue.
An "outside the box" (of Ektron) answer to this is to place the redirect at the web server rather than in the Aliases section of the Ektron CMS.
The server I work on uses IIS and I have this set up for several pages.

Domain blocked and no data scraped

I recently purchased the domain www.iacro.dk from UnoEuro and installed WordPress, planning to integrate blogging with Facebook. However, I cannot even manage to share a link to the domain.
When I try to share any link on my timeline, it gives the error "The content you're trying to share includes a link that's been blocked for being spammy or unsafe: iacro.dk". Searching, I came across Sucuri SiteCheck which showed that McAfee TrustedSource had marked the site as having malicious content. Strange considering that I just bought it, it contains nothing but WordPress and I can't find any previous history of ownership. But I got McAfee to reclassify it and it now shows up green at SiteCheck. However, now a few days later, Facebook still blocks it. Clicking the "let us know" link in the FB block dialog got me to a "Blocked from Adding Content" form that I submitted, but this just triggered a confirmation mail stating that individual issues are not processed.
I then noticed the same behavior as here and here: when I type any iacro.dk link on my Timeline, it generates a blank preview with "(No Title)". It doesn't matter if it's the front page, an .htm document or even an image - nothing is returned. So I tried the debugger, which returns the very generic "Error Parsing URL: Error parsing input URL, no data was scraped.". Searching on this site, a lot of people suggest that missing "og:" tags might cause scraping to fail. I installed a WP plugin for that and verified the tag generation, but nothing changed. And since FB can't even scrape plain .htm / .jpg files from the domain, I assume the tags can be ruled out.
Here, someone suggests that 301 redirects could be the problem, but I haven't set up any redirection - I don't even have a .htaccess file.
So, my questions are: Is this all because of the domain being marked as "spammy"? If so, how can I get the FB ban lifted? However, I have seen examples of other "spammy" sites where the preview is being generated just fine, e.g. http://dagbok.nu described in this question. So if the blacklist is not the only problem, what else is wrong?
This is driving me nuts so thanks a lot in advance!
I don't know the details, but it is a problem that Facebook has with websites hosted on shared servers, i.e. where the server hosting your website also hosts a number of other websites.

Workaround: site is www.example.com, code includes document.domain='example.com'

A customer site that I cannot change has the line document.domain = "example.com" while the site is at www.example.com.
The effect is that the Facebook Connect login window gets stuck after submitting the username and password.
Firebug shows it is in an infinite loop inside the dispatchmessage function, which throws this exception over and over:
Error: Permission denied for <http://www.example.com> to get property Window.FB from <http://example.com>
Any idea how to work around this? I'd prefer not to ask the customer to remove the document.domain = 'example.com' line.
It seems like a really bad idea to tell the visitor's browser that the website is being served from a particular domain, when it in fact is not. The best solution would be to change that line. I take it you don't want to change it because they have some client-side code that depends on this?
One workaround would be to change the Facebook application's Connect URL to http://example.com, since Facebook's JavaScript will think that is where it is being executed.
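For reference, a minimal sketch of the same-origin rule that causes the loop. The comments mark which window runs which line; this is illustrative only, not the actual Facebook code:

// In the customer's page, served from http://www.example.com:
document.domain = 'example.com';   // its effective origin for scripting is now example.com

// In a frame/popup also running under www.example.com that never calls
// document.domain, the effective origin stays www.example.com. Because only
// one side opted in, the two origins no longer match:
try {
  var fb = window.parent.FB;       // cross-origin read under the new rules
} catch (e) {
  // "Permission denied ... to get property Window.FB" -- the exception from
  // the question, which Facebook's dispatchmessage keeps hitting in a loop.
}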