How to block a server by domain name from attacking a website

I'm having trouble with another website stealing all my blog content. I know that happens all the time, but this new attack is taking down my server.
Looking at my access logs I see he is hitting me from multiple IP addresses. I can't block them all. I'm wondering how to block by domain name.
I have full access to my server so I would like to block with iptables, but not sure that is possible.
Also, how do I shut them down? It's a private registration with GoDaddy. I called them but got no help at all.

Probably belongs on superuser.com, BUT in code you could keep a quick IP hash map with a running count of visits by IP over the last 10 minutes, then return 404s once an IP goes over X hits. You could then log the IPs you are 404ing and dump them into iptables to block.
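If you'd rather not write application code at all, iptables can keep that per-IP count itself with the hashlimit match. A minimal sketch, assuming the site runs on port 80; the rate threshold and the blocked address below are made-up placeholders:

    # rate-limit new connections per source IP: drop SYNs from any IP above ~600/minute
    iptables -A INPUT -p tcp --dport 80 --syn -m hashlimit \
        --hashlimit-name http-flood --hashlimit-mode srcip \
        --hashlimit-above 600/minute -j DROP

    # and for IPs your application has already flagged, a permanent block
    iptables -I INPUT -s 203.0.113.99 -j DROP   # 203.0.113.99 is a placeholder

The hashlimit variant counts in kernel space, so once an IP trips the limit nothing reaches the web server at all. Note it counts connection attempts, not HTTP requests, so tune the threshold accordingly.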

Related

How to point some, but not all of domain to a different nameserver

My client's domain registration is with Network Solutions, but the files, FTP and cPanel for their current site are with another company and are not accessible to me.
We are building a new site at a different domain name and will point the old URL to it. The complication is a section of the old site that the client wants to remain at the current URL for the next 5 months.
In other words, I want to redirect current-site.com to new-site.com, except for everything in current-site.com/one-section/ which needs to be viewable at that old address.
How can this be done?
If I'm not mistaken, this is what you are looking for. I have done it on GoDaddy using the instructions at the link below, which will hopefully give you a better idea of how to do it. It's a very easy process. You can find the IP of the website you want to redirect to on the left side of the cPanel, or you can get it from the command line (see the short example after the link): on Windows press Win+R, type cmd, and inside cmd type ping followed by the address, e.g. ping www.google.com.
https://www.hostmysite.com/support/legacy-control-panel/dns/domain_point/
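For reference, a quick sketch of both lookups; www.example.com stands in for the site whose IP you need:

    # ping resolves the name and shows the IP in its output
    ping -c 1 www.example.com     # on Windows just: ping www.example.com

    # nslookup prints the A record directly without sending any probe traffic
    nslookup www.example.com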

After repointing DNS with TTL 24 hours, can I use page redirects

I have an existing COM domain with TTL 24 hours. I will be re-pointing the COM domain (A and CNAME records only) to a new website within a few hours.
The new website is already functioning with a NET domain, so it's live online. I'm leaving the COM domain DNS with the old hosting as we have email there.
To handle the 24 hour lag, can I use DNS URL redirects to send visitors to pages on the new site until propagation is complete?
Or can I use a simple HTML redirect to do similar job?
I think the DNS URL redirect is easier for me to set up but will it work, considering the COM will already have been changed?
First off, it's important to plan ahead. If you have a 24 hour TTL, lower it at least 24 hours in advance of the changeover. This would avoid your dilemma (or rather, shorten it to your new, lowered TTL).
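To see what resolvers are currently being told, check the TTL being served for the record; example.com below is a placeholder for your COM domain:

    # the second column of the answer is the TTL, in seconds, that caches will honour
    dig +noall +answer example.com A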
From the description, it sounds like the .com and .net sites are identical. Since you didn't mention any database, I assume these are static sites that don't save data to a database. If so, you really could just leave the sites alone and not worry about a redirect. If you want to get people onto the new server asap, use a temporary HTTP redirect or an HTML-based one.
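Either variant is only a few lines of setup. A minimal sketch, assuming Apache with .htaccess support on the old COM host; www.example.net and the document root are placeholders for your NET site and web root:

    # Option 1: temporary (302) HTTP redirect via mod_alias
    cat > /var/www/html/.htaccess <<'EOF'
    Redirect 302 / https://www.example.net/
    EOF

    # Option 2: plain-HTML fallback if you cannot change the server config
    cat > /var/www/html/index.html <<'EOF'
    <meta http-equiv="refresh" content="0; url=https://www.example.net/">
    EOF

Use a 302 rather than a 301 so intermediaries don't keep redirecting after the DNS has fully moved.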
If you do persist data to a database, are you also migrating that at the same time or are the two sites using the same database server? Migrating database servers is a complicated affair, beyond what we can answer here.
(N.B. there is no "DNS URL" redirect. That's just a service that some DNS providers offer, built on top of HTTP and DNS. Setting it up would suffer from the same TTL lag time you're trying to avoid.)

One PC, two users: give access to only one user through MAC address

I need to let a user access the server only from a specific MAC address, so he won't be able to access the server from another device.
Can anyone give me a hint how I can achieve that?
There is no direct way to achieve this.
You can of course limit access by protocol/IP/port (with iptables), or at the application level (including IP, with tcpwrappers). You can also limit access based on MAC (with ebtables).
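As an illustration of the MAC-only half, a sketch with iptables' mac match; the MAC address and port below are placeholders. Keep in mind the source MAC is only visible from hosts on the same L2 segment; traffic arriving through a router carries the router's MAC instead:

    # allow SSH only from one known NIC, drop everyone else
    iptables -A INPUT -p tcp --dport 22 -m mac --mac-source 00:11:22:33:44:55 -j ACCEPT
    iptables -A INPUT -p tcp --dport 22 -j DROP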
But I am afraid there are no solutions linking user information and MAC address. Too far down the network stack to be usable, perhaps.
What is the exact problem that you are trying to solve by limiting access by MAC and UserID?
Consider a solution like this: initially, every computer gets a 'temporary' IP from an unprivileged network. That IP does not entitle the user to use any services. Then the user logs in, and after a successful log-in, a new IP from the privileged network is issued to him. This IP is specific to this user. From then on, the user's access can be controlled based solely on his IP.
This will require VLANs and some provisioning on the switches. See Windows server - assign IP by username for a similar case.

Ask Google to Stop Googlebot Crawl

Okay, so a WordPress gallery plugin led to a massive headache: with about 17 galleries having their own pagination, the links within created what might as well be an infinite number of variant URLs combining the various query variables from each gallery.
As such, Google has been not so smart and has been HAMMERING the server, to the tune of 4 gigs an hour prior to my actions, at about 800 requests a minute to the same page, sending the server load up to 30 at one point.
It's been about 12 hours, and regardless of the changes I've made, Google is not listening (yet) and is still hammering away.
My question is: Is there a way to contact Google support and tell them to shut their misbehaving bot down on a particular website?
I want a more immediate solution as I do not enjoy the server being bombarded.
Before you say it, even though this isn't what I'm asking about, I've done the following:
Redirected all traffic using the misused query variable back to the Googlebot IP, in hopes that the bot being forwarded back to itself will be a wake-up call that something is not right with the URL. (I don't care if this is a bad idea.)
Blocked the most active IP address from accessing that site.
Disabled the URLs from being created by the troubled plugin.
In Google Webmaster Tools/Search Console, I've set the URL parameters to "No: Doesn't affect page content" for the query variables.
Regardless of all of this, Google is still hammering away at 800 requests per minute/13 requests a second.
Yes, I could just wait it out, but I'm looking for a "HEY GOOGLE! STOP WHAT YOU ARE DOING!" solution besides being patient and allowing resources to be wasted.
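There is no direct "stop now" switch, but two things Google does honour are a robots.txt disallow (Googlebot re-fetches robots.txt roughly once a day, so it is not instant) and persistent 503 responses, which make it slow its crawl. A sketch of the robots.txt route, assuming all the runaway URLs carry query strings; the document root below is a placeholder:

    # tell Googlebot not to crawl any URL containing a query string
    cat > /var/www/html/robots.txt <<'EOF'
    User-agent: Googlebot
    Disallow: /*?
    EOF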

How can I show a maintenance page when my web server is down or completely powered off?

I work for a company which has its own web server. They are due to have a complete power blackout over the weekend, meaning their servers will be down.
Does anyone know a way we could present a down status on a maintenance page or some kind of redirect so we can at least inform our users that the site is down for maintenance and not just missing/broken?
The best way is probably setting up a redirection to a dummy server on your load balancer or border routers. If you have no such thing, then you can either ask your provider about the options, or temporarily change the DNS record, provided that you reduce DNS cache timeouts before and after the change so it takes effect immediately.
Set up a server at another location and point the domain's DNS record to that server during the blackout.
The redirect has to be carried out by the web server: no web server, no redirect. What you can do is get another site from a web hosting company (which will not be subject to your blackout) and configure your DNS to route requests for your main domain to the temporary site, which serves just a plain notice HTML page; then switch back once power is restored. This can be done if you have the DNS credentials for the primary site. You could also mirror the site this way, then shut down the mirror afterwards, and no one will be the wiser. Try http://siteground.com; I have used them for years. A minimal notice host can be stood up as sketched below.
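A sketch of that temporary notice host, assuming any machine outside the blackout with Python 3 installed; the path and wording are placeholders:

    # one-page maintenance notice on a host that will not lose power
    mkdir -p /tmp/maintenance
    cat > /tmp/maintenance/index.html <<'EOF'
    <h1>Down for scheduled maintenance</h1>
    <p>We expect to be back online Monday morning.</p>
    EOF

    # any static file server will do; port 80 needs root
    cd /tmp/maintenance && sudo python3 -m http.server 80

Then point the domain's A record at this machine before the blackout, with a lowered TTL as noted above.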
If you are using a load balancer, see if it supports a "Sorry Server" page. Most of them have this feature built in.