I am doing a small research project on how to protect a database against a DDoS attack.
I am using a PostgreSQL database in my testing.
I want to perform a DDoS of the database on my local machine, but I don't really know how to do it.
My plan is to create a script that runs a bunch of queries, but I want these queries to take as much time to complete as possible.
I saw this example (it uses MySQL-specific functions):
select tab1 from (select decode(encode(convert(compress(post) using latin1),concat(post,post,post,post)),sha1(concat(post,post,post,post))) as tab1 from table_1)a;
But I am failing to replicate it in PostgreSQL.
I need help translating this query to PostgreSQL, or other examples of functions or queries that would take a long time to complete.
Edit:
Sleep functions might not work; they don't load the system enough.
In my understanding, the DDoS should be performed with functions that take a long time to run and eat up a lot of the system's compute power.
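For example, something along these lines is what I am considering (just a sketch; the row count is arbitrary and no real table is involved). It burns CPU by repeatedly hashing a large generated series instead of sleeping:
-- A CPU-bound sketch: hash ten million generated rows several times over.
-- Raise the upper bound of generate_series to make it take longer.
SELECT count(*)
FROM (
    SELECT md5(md5(md5(g::text)))
    FROM generate_series(1, 10000000) AS g
) AS t;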
Related
I have to perform monthly maintenance on a PostgreSQL database.
I PuTTY into the system, connect to the database, and then run 3 commands on 40 different tables:
CLUSTER [table1] USING [primarykey];
ANALYZE [table1];
REINDEX TABLE [table1];
I have to wait for each command to finish executing before I can run the next one (i.e. CLUSTER, wait up to a few minutes, ANALYZE, wait, REINDEX, wait).
It's very simple to do, but it takes around 30-45 minutes of me just copying and pasting 120 lines, one line at a time. Is there any way to automate this process?
I have zero experience with scripting and I know very little about PostgreSQL.
My question is somewhat unique because I cannot install anything on the PostgreSQL database server. I want to keep this script locally on my computer and be able to run it when it's time for the maintenance.
Clustering automatically reindexes the table. There is no reason to reindex the table immediately after you cluster it.
Do you actually need to do this stuff? Do you have evidence that your tables are in need of clustering? Or are you just assuming they do because of something you read on the internet, referring to a decade-old version of PostgreSQL and written by someone who didn't know what they were talking about in the first place? It is possible you really would benefit from this. It is even more possible you wouldn't, and it is just a waste of time.
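If you want evidence rather than guesswork, one rough starting point (a sketch, assuming your tables live in the public schema) is the planner's correlation statistic: values near 1 or -1 mean the rows are already physically close to the sorted order, so clustering buys little.
-- Columns whose physical order diverges most from their sorted order
-- show up first (correlation near 0).
SELECT tablename, attname, correlation
FROM pg_stats
WHERE schemaname = 'public'
ORDER BY abs(correlation);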
If you know nothing about scripting, then you need to learn something about scripting. You should probably tag your post as being about scripting, in whichever shell/language you would like to use.
At the core, all you have to do is write a series of commands to be executed from the command line, and shove them into a text file. The easiest way is probably to install psql on your local computer, if it is not already there.
psql -c 'cluster foobar' -h thehost.example.com
psql -c 'analyze foobar' -h thehost.example.com
You might need to do some configuration to make this connection work with whatever authentication method you have in place, but without knowing which authentication method that is I can't comment further.
If the cluster for some reason fails, there is little reason to proceed to try to analyze it. (But there is also little harm in doing so). If you want to fine tune this situation, there are a variety of ways to do it, depending on which shell you are writing your script for, and what you want it to do.
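For example, you could put all of the statements into one file, call it maintenance.sql, and run it in a single shot with psql -f maintenance.sql -h thehost.example.com. A minimal sketch (the table and index names are placeholders for your own, and REINDEX is dropped for the reason given above):
-- maintenance.sql: stop at the first error instead of blindly continuing
\set ON_ERROR_STOP on
CLUSTER table1 USING table1_pkey;
ANALYZE table1;
CLUSTER table2 USING table2_pkey;
ANALYZE table2;
-- ...and so on for the remaining tables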
I've upgraded a server from SQL Server 2005 to SQL Server 2008, but the database runs slower when running certain stored procedures, especially against records which contain more data than others.
It's been suggested that I run a basic reindex to see if this resolves the issue.
Can someone take a look at the screenshot and advise if this will remove any data from my database - if so then this isn't the right thing to do.
Thanks James
P.S. I will now attach a screenshot if I can, as I have not done that before using this forum.
Those actions won't remove any data from the database, but generally I wouldn't advise trying to shrink the database unless you really need the space, as this can cause more fragmentation of indexes. The only options that you have ticked there that have the ability to improve performance are the rebuild/reorganise indexes and the update statistics options.
Rather than maintenance plans, though, I would generally recommend using Ola Hallengren's DB maintenance scripts, as they offer more flexibility and are generally a lot better than these plans:
Ola Hallengren - SQL Server Maintenance Solution
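For reference, the checked maintenance-plan tasks correspond roughly to the following manual commands (a sketch only; dbo.MyTable is a placeholder, and you would normally pick either reorganise or rebuild per index based on fragmentation):
-- Light-weight defragmentation of all indexes on the table
ALTER INDEX ALL ON dbo.MyTable REORGANIZE;
-- Or a full rebuild (heavier, but also refreshes index statistics)
ALTER INDEX ALL ON dbo.MyTable REBUILD;
-- Refresh the statistics the query optimizer relies on
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;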
I am working on a project where I want to give people the ability to execute SQL queries on a PostgreSQL database. I then need to prevent people from hacking/attacking my database.
I thought that maybe a way to do that is by giving only view access to the database connection, and using EXPLAIN ANALYSE to calculate the cost of the SQL query.
Is EXPLAIN ANALYSE trustworthy enough to make sure there are no cheap ways to bring the website down?
Do you have suggestions?
EXPLAIN ANALYSE will execute the query, including any side-effects it may have. PostgreSQL also allows running arbitrary Perl and Python code if configured to do so, so be careful. You're likely better off running PostgreSQL instances in per-request VMs or in similar highly isolated environments.
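To make that concrete (a sketch; some_table is a placeholder): the DELETE below really runs, it is not just planned, while plain EXPLAIN only plans the query and its cost estimate can be badly wrong for adversarial queries.
-- This actually deletes the row while "measuring" the plan:
EXPLAIN ANALYSE DELETE FROM some_table WHERE id = 1;
-- This only plans the query; nothing is executed:
EXPLAIN SELECT * FROM some_table;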
There is a web application which has been running for years, and during its lifetime the application has gathered a lot of user data. The data is stored in a relational DB (PostgreSQL). Not all of this data is needed to run the application (to do the business). However, from time to time business people ask me to provide reports on this data, and this causes some problems:
sometimes these SQL queries are long-running
queries are executed against the production DB (not cool)
it is not so easy to deliver reports on a weekly or monthly basis
some parts of the data are stored in a way which is not suitable for such querying (queries are inefficient)
My idea (note that I am a developer, not a data mining specialist) for how to improve this whole process of delivering reports is:
create a separate DB which is regularly updated with production data
optimize how data is stored
create a dashboard to present reports
Question: But is there a better way? Is there another DB which is a better fit for such data analysis? Or should I look into modern data mining tools?
Thanks!
Do you really do data mining (as in: classification, clustering, anomaly detection), or is "data mining" for you any reporting on the data? In the latter case, all the "modern data mining tools" will disappoint you, because they serve a different purpose.
Have you used the indexing functionality of Postgres well? Your scenario sounds as if selection and aggregation are most of the work, and SQL databases are excellent for this - if well designed.
For example, materialized views and triggers can be used to process data into a schema more usable for your reporting.
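A minimal sketch of the materialized-view approach (the users table and its created_at column are made up for illustration):
-- Pre-aggregate once, then let reports read the small result
CREATE MATERIALIZED VIEW monthly_signups AS
SELECT date_trunc('month', created_at) AS month, count(*) AS signups
FROM users
GROUP BY 1;
-- Refresh on whatever schedule the reports need (e.g. from cron)
REFRESH MATERIALIZED VIEW monthly_signups;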
There are a thousand ways to approach this issue, but I think that the path of least resistance for you would be Postgres replication. Check out this Postgres replication tutorial for a quick proof of concept. (There are many hits when you Google for postgres replication, and that link is just one of them.) Here is a link documenting streaming replication from the PostgreSQL site's wiki.
I am suggesting this because it meets all of your criteria and also stays within the bounds of the technology you're familiar with. The only learning curve would be the replication part.
Replication solves your issue because it would create a second database which would effectively become your "read-only" DB, updated via the replication process. You would keep the schema the same, but your indexing could be altered and reports/dashboards customized. This is the database you would query. Your main database would be your transactional database which serves the users, and the replicated database would serve the stakeholders.
This is a wide topic, so please do your due diligence and research it. But it's also something that can work for you and can be turned around quickly.
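One caveat to keep in mind: a physical streaming-replication standby is an exact, read-only copy, so you cannot add reporting-specific indexes on it; logical replication (PostgreSQL 10 and later) does allow the subscriber to carry its own extra indexes. A minimal sketch of the logical variant, with placeholder names and connection string:
-- On the production (publisher) database:
CREATE PUBLICATION reports_pub FOR ALL TABLES;
-- On the reporting (subscriber) database, whose tables must already exist:
CREATE SUBSCRIPTION reports_sub
    CONNECTION 'host=prod.example.com dbname=app user=replicator'
    PUBLICATION reports_pub;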
If you really want to try data mining with PostgreSQL, there are some tools which can be used.
A very simple way is KNIME. It is easy to install and has full-featured data mining tools. You can access your data directly from the database, process it, and save it back to the database.
The hardcore way is MADlib. It installs data mining functions written in Python and C directly in Postgres, so you can mine with SQL queries.
Both projects are stable enough to try.
For reporting, we use a non-transactional (read-only) database. We don't care about normalization. If I were you, I would use another database for reporting. I would design the tables following OLAP principles (star schema, snowflake), and use an ETL tool to dump the data periodically (maybe weekly) into the read-only database to start creating reports.
Reports are used for decision support, so they don't have to be in real time and usually don't have to be current. In other words, it is acceptable to create reports covering up to last week or last month.
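A minimal star-schema sketch (all table and column names are illustrative only): one narrow fact table of events keyed to descriptive dimension tables.
CREATE TABLE dim_date (
    date_key   integer PRIMARY KEY,   -- e.g. 20240131
    full_date  date NOT NULL,
    month      integer NOT NULL,
    year       integer NOT NULL
);
CREATE TABLE fact_orders (
    order_id    bigint PRIMARY KEY,
    date_key    integer REFERENCES dim_date,
    customer_id bigint,
    amount      numeric(12,2)
);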
Is it possible to run an execution plan directly in PostgreSQL?
I did not find anything about it after quite some searching in the PostgreSQL documentation and on the internet.
No, it is not possible to directly execute a query plan in PostgreSQL. You must run actual SQL.
In theory you could customise the PostgreSQL executor to accept plans without the corresponding SQL by feeding in plan trees. This would be a pretty big job and I'm sure there are many things that'd make it harder that I don't even know about.
You really need to just run SQL.
There is no reverse-compiler to turn an execution plan back into SQL.
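The closest practical workaround is to keep running SQL but nudge the planner toward the plan you want with the enable_* settings for one session (a sketch; some_table is a placeholder):
SET enable_seqscan = off;   -- discourage sequential scans for this session
EXPLAIN ANALYZE SELECT * FROM some_table WHERE id = 42;
RESET enable_seqscan;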