How do I properly test my database performance with high load demand? - database-performance

I have found a lot of topics about stress-testing web applications.
My goal is different: I want to test only the database (Sybase SQL Anywhere 9).
What I need:
A tool that can diagnose all SQL statements and find bottlenecks; I wish I could easily get a macro-level view of the entire system.
Best practices for designing/building good SQL queries.
The system's characteristics are:
20 GB database size.
2-5 requests per second.
Thousands of SQL statements spread throughout the code (this mess can only be fixed by rewriting the system).

The quickest way would actually be to upgrade your SQL Anywhere to v10 or (better) v11, as the latest releases include a complete performance diagnostic toolset. See the documentation here for more details.
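If upgrading isn't immediately an option, SQL Anywhere 9 already supports request-level logging, which captures the statements the server executes so you can scan the log for the slowest ones. This is only a sketch from memory, and the option names vary between releases, so verify them against the documentation for your version:

    -- enable request-level logging to a file (option names are version-dependent;
    -- check the SQL Anywhere 9 docs before relying on these)
    CALL sa_server_option( 'Request_level_log_file', 'c:\requests.txt' );
    CALL sa_server_option( 'Request_level_logging', 'SQL' );

    -- ... run the workload you want to profile ...

    -- switch logging off again
    CALL sa_server_option( 'Request_level_logging', 'NONE' );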

Several open source tools are listed here:
http://www.opensourcetesting.org/performance.php

Related

What are advantages of using Sphinx as a search engine for data in a PostgreSQL database?

I've liked the idea of using PostgreSQL's built-in text search features to keep my database queries and search queries all in one place.
But are there advantages to using a dedicated search engine and indexer -- and Sphinx in particular -- that I might be missing if I rely solely on Postgres's native search facilities? What are they?
For the time being Sphinx is faster, as it is a specialized system.
On the other hand, PostgreSQL is also evolving. Take a look at this message from Oleg Bartunov announcing upcoming performance improvements to full-text search, and at the presentation he gave at the latest PostgreSQL conference in Prague.
Look through it and decide whether your project fits the planned delivery schedule; from the information I can find, these changes will be available in 9.3, which is planned for “sometime” around the coming summer.
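For reference, the built-in facilities the question refers to boil down to tsvector/tsquery plus a GIN index. A minimal sketch (table and column names are made up):

    -- hypothetical table
    CREATE TABLE articles (id serial PRIMARY KEY, title text, body text);

    -- index the concatenated text with a GIN index over to_tsvector()
    CREATE INDEX articles_fts_idx ON articles
        USING gin (to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, '')));

    -- search and rank by relevance
    SELECT id, title,
           ts_rank_cd(to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, '')), query) AS rank
    FROM articles,
         to_tsquery('english', 'replication & conflict') AS query
    WHERE to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, '')) @@ query
    ORDER BY rank DESC
    LIMIT 10;

Keeping everything in SQL like this is the convenience the question is weighing against Sphinx's speed.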

IBM DB2 9.7 free monitoring tools

Any good advice on what tools to use to monitor a DB2 database? I have used the db2top command but was wondering if there are more verbose tools out there. Our DB is running on Linux x64.
As tools, you can use the memory tracker (db2mtrk) and the problem-determination tool (db2pd). You can also use many other things, such as:
table functions
administrative views
GET SNAPSHOT
creating and activating event monitors.
It really depends on what you are going to do; a couple of administrative-view queries are sketched at the end of this answer.
Also, IBM Optim Performance Expert is a good tool for finding bottlenecks or issues in the database, as is DBI's Brother-Panther. Finally, the Data Studio Web Console is a basic tool for monitoring a few elements.
What do you want to monitor? With the help of a cron job that executes a script, you can probably do many things.
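As an illustration of the administrative views mentioned above, here are two queries that tend to be useful on 9.7. This is a sketch; check the exact view and column names against the 9.7 documentation:

    -- database-level snapshot through an administrative view
    SELECT * FROM SYSIBMADM.SNAPDB;

    -- the most expensive dynamic SQL currently in the package cache
    SELECT NUM_EXECUTIONS, AVERAGE_EXECUTION_TIME_S, STMT_TEXT
    FROM SYSIBMADM.TOP_DYNAMIC_SQL
    ORDER BY AVERAGE_EXECUTION_TIME_S DESC
    FETCH FIRST 10 ROWS ONLY;

Run from a cron job and appended to a log, queries like these already give you a rough, free monitoring baseline.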

How to deploy/versioning database with Cruise Control Net?

Hi, I have configured the basics of CruiseControl to make releases and run automated NUnit tests using just MSBuild. Now I'm wondering whether it's possible to deploy/version databases with this as well.
I'm a beginner with CCNet, so suggestions or tutorials (if there are any) would be welcome. Also, if someone knows a free tool for database deployment/versioning, let me know; I would be grateful.
Thanks in advance
Hugh
It isn't free, but SQL Source Control from Red Gate can do what you're looking for, assuming it's a SQL Server database. It has a command-line interface that you can use in CCNet tasks. The simple approach of just migrating up is easy: the changes are applied to your database schema/data. There was an issue with v2.x of the tool that they've overcome in v3: if you renamed a table column, it would drop the column and create a new one with the right name. Obviously that's quite a big problem if you have data you want to keep, so v3 introduces the concept of migrations, which allows you to specify ALTER scripts so that instead of dropping the column you can script the change non-destructively.
As far as I know, at this time, they don't have anything that allows you to roll back your version.
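For context, the non-destructive change that such a migration script contains is just ordinary T-SQL. A hypothetical example for the column-rename case (object names made up):

    -- rename the column instead of dropping and recreating it,
    -- so the existing data is preserved
    EXEC sp_rename 'dbo.Customers.CustName', 'CustomerName', 'COLUMN';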
Otherwise, you could take a look at database migration tools; there seems to be some promise in these, for .NET at least. There is also this post that lists some other tools (again for .NET), and then there's https://stackoverflow.com/search?q=database+migration+tool, which is not restricted to any language and covers database migration in general.
If you're still looking for ways to version and migrate databases, one such tool is dbdeploy.net. I've hosted it on GitHub after forking it and doing some work on it. The latest version is fully up to date and has some interesting features (contributed by someone who also uses it and sent a pull request).
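Most of these migration tools (dbdeploy included) rely on the same underlying pattern: a changelog table in the target database that records which numbered scripts have already been applied, so the build only runs the new ones. A rough sketch of the idea (the actual schema used by dbdeploy.net may differ):

    -- tracks which change scripts have been applied to this database
    CREATE TABLE changelog (
        change_number BIGINT       NOT NULL PRIMARY KEY,
        complete_dt   DATETIME     NOT NULL,
        applied_by    VARCHAR(100) NOT NULL,
        description   VARCHAR(500) NOT NULL
    );

    -- each migration script ends by recording itself, e.g.:
    INSERT INTO changelog (change_number, complete_dt, applied_by, description)
    VALUES (42, GETDATE(), 'ccnet', '042_add_customer_email.sql');

The CCNet task then just invokes the tool, which compares the script folder against this table and applies whatever is missing, in order.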

PostgreSQL Replication Tools

On the PostgreSQL wiki, on the "Replication, Clustering, and Connection Pooling" page (http://wiki.postgresql.org/wiki/Replication,_Clustering,_and_Connection_Pooling), it gives the following example of replication requirements:
"Your users take a local copy of the database with them on laptops when they leave the office, make changes while they are away, and need to merge those with the main database when they return. Here you'd want an asynchronous, lazy replication approach, and will be forced to consider how to handle conflicts in cases where the same record has been modified both on the master server and on a local copy"
And that's pretty much my case. But, unfortunately, on the same page, it says: "(...) A great source for this background is in the Postgres-R Terms and Definitions for Database Replication. The main theoretical topic it doesn't mention is how to resolve conflict resolution in lazy replication cases like the laptop situation, which involves voting and similar schemes."
What I want to know is where I can find material on how to resolve this kind of situation, and which would be the best way to do this with PostgreSQL.
I will have to check into RubyRep but it seems like Bucardo might be a more widely supported option.
Gabriel Weinberg has an EXCELLENT tutorial on his site for how he uses Bucardo. The guy runs his own search engine called DuckDuckGo and there are quite a few tips and tricks that are optimized for his use cases.
http://www.gabrielweinberg.com/blog/2011/05/replicating-postgresql-with-bucardo.html
Just answering my own question, if anyone ever finds it: I'm using Rubyrep http://www.rubyrep.org/ and it's working.
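For what it's worth, the usual do-it-yourself approach to the laptop scenario, when you don't delegate it to Bucardo's or rubyrep's conflict handlers, is some form of last-write-wins based on a modification timestamp. A rough PostgreSQL sketch (table and column names are made up):

    -- every replicated table carries a last_modified column kept up to date by a trigger
    ALTER TABLE customer ADD COLUMN last_modified timestamptz NOT NULL DEFAULT now();

    CREATE OR REPLACE FUNCTION touch_last_modified() RETURNS trigger AS $$
    BEGIN
        NEW.last_modified := now();
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER customer_touch
        BEFORE UPDATE ON customer
        FOR EACH ROW EXECUTE PROCEDURE touch_last_modified();

    -- when merging the laptop copy back into the master, only take rows that
    -- are newer than what the master already has
    UPDATE customer AS m
    SET    name = l.name, email = l.email, last_modified = l.last_modified
    FROM   laptop_customer AS l
    WHERE  m.id = l.id
      AND  l.last_modified > m.last_modified;

Last-write-wins silently discards one side's change when both copies touched the same row, so whether it is acceptable depends on your data; the voting schemes the wiki alludes to exist precisely to do better than this.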

need to implement versioning in Online backup tool

I am working on the development of an application that performs online backup of the files and folders on a PC, automatically or manually. Currently, I keep only the latest version of each file on the server. Now I have to implement versioning, so that only the changes are transferred to the online server and the user is able to download any of the available versions of a file from the backup server.
I need to perform deduplication for this. I am able to do it using a fixed block size, but I'm facing the overhead of transferring a file with the CRC information for each version's backup.
I have never worked with this kind of technology, so I lack experience. I am eager to know whether there is a feasible way to embed this functionality in the application without much pain. Would any third-party tool help to do the same thing? Please let me know.
Note: I am using FTP protocol to transfer the data.
There's a program called dump that does something similar, but it operates on filesystem blocks rather than files. rsync also may be of interest.
You will need to keep track of a large number of blocks with multiple versions and how they fit into the various versions of the original files, so you will need some kind of database to track this information, and an efficient way to query it to determine which blocks in a given file need to be transferred. Also note that adding something to the beginning of a file will cause all your blocks to be "new" if you use a naive blocking and diff scheme.
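To make the "some kind of database" concrete, a minimal block-tracking schema could look like the following (all names are made up for illustration):

    -- one row per backed-up file
    CREATE TABLE files (
        file_id  INTEGER PRIMARY KEY,
        path     VARCHAR(1024) NOT NULL
    );

    -- one row per stored version of a file
    CREATE TABLE file_versions (
        version_id INTEGER PRIMARY KEY,
        file_id    INTEGER   NOT NULL REFERENCES files(file_id),
        created_at TIMESTAMP NOT NULL
    );

    -- each unique block is stored once, keyed by its content hash
    CREATE TABLE blocks (
        block_hash CHAR(64) PRIMARY KEY,   -- e.g. hex-encoded SHA-256 of the block
        length     INTEGER  NOT NULL
    );

    -- ordered list of blocks that make up each version
    CREATE TABLE version_blocks (
        version_id INTEGER  NOT NULL REFERENCES file_versions(version_id),
        seq        INTEGER  NOT NULL,      -- block position within the file
        block_hash CHAR(64) NOT NULL REFERENCES blocks(block_hash),
        PRIMARY KEY (version_id, seq)
    );

    -- staging table for the hashes the client computed for a new version
    CREATE TABLE candidate_blocks (
        seq        INTEGER  NOT NULL,
        block_hash CHAR(64) NOT NULL
    );

    -- before uploading, ask which of the candidate blocks the server lacks
    SELECT c.seq, c.block_hash
    FROM   candidate_blocks c
    LEFT JOIN blocks b ON b.block_hash = c.block_hash
    WHERE  b.block_hash IS NULL;

Only the missing blocks then need to travel over FTP; everything else for the new version is just rows in version_blocks pointing at blocks already on the server.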
To do this well will be very complex. I highly recommend you thoroughly research already-available solutions, and if you decide you need to write your own, consider the benefits of their designs carefully.