Is there any way in sqlmap (the SQL injection testing tool) to fetch database tables without running the complete test? - sql-injection

When I test a URL it takes a long time to complete the whole test and retrieve the database tables. Is there a faster way to fetch the database or its tables?
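As a sketch (the target URL and the assumption of a MySQL back end are placeholders), sqlmap can be told to skip straight to table enumeration and to narrow the detection phase, which usually shortens the run considerably:

```shell
# Hypothetical target URL; --batch answers all prompts with defaults.
# Limiting detection to one technique (B = boolean-based blind) and a
# known DBMS avoids testing every technique, then --tables enumerates
# tables directly instead of running the full test suite.
sqlmap -u "http://target.example/item.php?id=1" \
  --batch \
  --dbms=mysql \
  --technique=B \
  --tables
```

If you have already run sqlmap against the target once, it caches its findings in its session files, so subsequent enumeration runs skip the detection phase entirely.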

Related

PostgreSQL external transaction feature

I have a big web application and tests which make requests to the app running in a sandbox. After each test I used to roll back the database using db migrate rollback && db migrate && db seed. But now that the number of tests has grown, this takes too long. So I am looking for a feature that can wrap a series of database commands in a transaction and, after the test finishes, cancel the transaction without modifying the app's source code (or achieve this another way). Maybe there are some Postgres parameters or extensions for this?
I found another way: I can make a dump once, then drop and restore from the dump each time, which is much faster.
look this topic:
Truncating all tables in a Postgres database
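A minimal sketch of that dump-and-restore cycle (the database name appdb is an assumption):

```shell
# One time: dump the freshly seeded database in custom format.
pg_dump -Fc -f appdb.dump appdb

# Before each test run: drop, recreate, and restore from the dump.
dropdb appdb
createdb appdb
pg_restore -d appdb appdb.dump
```

An even faster variant on the same idea is to keep the seeded state as a template database and recreate the test database from it with createdb -T appdb_template appdb, which copies it at the file level.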

How to run an SQL script without losing data in pgAdmin III?

I am working on a PostgreSQL database. Over the past months I have added new columns to many tables on my test server according to new requirements, and I have also changed some functions.
Now I want to apply the same changes to my live server without losing data.
I took a backup of the test server schema using the answer below:
https://stackoverflow.com/a/7804825/7223676
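One hedged sketch of carrying such changes over (host names, user, database, table, and column names are all placeholders): purely additive statements such as ALTER TABLE ... ADD COLUMN and CREATE OR REPLACE FUNCTION do not delete existing rows, so applying them to the live server preserves its data:

```shell
# Dump only the schema from the test server, for reference/comparison.
pg_dump --schema-only -h test-host -U app mydb > test_schema.sql

# Apply an additive change to the live server; ADD COLUMN (IF NOT EXISTS
# requires PostgreSQL 9.6+) leaves existing rows untouched.
psql -h live-host -U app -d mydb -c \
  "ALTER TABLE customers ADD COLUMN IF NOT EXISTS loyalty_points integer DEFAULT 0;"
```

Running the ALTER/CREATE statements from a script with psql -f, ideally inside a transaction, is the usual way to batch many such changes.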

Slow performance calling DbContext.Database.CompatibleWithModel

We're running Entity Framework (6.1.3) Code-First with about 40 migrations against a small database (490 MB) on SQL Azure.
While extending our release process with a check that calls Database.CompatibleWithModel to verify the database is up to date, I found that our test instance is rather slow: the call takes up to 2 minutes to complete. On our production environment and locally it takes less than a second!
I ran a SQL profiler job locally and found only three statements fired against the database:
Does the database exist?
How many migrations are in the database?
What is the migrationId and model of the last migration?
When I run these statements on the slow environment it completes within a second, just like on production. This indicates that the database is not an issue.
Does anyone have an idea where to search for the cause of the slow performance? Does EF compute the model hash locally at runtime and then check it against the model stored in the database?
Can I find out how long it takes to hash?
After finding a post saying the EDMX editor is slow with the new cardinality estimator, I checked the compatibility level of the Azure database, but that setting seems to have no effect.
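One way to isolate where the time goes (server, database, and credentials below are placeholders) is to time the last of those three statements directly with sqlcmd; if the raw query returns quickly from the slow environment but CompatibleWithModel is still slow, the time is presumably being spent client-side computing or comparing the model:

```shell
# Placeholders: server, database, user, password.
# This is the query EF6 issues for the last migration's id and model blob.
time sqlcmd -S tcp:myserver.database.windows.net -d MyDb -U user -P 'secret' \
  -Q "SELECT TOP 1 MigrationId, Model FROM dbo.__MigrationHistory ORDER BY MigrationId DESC"
```

Note that the Model column is a sizable compressed blob; on a constrained connection, transferring it can itself dominate the call time.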

db2 data load runs slowly unless test data applied

I have a database build and a reference data build loaded onto my computer. When I try to load the transaction data from a file into the database via staging tables and stored procedures, it takes 20 minutes to load 10,000 records.
If I load the database build, the reference data build, and also my test data, then loading the transaction data via the same process takes 40-50 seconds.
I am trying to find out what makes the process faster when test data is added. One thought is that loading the test data first lets the database work out a better route for inserting the transaction data, but I wouldn't expect that to make this big a difference.
Can anyone recommend how I could identify the problem, or suggest what it could be?
Thanks for any help.
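One plausible culprit, consistent with the symptom above, is stale optimizer statistics: against a nearly empty database the optimizer may choose poor access plans for the staging inserts until statistics reflect real data volumes, which loading the test data happens to provide. A hedged sketch (database, schema, and table names are placeholders) of refreshing statistics before the load:

```shell
# Placeholders: database, schema, and table names.
db2 connect to MYDB
db2 "RUNSTATS ON TABLE MYSCHEMA.STAGING_ORDERS WITH DISTRIBUTION AND DETAILED INDEXES ALL"
db2 connect reset
```

Comparing the access plans (e.g. with db2expln or EXPLAIN) before and after RUNSTATS would confirm or rule out this theory.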

Is there a way to persist HSQLDB data?

We have all of our unit tests written so that they create and populate tables in HSQL. I want the developers who use this to be able to write queries against this HSQL DB, for two reasons: 1) by writing queries they can better understand the data model, and those less familiar with SQL can play with the data before writing the runtime statements; and 2) they don't have access to the test DB for security reasons. Is there a way to persist the test data so that it can be examined and analyzed with an SQL client?
Right now I am jury-rigging it by switching the data source to a different DB (like DB2/MySQL, then connecting to that DB on my machine so I can play with the persistent data), but it would be easier if HSQL supported persisting this than to explain the workaround to every new developer.
Just to be clear, I need an SQL client to interact with the persistent data, so debugging and inspecting memory won't do. This has more to do with initial development than with debugging/maintenance/testing.
If you use an HSQLDB Server instance for your tests, the data will survive the test run.
If the server uses a jdbc:hsqldb:mem:aname (all-in-memory) url for its database, then the data will be available while the server is running. Alternatively the server can use a jdbc:hsqldb:file:filepath url and the data is persisted to files.
The latest HSQLDB docs explain the different options. Most of the observations also apply to older (1.8.x) versions. However, the latest version 2.0.1 supports starting a server and creating databases dynamically upon the first connection, which can simplify testing a lot.
http://hsqldb.org/doc/2.0/guide/deployment-chapt.html#N13C3D
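For example (the jar path, data directory, and database name are assumptions), a file-backed HSQLDB 2.x server can be started like this, after which any JDBC-capable SQL client can connect to it at jdbc:hsqldb:hsql://localhost/testdb:

```shell
# Start a file-backed HSQLDB 2.x server; data lands in ./data/testdb.*
# files and survives restarts. Jar path and names are placeholders.
java -cp hsqldb.jar org.hsqldb.server.Server \
  --database.0 file:./data/testdb --dbname.0 testdb
```

Pointing the tests at that server URL instead of an in-process jdbc:hsqldb:mem: URL is then the only change needed for the test data to remain inspectable after the run.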