Customizing DB2 Command Line Processor

I have written a program in Java that reads .csv files and stores them into a database table, but the performance of the storing operation is very slow. When I use the DB2 Command Line Processor there is a drastic change in performance and it's very fast. So I am trying to customize the DB2 Command Line Processor according to my requirements. I searched on Google but only found topics about how to use it. I would like to get clear on the following subjects before I start.
Is "DB2 Command Line Processor" open source?
Which programming language is used?
Is there an alternative to the DB2 Command Line Processor with open source code, written in Java?
Is there a way to call the DB2 Command Line Processor from a Java program?

It may be worth investigating the Java program; the slow run times may be related to how often you are committing the data (i.e. you may be running in auto-commit mode, committing after every insert).
Committing after every 500 inserts may be a lot faster than committing after every record.
See DB2 autocommit for details on auto-commit.
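For illustration, here is a minimal sketch of batched inserts with manual commits over JDBC. The connection URL, credentials, the table MY_TABLE and its two columns, and the readCsvRows() helper are all placeholders, and the IBM DB2 JDBC driver is assumed to be on the classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class CsvBatchInsert {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; requires the DB2 JDBC driver on the classpath.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:db2://dbhost:50000/MYDB", "user", "password")) {
                con.setAutoCommit(false);                      // turn off auto-commit
                String sql = "INSERT INTO MY_TABLE (COL1, COL2) VALUES (?, ?)";
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    int count = 0;
                    for (String[] row : readCsvRows()) {       // readCsvRows() stands in for your CSV parsing
                        ps.setString(1, row[0]);
                        ps.setString(2, row[1]);
                        ps.addBatch();
                        if (++count % 500 == 0) {              // send and commit every 500 rows
                            ps.executeBatch();
                            con.commit();
                        }
                    }
                    ps.executeBatch();                         // flush the remaining rows
                    con.commit();
                }
            }
        }

        private static java.util.List<String[]> readCsvRows() {
            return java.util.Collections.emptyList();          // placeholder for the existing CSV-reading code
        }
    }

The batch size of 500 is only a starting point; measure with your own data and hardware.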

1) The DB2 CLP (command line processor) is part of DB2. It is not open source, and it is included in all editions (Express-C, Express, Workgroup, Extended) and in the Data Server Client. The latter is free to download and install on all clients.
2) The best way to use the capabilities of the DB2 CLP is via scripts, such as bash scripts or Windows batch scripts.
You can also call the DB2 CLP from another program, such as a Java application (via Runtime).
3) There are database shells with open-source licenses; however, you are mixing up two things: a shell, which is normally a terminal where you type commands, and a driver, which lets you query a database from a program you develop yourself.
4) Again, via Runtime, http://docs.oracle.com/javase/6/docs/api/java/lang/Runtime.html
Finally, the best option is to use a JDBC driver, so you can work with the database directly rather than through extra tiers. Check your Java code; the reading is probably not efficient. Also check the properties of the DB2 Java driver.
One more thing: if you want the fastest path, try using LOAD to insert the data into the database. It does not log the inserted rows. You can call LOAD from a Java application (remember to load the DB2 environment before executing any command).
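As a rough sketch of option 4, the CLP can be invoked from Java as an external process (ProcessBuilder is the modern wrapper around Runtime.exec). The file loadscript.sql is a placeholder for a script containing your CONNECT and LOAD/IMPORT statements; on Linux/UNIX the DB2 instance environment (db2profile) must already be loaded, and on Windows the command has to run inside a db2cmd environment:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class RunDb2Clp {
        public static void main(String[] args) throws Exception {
            // Runs: db2 -tvf loadscript.sql   (loadscript.sql is a placeholder script)
            ProcessBuilder pb = new ProcessBuilder("db2", "-tvf", "loadscript.sql");
            pb.redirectErrorStream(true);                  // merge stderr into stdout
            Process p = pb.start();
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = out.readLine()) != null) {
                    System.out.println(line);              // echo the CLP output
                }
            }
            int rc = p.waitFor();
            System.out.println("db2 exited with return code " + rc);
        }
    }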

Related

db2 resource limitation error while using ibm-db

I have a generic question (not an issue). I am trying to run a big query with a lot of join conditions connecting 15-20 tables. Does ibm_db have any limitations when running big queries? The query has been running in our production environment for more than 15 years. I am able to run the query in an in-house .NET tool. However, while running it using ibm_db in PyCharm I keep getting SQLCODE -905, a resource limitation error. Is there anything I am missing with ibm_db usage?
Any insight will be helpful. Thank you for the help.
Most likely this -905 SQLCODE has nothing to do with Python or ibm_db.
Instead, it is more likely due to how workload/resource management is configured on the Db2 server. Your question gives no facts about the target Db2 environment, or about the difference(s) between the execution that works (.NET) and the execution that triggers the limitation.
One specific detail to eliminate is that the account (auth-ID) used for the .NET application might be different from the account you use when connecting from Python. The Db2 server may be configured to allocate resources based on user ID (auth-ID) or some other client-side factor (depending on the Db2 server platform and version).
You can prove that the -905 symptom has nothing to do with Python or ibm_db by temporarily eliminating both, for example by submitting the same query from the db2cli tool (or the db2 CLP if your client workstation has it), or by submitting the query via JDBC (as long as you use the same account name for connecting as you do with Python).
Contact your DBA team for details of the configuration of WLM, RLF, or whatever resource-management tooling is deployed on the target Db2 subsystem. In addition, use Python ibm_db to print out the full details of the exception (including the resource name, limit amounts, and limit source), as they can also yield more information.
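If you try the JDBC route suggested above, a minimal sketch could look like the following. The connection URL, credentials, and query text are placeholders; the only point is to run the identical statement under the same auth-ID and compare the SQLCODE you get back:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class Sqlcode905Check {
        public static void main(String[] args) {
            String url = "jdbc:db2://dbhost:50000/MYDB";   // placeholder connection details
            String bigQuery = "SELECT ...";                // paste the same 15-20 table join here
            try (Connection con = DriverManager.getConnection(url, "sameUser", "password");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(bigQuery)) {
                int rows = 0;
                while (rs.next()) {
                    rows++;                                // just force the full execution
                }
                System.out.println("Query completed, rows fetched: " + rows);
            } catch (SQLException e) {
                // With the IBM JCC driver, getErrorCode() returns the SQLCODE (e.g. -905)
                System.err.println("SQLCODE=" + e.getErrorCode()
                        + " SQLSTATE=" + e.getSQLState()
                        + " message=" + e.getMessage());
            }
        }
    }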

MySQL Workbench: multiple windows

MySQL workbench is a fantastic tool. However, I'm having a hard time figuring out how to create multiple windows. For example, in Sequel Pro (or TablePlus or really any other SQL client) I can have multiple windows:
Yes I know there are 'tabs' but those aren't quite the same thing. Is there a way to have multiple windows using MySQL Workbench?
It seems like from a few other threads this would need to be done manually via:
$ open -n -a MySQLWorkbench.app
MySQL Workbench was originally designed to be a single-instance app. On Windows this has been extended to allow multiple instances (there's a setting in the preferences), and you found a way to do this on macOS. However, this bears some risks, because all instances share the same config and cache files and can write to them simultaneously, which is prone to file corruption. Also, any changes made to the configuration or connections end up in the same file, so the last change may override changes previously made in another instance.

How to create a JOB in AS400?

I'm new to AS400 DB2. I have 3 procedures in AS400, and those procedures should be executed by calling a single job in AS400.
Can you please tell me how to create a job and execute it on the AS400 DB2 mainframe?
You're probably not using an AS/400 running OS/400; they have been obsolete for 10 years. You're probably using a POWER server running IBM i.
POWER servers are not mainframes, they are mid-range systems.
On the IBM i, a job is how the operating system organizes, tracks, and processes work. On other platforms, you'd call it a process.
Jobs are the basis of work management
When you sign onto a 5250 session, the system starts an "interactive job" to service your requests. You can't call procedures directly, but you can call a program that then calls the procedures.
There are plenty of common job tasks, including using the Submit Job (SBMJOB) command to submit a job to run in batch.
From what I understand, you want to run 3 programs by invoking a single command. Is that it?
In OS/400 (I haven't used it in over a decade) you put the commands that call the programs into a CL source member and compile it. Unlike .bat files on Windows, CL isn't going to run without compilation.
Refer to https://www.ibm.com/support/knowledgecenter/ssw_ibm_i_73/rbam6/creat.htm for how to create a CL program.

Is the EFI shell flexible enough to loop over boot entries?

I'm trying to write an EFI shell script which deletes all boot entries (as given in bcfg dump boot), without knowing how many exist ahead-of-time.
The language provides a looping construct, patterned off that from Microsoft's shells:
for var in <set>
...
endfor
...but I'm unclear on whether there's a reasonable way to get the numeric identifiers of the boot entries from bcfg dump into the <set>.
At this moment (UEFI Shell v2.1 and UEFI v2.50) there is no way to parse bcfg output using the UEFI Shell.
The only supported method for parsing in a UEFI Shell script is the parse command, which requires Standard-Format Output (which appears to be CSV). Unfortunately, only 7 commands can generate SFO via the -sfo flag: ls, map, memmap, date, dh, devices, and drivers.
Removing all boot options can be achieved by writing a simple C application that mimics bcfg behavior. I managed to do that, and sample code can be found here.
Note that removing all boot options can be dangerous in some cases and can lead to an unrecoverable state of your hardware. Make sure you know what you are doing.

Perl local libraries - Sybase

I'm going to build an extremely small script for dumping a Sybase database in Perl. The problem is that Perl doesn't come with Sybase support preinstalled. I don't have root access on the server, so I can't install any packages, and I can't reach the Perl folder. The server is not configured for internet access, so I have to deliver the packages "manually" through FTP.
So, my question is whether there are any easy ways of doing this. The only library I need is DBI::Sybase or Sybase standalone (maybe I haven't done my research thoroughly enough and don't even need this much?), which means I would love to just be able to put the .pm file there, loading it through
use localModule
and then run my small script.
The solution has to work on both Red hat and Solaris if I understood my supervisor correctly.
Best regards
Since you are primarily concerned with dumping the database, and not data retrieval and manipulation, you could probably get by without having to use DBI::Sybase or any other Perl module that is not preinstalled.
Without more details, it's hard to be very specific, but here's the overview: your Perl script can execute SQL scripts which dump the databases.
You can either put the list of databases you wish to dump in a config file (or env file), or you can generate it dynamically by calling isql with the -b option to suppress headers and "set nocount on" to suppress footers, and store the output in an array.
Once you have the list of databases, just loop over them, running another isql command to dump each database.