AS400 DB2 Journals search - db2

I am new to DB2 administration on AS400. Could you point me to the best practices/tools for searching for errors in the DB2 journals?
So far I have used the DSPJRN command, but I am unable to search with it.
Thanks.

Can you describe what the "error" is that you are looking for? Journal records by themselves don't really have errors (I think).
I haven't worked on an AS400 for about 10 years, but when I did use it last century I did some work with journals, looking for the change history of a row over time, and found all of the answers I needed in the online manuals.
From memory, I think I wrote an export program to save the output of the DSPJRN command and uploaded it into a DB2 table so that I could query it with SQL.

DSPJRN JRN(<LIBRARY>/QSQJRN) FILE(<LIBRARY>/CXPBU00001) RCVRNG(<LIBRARY>/QSQJRN<JOURNAL_NUMBER>)
5=Display entire entry on a record
F6=Display only entry
F15=Display only entry specific data
From there you can get a job description: <NUMBER>/<USER>/<SYSTEM>
WRKJOB <NUMBER>/<USER>/<SYSTEM>
And from there, option 4 or 10 to see the job logs.
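To make the journal searchable, the same DSPJRN command can write its output to a physical file instead of the screen, which you can then query with SQL. A sketch (the library, file and outfile names are placeholders, and the *TYPE format you want may differ):

```
DSPJRN JRN(MYLIB/QSQJRN) FILE(MYLIB/MYTABLE)
       RCVRNG(*CURCHAIN)
       OUTPUT(*OUTFILE) OUTFILFMT(*TYPE5)
       OUTFILE(MYLIB/JRNOUT)
```

After that, the outfile is an ordinary table:

```
-- JOENTT is the entry type (e.g. UP=update, DL=delete), JOUSER the user profile
SELECT JOENTT, JOJOB, JOUSER, JODATE, JOTIME
  FROM MYLIB/JRNOUT
 WHERE JOENTT IN ('UP', 'DL')
```

Check the field names for the *TYPE format you chose against the outfile's record layout in the manuals.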

The journals can be saved to files, a.k.a. tables. You can write some programs/SQL to search these tables. On the iSeries you don't have 'native' tools other than the iSeries commands and/or some programming/querying.
I don't know if there are any 'non-native' tools. Remember that DB2/400 is really just one of the many implementations of Universal Database/2, so I wouldn't be surprised if a Windows or Linux tool could analyze the iSeries implementation too. The same is true for MQ: a typical iSeries command/menu interface on the iSeries itself (which works fine, by the way), and beautiful graphical tools on other platforms that can connect to that iSeries MQ.
On second thought, a standard tool for the iSeries is iSeries Navigator. I will check this out at work tomorrow.

Related

Import data from a CSV (Excel) file to DB2 (Mac terminal)

I have a CSV file (exported from Excel) on my computer and I would like to import the data from the file into a table in DB2. Does anyone know how to do this? Thank you.
If you are able to connect to the Db2 database from the terminal, then you have several ways to add content to it. Each method has different requirements (e.g. the type of the Db2 client, the privileges of your userid, the target Db2 server's version, capabilities and platform, and many others), and you have to understand the requirements before you can properly use any of the commands. Most of these commands require the Db2 fat client to be installed on the workstation. If your workstation lacks a fat client and you can transfer the file to a location accessible to the Db2 server, then you can also use a stored procedure to run these commands at the Db2 server while invoking it from your terminal; see ADMIN_CMD.
All of these options are fully documented in the free online Db2 Knowledge Centre, and each has many command-line options to control the exact behaviour of the action. Careful study, rehearsal and testing are therefore needed. Stack Overflow is not a substitute for your education or training, so it is wise to study the documentation before asking questions that are already answered there. Sometimes you need to study all of the linked pages and rehearse each option to fully understand how to use these things to best advantage.
Your options include:
The import command. Slow (logged), but flexible, with many options. Details here.
The ingest command. Faster than import and also programmable. Details here.
The load command. Fastest, but requires skill and experience to harness its power properly, especially in high-availability environments. Details here.
Another option is to use external tables, if your Db2 server's platform and version support them and the necessary fixes have been applied. This lets you represent a flat file (e.g. a CSV file) as a table in the database, so you can use INSERT INTO target-table ... SELECT ... FROM your-external-table. Requires skill and competence. Details here, and many related pages.
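As a rough illustration, assuming a CSV file /tmp/sales.csv and an existing target table SALES (both names invented here), the import and load variants look roughly like this from a Db2 command line:

```
db2 "IMPORT FROM /tmp/sales.csv OF DEL COMMITCOUNT 1000 INSERT INTO SALES"
db2 "LOAD FROM /tmp/sales.csv OF DEL REPLACE INTO SALES"
# or, when only the server can see the file, via the stored procedure:
db2 "CALL SYSPROC.ADMIN_CMD('IMPORT FROM /tmp/sales.csv OF DEL INSERT INTO SALES')"
```

Verify the exact option syntax against the documentation for your Db2 version and platform before relying on it.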

Mirror SAP internal data to an external system

We would like to mirror data which is inside SAP to an external database.
Up to now there is a script which exports the data every night.
The customer wants this to happen more often. It should happen every hour.
The export is quite big, and we are looking for a better way to mirror data that is inside SAP to an external database.
Based on the tag, I assume that your external database is a PostgreSQL database. In this case, I don't think you will really find a pure SAP, database-independent solution.
The standard solution for this sort of replication is the SAP SLT Server. It supports taking data out of your SAP system to either an SAP target or a non-SAP target. Currently it supports the following non-SAP targets:
DB2
SAP MaxDB
Microsoft SQL Server
Oracle
Sybase ASE
As you can see, PostgreSQL is not included there (yet). I therefore see the following possibilities:
Use SLT in combination with some other external DB that is supported.
Use a third-party replication tool, such as SymmetricDS.
Depending on your source database, you might be able to use some database specific tools (e.g. SAP HANA Smart Data Integration).
Write some custom code. In my opinion, you should build a sort of log table in this case, to record (using triggers, for example) which rows were inserted/updated/deleted since the last replication. This should really be a last resort, though, as database replication is a fairly common problem and you should not reinvent the wheel.
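A minimal sketch of that last "log table + triggers" idea, using SQLite purely so the example is self-contained: the table and column names are invented, and on your actual source database the trigger syntax will differ slightly, but the pattern is the same.

```python
# Change capture via a log table and triggers (illustrative sketch).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);

-- The log table records what changed since the last replication run.
CREATE TABLE change_log (
    table_name TEXT,
    row_id     INTEGER,
    op         TEXT,                           -- 'I', 'U' or 'D'
    changed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TRIGGER customer_ins AFTER INSERT ON customer
BEGIN
    INSERT INTO change_log (table_name, row_id, op) VALUES ('customer', NEW.id, 'I');
END;
CREATE TRIGGER customer_upd AFTER UPDATE ON customer
BEGIN
    INSERT INTO change_log (table_name, row_id, op) VALUES ('customer', NEW.id, 'U');
END;
CREATE TRIGGER customer_del AFTER DELETE ON customer
BEGIN
    INSERT INTO change_log (table_name, row_id, op) VALUES ('customer', OLD.id, 'D');
END;
""")

# Normal application activity -- each statement leaves a trace in the log.
conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Acme')")
conn.execute("UPDATE customer SET name = 'Acme AG' WHERE id = 1")
conn.execute("DELETE FROM customer WHERE id = 1")

# The hourly job would read (and then clear) the log instead of exporting everything.
ops = [row[0] for row in conn.execute("SELECT op FROM change_log ORDER BY rowid")]
print(ops)  # ['I', 'U', 'D']
```

The hourly export then only has to ship the rows referenced in change_log, rather than the full dataset.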

How to create thousands of dummy records in a table - Oracle

I am learning Oracle by myself with the help of the internet.
Now, for a certain scenario, I need thousands of records to be available in a table.
It is not possible to create thousands of records manually.
Is there any tool, or any other way, to do this in Oracle 10g?
As I said, I am a novice to Oracle, so I need some advice from you SO professionals.
Thanks in advance.
This database has a JDBC driver. Download Eclipse, add the driver to the classpath, and write ten lines of code to insert as much filler data as required (tutorial here). Even if you have never programmed Java before and never would again, it is easy enough to do.
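If you would rather stay inside the database, plain SQL can generate the rows too. A sketch using Oracle's CONNECT BY LEVEL row generator, which works in 10g (the table and its columns here are made up for illustration):

```
-- Generate 10,000 dummy rows in one statement.
INSERT INTO test_data (id, name, created)
SELECT LEVEL,
       'user_' || LEVEL,
       SYSDATE - DBMS_RANDOM.VALUE(0, 365)   -- random date in the last year
  FROM dual
CONNECT BY LEVEL <= 10000;
COMMIT;
```

Adjust the SELECT expressions to produce whatever shapes of dummy data your scenario needs.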

How to transfer or copy tables of DB2 to oracle database

I want to transfer some tables from DB2 to Oracle daily so that they can be accessed from a web page,
but I don't know DB2 commands. How do I do this?
I want this action to be performed on the database daily at a particular time, so is there any tool available for this operation? And which programming language should I use to write the program for it? I am using Windows XP.
I think Change Data Capture is used to replicate DML from one database to other databases continuously.
However, what you need is to transfer some data at a particular time each day, so CDC could be too heavy for that.
You could do a simple "db2 export", and then import the generated file into Oracle.
There should also be a way to create an adapter in Oracle that lets you query DB2 tables directly. The opposite direction is called federation in DB2 (InfoSphere Information Server), which lets DB2 query Oracle tables.
Export http://publib.boulder.ibm.com/infocenter/db2luw/v9r7/topic/com.ibm.db2.luw.admin.cmd.doc/doc/r0008303.html
CMD examples http://publib.boulder.ibm.com/infocenter/db2luw/v9r7/topic/com.ibm.db2.luw.admin.dm.doc/doc/r0004567.html
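Put together, the daily job could be a small batch script along these lines. All database, schema and path names below are placeholders; on Windows the db2 commands must run inside a db2cmd environment, and the SQL*Loader control file has to map the delimited fields onto the target Oracle table:

```
db2 CONNECT TO SRCDB
db2 "EXPORT TO C:\xfer\customers.del OF DEL SELECT * FROM MYSCHEMA.CUSTOMERS"
sqlldr system/password@ORCLDB control=C:\xfer\customers.ctl data=C:\xfer\customers.del
```

Windows Task Scheduler can then run the script at the desired time each day.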
Check this link
http://blogs.oracle.com/warehousebuilder/entry/simple_change_data_capture_from_db2_table_to_oracle_table
In 11.2 releases, Change Data Capture (CDC) can be done by code template mapping. This allows users to capture data changes from heterogeneous data sources and load them into targets across different platforms.

suggest a postgres tool to find the difference between the schema and the data

Dear all,
Can anyone suggest a Postgres tool for Linux that finds the difference between two given databases?
I tried apgdiff 2.3, but it reports differences only in the schema, not the data,
and I need both!
Thanks in advance!
Comparing data is not easy, especially if your database is huge. I created a Python program that can dump a PostgreSQL schema to a file that can easily be compared with a third-party diff program: http://code.activestate.com/recipes/576557-dump-postgresql-db-schema-to-text/?in=user-186902
I think this program could be extended to dump all table data into separate CSV files, similar to those used by the PostgreSQL COPY command. Remember to add the same ORDER BY to the SELECT ... queries. I have created a tool that reads SELECT statements from a file and saves the results in separate files. This way I can control which tables and fields I want to compare (not all fields can be used in ORDER BY, and not all are important to me). Such a configuration can easily be created using the "dump schema" utility.
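A minimal sketch of that approach: run each SELECT with a fixed ORDER BY and write the rows out as CSV, so that two databases can be compared with an ordinary diff tool. The table and column names are invented, and SQLite is used only so the snippet is self-contained; with a PostgreSQL driver such as psycopg2, only the connection lines change.

```python
# Dump query results as CSV text with stable ordering, for diff-based comparison.
import csv
import io
import sqlite3

def dump_query(conn, select_sql):
    """Return the query result as CSV text (stable ordering is the caller's job)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    cur = conn.execute(select_sql)
    writer.writerow(col[0] for col in cur.description)  # header row
    writer.writerows(cur)                               # data rows
    return buf.getvalue()

# Two databases holding the same rows in a different physical order.
conn_a = sqlite3.connect(":memory:")
conn_a.executescript("CREATE TABLE t (id INT, val TEXT);"
                     "INSERT INTO t VALUES (2,'b'),(1,'a');")
conn_b = sqlite3.connect(":memory:")
conn_b.executescript("CREATE TABLE t (id INT, val TEXT);"
                     "INSERT INTO t VALUES (1,'a'),(2,'b');")

# Identical dumps thanks to the shared ORDER BY.
query = "SELECT id, val FROM t ORDER BY id"
print(dump_query(conn_a, query) == dump_query(conn_b, query))  # True
```

In practice you would write each table's dump to its own file and then run diff over the two directories.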
Check out DBSolo. It does both object and data compares and can create a sync script based on the results. It's free to try and $99 to buy. My guess is the 99 bucks will be money well spent to avoid writing your own software to do this.
Data Compare
http://www.dbsolo.com/help/datacomp.html
Object Compare
http://www.dbsolo.com/help/compare.html
apgdiff https://www.apgdiff.com/
It's an open-source solution. I used it before for checking differences between dumps. Quite useful.
[EDIT]
It diffs by schema only.