How to create a job in AS400? - db2

I'm new to AS400 DB2. I have 3 procedures in AS400, and those procedures should be executed by calling a single job in AS400.
Can you please tell me how to create a job and execute it on the AS400 DB2 mainframe?

You're probably not using an AS/400 running OS/400; they have been obsolete for 10 years. You're probably using a POWER server running IBM i.
POWER servers are not mainframes, they are mid-range systems.
On the IBM i, a job is how the operating system organizes, tracks, and processes work. On other platforms, you'd call it a process.
Jobs are the basis of work management.
When you sign onto a 5250 session, the system starts an "interactive job" to service your requests. You can't call procedures directly, but you can call a program that then calls the procedures.
There are plenty of common job tasks, including using the Submit Job (SBMJOB) command to submit a job to run in batch.
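A minimal sketch of that "one program that calls the procedures" idea, in SQL rather than CL (all names below are placeholders, not from the question): wrap the three procedures in one SQL procedure, so a single call runs all of them.

-- One SQL procedure that runs the three existing procedures in order.
-- MYLIB, PROC1..PROC3 and RUN_ALL3 are hypothetical names.
CREATE OR REPLACE PROCEDURE MYLIB.RUN_ALL3 ()
LANGUAGE SQL
BEGIN
  CALL MYLIB.PROC1();
  CALL MYLIB.PROC2();
  CALL MYLIB.PROC3();
END;

A batch job submitted with SBMJOB can then call this one procedure instead of three.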

From what I understand, you want to run 3 programs by invoking a single command. Is that it?
In OS/400 (I haven't used it in over a decade) you put the commands for the programs into a CL source member and compile it. Unlike .bat files on Windows, CL isn't going to run without compilation.
Refer to https://www.ibm.com/support/knowledgecenter/ssw_ibm_i_73/rbam6/creat.htm for how to create a CL program.
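If compiling CL is inconvenient, a hedged alternative is to stay in SQL: DB2 for i (7.1 and later) exposes QSYS2.QCMDEXC, which runs a CL command string from an SQL session, so SBMJOB itself can be issued from SQL. The wrapper procedure name below is the hypothetical one from the sketch above; the job name is also a placeholder.

-- Submit a batch job that runs the wrapper procedure via RUNSQL.
CALL QSYS2.QCMDEXC(
  'SBMJOB CMD(RUNSQL SQL(''CALL MYLIB.RUN_ALL3()'') COMMIT(*NONE) NAMING(*SQL)) JOB(RUNALL3)'
);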

Related

Using SQL Developer to automate several SQL scripts

I have several scripts that I run in SQL Developer every day. I have it set up as @script_1.sql; @script_2.sql; @script_n.sql; and I hit F5 and let it run through all my scripts and then look at my results when it has finished parsing through all of them. I want to have this automated so it is finished running those scripts by the time I get into work. My system admins have disabled Task Scheduler. I have a few jobs set up that work fine, but when I tried turning this into a job it wouldn't work. How would I automate this using only SQL Developer?
You could create a package and convert those SQL files into its procedures, then schedule their execution with the DBMS_JOB or DBMS_SCHEDULER package.
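A hedged sketch of the DBMS_SCHEDULER route (the package, procedure, and job names are placeholders; the schedule assumes the scripts should finish before the workday starts):

-- Create and enable a job that runs the wrapper procedure every
-- morning at 05:00. MY_PKG.RUN_ALL_SCRIPTS and NIGHTLY_SCRIPTS are
-- hypothetical names.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_SCRIPTS',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'MY_PKG.RUN_ALL_SCRIPTS',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=5',
    enabled         => TRUE);
END;
/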

Run PowerShell script every hour on Azure

I have found this great script which backs up SQL Azure database to BLOB.
I want to run many different variations of this script - e.g. DB1 goes to Customer1Blob, DB2 goes to Customer2Blob.
I have looked at Scheduler Job Collections. However I can only see options (Action settings) for HTTP(S)/ Storage Queue / Service Bus.
Is it possible to run a specific .ps1 script (with commands) scheduled?
You can definitely run a PowerShell script as a WebJob. If you want to run a script on a schedule, you can add a settings.job file containing a cron expression alongside your WebJob. The docs for doing so are here.
For this type of automation task, I prefer to use the Azure Automation service. You can create runbooks using PowerShell and then schedule them with the Azure scheduler. You can have them run on Azure, so you do not use compute power that you pay for (rather, you pay by the minute the job runs), or you can configure them to run with a hybrid worker.
For more information, please see the documentation.
When exporting from SQL DB or from SQL Server, make sure you are exporting from a quiescent database. Exporting from a database with active transactions can result in data integrity issues - data being added to various tables while they are also being exported.
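One hedged way to get a quiescent source on SQL Azure (database names are placeholders) is to export from a transactionally consistent copy rather than from the live database:

-- Run against the master database of the server; the copy is
-- transactionally consistent as of the time the copy completes.
CREATE DATABASE MyDb_ExportCopy AS COPY OF MyDb;

Once the copy finishes, point the backup script at MyDb_ExportCopy and drop it afterwards.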

Is it possible to remotely process an SSAS cube through script?

I have a SQL Server Analysis Services (SSAS) cube (developed with BIDS 2012) and I would like to give the users (who use the cube through PowerPivot) the ability to process the cube from their local machines.
I found some material on how to make a scheduled job on the server through PowerShell, SQL Agent, or SSIS, but no material on remotely processing the cube. Any advice?
There are several possibilities to trigger a cube processing. The low-level method is issuing an XMLA statement to the database containing the cube. To see what this looks like, open SQL Server Management Studio, connect to the AS instance, right-click an AS database, and select "Process". Configure the processing settings, but instead of hitting OK, select "Script" from the top toolbar to have the XMLA process command generated for you. Leave the dialog with Cancel.
All methods that process a cube end, in some way or another, in sending a command like this to the AS database.
There are several options to trigger a cube processing:
In Management Studio, by clicking OK in the dialog mentioned above.
In PowerShell (see http://technet.microsoft.com/en-us/library/hh510171.aspx).
In Integration Services, there is an Analysis Services processing task (http://msdn.microsoft.com/en-us/library/ms141779.aspx).
You can set up a SQL Server Agent job; job steps could either be a direct XMLA step or an Integration Services step containing the process task (among possibly other tasks). A sketch of a direct XMLA step follows.
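As a hedged illustration of such a direct XMLA step (the job name, AS server name, and database ID are placeholders, and the Agent job is assumed to exist already):

-- Add an Analysis Services command step to an existing Agent job.
-- 'ProcessSalesCube', MYSSAS and MyOlapDb are hypothetical names.
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'ProcessSalesCube',
     @step_name = N'Process cube',
     @subsystem = N'ANALYSISCOMMAND',
     @server    = N'MYSSAS',
     @command   = N'<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
                      <Object><DatabaseID>MyOlapDb</DatabaseID></Object>
                      <Type>ProcessFull</Type>
                    </Process>';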
The question, however, is how the setups described above can be accessed by end users. An important issue here is of course that the user executing the process task needs permission to process the cube. As you might not want to grant this permission directly, it may make sense to use some impersonation when invoking the processing. With Management Studio - and as far as I am aware with PowerShell - this cannot easily be achieved.
Integration Services and Agent jobs offer the possibility of impersonation. Integration Services packages are executed by the dtexec command line tool (part of the SQL Server client tools). There is also a tool called dtexecui (available as "Execute Package Utility" in a standard SQL Server client tool installation), which lets you configure all settings in a dialog and then execute a package; it can also display the dtexec command line corresponding to your settings.
And to call a SQL Server Agent job, an easy interface is the stored procedures (http://msdn.microsoft.com/en-us/library/ms187763.aspx), especially sp_start_job (note that this is asynchronous: it starts the job and returns without waiting for the job to complete), sp_help_jobactivity to ask for job status, and sp_help_jobhistory for details of jobs that have run.
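A hedged example of that interface (the job name is the placeholder used above):

-- Start the job and return immediately; sp_start_job does not wait.
EXEC msdb.dbo.sp_start_job @job_name = N'ProcessSalesCube';

-- Later, poll for current activity / completion of the same job.
EXEC msdb.dbo.sp_help_jobactivity @job_name = N'ProcessSalesCube';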
All in all I think there is no final solution available, but I mentioned some building blocks that you could use to code your own solution, depending on the preferences in your environment.

Customizing DB2 Command Line Processor

I have written a program in Java that reads .csv files and stores them in a database table, but the performance of the store operation is very slow. When I use the DB2 Command Line Processor, there is a drastic change in performance and it's very fast. So I am trying to customize the DB2 Command Line Processor according to my requirements. I searched on Google but only found topics on how to use it. I would like to get clear on the following subjects before I start.
Is "DB2 Command Line Processor" open source?
Which programming language is used?
Is there an alternative to the DB2 Command Line Processor with open source code in Java?
Is there a way to call the DB2 Command Line Processor from a Java program?
It may be worth investigating the Java program; the slow run times may be related to how often you are committing the data (i.e. you may be running in auto-commit mode, committing after every insert).
Committing after every 500 inserts may be a lot faster than committing after every record.
See DB2 autocommit for details on auto-commit.
1) The DB2 CLP (command line processor) is part of DB2. It is not open source, and it is included in all editions (Express-C, Express, Workgroup, Enterprise) and in the Data Server Client. The latter is free to download and install on any client.
2) The best way to use the capabilities of the DB2 CLP is via scripts, such as bash scripts or Windows scripts.
You can also call the DB2 CLP from another program, such as a Java application (via Runtime).
3) There are open-source shells for databases; however, you are mixing two things: a shell, which is normally a black screen where you type commands, and a driver to query a database from a program you develop yourself.
4) Again, via Runtime: http://docs.oracle.com/javase/6/docs/api/java/lang/Runtime.html
Finally, the best option is to use a JDBC driver, in order to do things directly and not through a lot of tiers. Check your Java code; the reading is probably not efficient. Also check the properties of the DB2 Java driver.
One more thing: if you want the fastest path, try using LOAD to insert the data into the database. It does not log each inserted row. You can call LOAD from a Java application (remember to load the DB2 environment before executing any command).
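A hedged sketch of calling LOAD without leaving SQL, assuming DB2 for LUW and a CSV file readable by the server (the path and table name are placeholders):

-- ADMIN_CMD runs utility commands from an SQL/JDBC session; the file
-- path is resolved on the database server, not on the client.
CALL SYSPROC.ADMIN_CMD('LOAD FROM /tmp/data.csv OF DEL INSERT INTO MYSCHEMA.MYTABLE NONRECOVERABLE');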

Run SSIS Package from T-SQL

I noticed you can use the following stored procedures (in order) to schedule an SSIS package:
msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
msdb.dbo.sp_add_job ...
msdb.dbo.sp_add_jobstep ...
msdb.dbo.sp_update_job ...
msdb.dbo.sp_add_jobschedule ...
msdb.dbo.sp_add_jobserver ...
(You can see an example by right-clicking a scheduled job and selecting "Script Job as -> CREATE To".)
AND you can use sp_start_job to execute the job immediately, effectively running SSIS packages on demand.
Question: does anyone know of any msdb.dbo.[...] stored procedures that simply allow you to run SSIS packages on the fly, without using xp_cmdshell directly, or some easier approach?
Well, you don't strictly need the sp_add_category, sp_update_job or sp_add_jobschedule calls. We do an on-demand package execution in our app using SQL Agent with the following call sequence:
- sp_add_job
- sp_add_jobstep
- sp_add_jobserver
- sp_start_job
Getting the job status is a little tricky if you can't access the msdb..sysjobXXX tables, but our jobs start & run just fine.
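A hedged sketch of that sequence (the job name, step command, and package path are placeholders):

-- Create a throwaway Agent job, attach one SSIS step, register the
-- job on the local server, and kick it off.
DECLARE @jobId UNIQUEIDENTIFIER;

EXEC msdb.dbo.sp_add_job
     @job_name = N'RunMyPackageOnce',
     @delete_level = 3,                -- delete the job once it has run
     @job_id = @jobId OUTPUT;

EXEC msdb.dbo.sp_add_jobstep
     @job_id = @jobId,
     @step_name = N'Run package',
     @subsystem = N'SSIS',
     @command = N'/FILE "C:\Packages\MyPackage.dtsx"';

EXEC msdb.dbo.sp_add_jobserver @job_id = @jobId;   -- defaults to the local server

EXEC msdb.dbo.sp_start_job @job_id = @jobId;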
EDIT: Other than xp_cmdshell, I'm not aware of another way to launch the SSIS handlers from within SQL Server. Anyone with permissions on the server can start the dtexec or dtutil executables; then you can use batch files, a job scheduler, etc.
Not really... you could try sp_OACreate but it's more complicated and may not do it.
Do you need to run them from SQL? They can be run from the command line, a .NET app, etc.
In SQL Server 2012+ it is possible to use the following stored procedures (found in the SSISDB database, not the msdb database) to create SSIS package executions, set their parameters, and start them:
[catalog].[create_execution]
[catalog].[set_execution_parameter_value]
[catalog].[start_execution]
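A hedged example of the sequence (the folder, project, and package names are placeholders; it assumes the project is deployed to the SSIS catalog):

-- Create an execution for a deployed package, make it synchronous,
-- and start it. MyFolder/MyProject/MyPackage.dtsx are hypothetical.
DECLARE @exec_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name  = N'MyFolder',
     @project_name = N'MyProject',
     @package_name = N'MyPackage.dtsx',
     @execution_id = @exec_id OUTPUT;

EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @exec_id,
     @object_type     = 50,            -- 50 = system parameter
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 1;             -- wait for the package to finish

EXEC SSISDB.catalog.start_execution @execution_id = @exec_id;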