Invoking Rexx from JCL

To invoke a Rexx program and pass parameters, IKJEFT01 can be used:
// SET PARM1=
// SET PARM2=
//AUDITDS EXEC PGM=IKJEFT01,
// PARM='RXPGM &PARM1 &PARM2'
But PARM supports a limited number of characters. Is there any way to invoke a REXX program from JCL and pass a parameter containing more characters?
Using SYSTSIN would be a solution, but I want to use symbolic parameters as in the PARM parameter.

For historic reasons, the PARM field is limited to 100 bytes, however this limit is increased to 32K for LE (Language Environment) enabled applications that are willing to call the CEE3PR2 LE callable service. LE languages would be Assembler (certain caveats apply), and modern versions of COBOL and PL/I. As far as I know, Rexx is not an LE-enabled language.
One place I worked had a generic program that would write whatever was passed in PARM value to a flat file. Ours happened to be Assembler, but it could have been COBOL, PL/I, or Rexx.
See this answer for an example of how it was used.
I suggest you create such a program, if your shop does not already have one (and please do check before writing your own). Syncsort (and perhaps DFSORT) have the capability to write a parm to an output file, so you could also go that route.
Presupposing the capability of writing a parm to a flat file, you could invoke it once for each of your parameters, MODding the result to a flat file. Then read the flat file into your Rexx program, each record representing one of your parameters.
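The read-one-parameter-per-record step is straightforward in any language; here is a sketch of the logic in Python (the path is a placeholder - in the actual Rexx you would read the allocated DD with EXECIO into a stem):

```python
def read_parms(path):
    """Read a flat file in which each record is one parameter.

    Mirrors reading the DD into a stem with EXECIO in Rexx;
    the path argument is a placeholder for the allocated data set.
    """
    with open(path) as f:
        # One record per parameter; strip the trailing newline and
        # any padding the writing step may have added.
        return [line.rstrip() for line in f]
```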
Update: As @BillWoodger points out in a comment, the PARMDD parameter can be used...
Use PARMDD specifying the ddname of a data set containing the command
parmstring to be executed if the command parmstring is more than 100
characters in length.
...which obviates the need to read in the parameters one record at a time.
Also, apparently as of z/OS 2.1 you no longer need a program to place your parms into a dataset, you can have them resolved in-stream when the JCL is processed.

There are two methods of invoking a REXX script using IKJEFT01. One is to use PARM as you are currently doing; the other is to use the SYSTSIN data set. That's my preferred method, and you can continue long parameters onto the next line using the + continuation character. For example, below is a call to BPXBATCH with a long zFS file name parameter split across two lines.
//FORWARD EXEC PGM=IKJEFT01,REGION=0M
//SYSPRINT DD SYSOUT=*
//SYSTSPRT DD SYSOUT=*
//STDOUT DD SYSOUT=*
//STDERR DD SYSOUT=*
//SYSTSIN DD *
BPXBATCH SH sftp -b /u/bigdata/doc/hadoop.sftp -oPort=8022 biadmin@biad+
min

There are two options:
If you want the REXX exec to be able to execute TSO commands, use IKJEFT01. You cannot call it as if it were an external COBOL/PL/I program.
You can use PGM=IRXJCL to execute a REXX program. Under IRXJCL you cannot use 'ADDRESS TSO' and TSO commands, but you can call it from other high-level languages.
The problem is that you cannot return an answer from the REXX to the calling program.
Another problem is that you can call the REXX with only one string parameter.
As a solution to this, I called the REXX from COBOL, and part of the parameter was an address; in the REXX I used the STORAGE function to put the output at that address.


Building ROM images on CP/M

I'm trying to use the venerable M80 and L80 tools on CP/M to build a ROM image. (It's for a CP/M emulator, hence why I'm using CP/M tools.)
Unfortunately L80 seems to be really crude --- AFAICT it just loads each object file at its absolute address, fixes it up, and then dumps everything from 0x0100 up out to disk. This means that object files that are based at addresses outside its own workspace don't appear to work at all (just producing an error message). My ROM has a base address of 0xd000, which is well outside this.
Does anyone know if it's possible to use M80 and L80 to do this, and if so, how? Alternatively can anyone recommend (and point me at!) a CP/M assembler/linker suite that will?
(Note that I'd like to avoid cross compiling, if possible.)
If you're just assembling one file, then you can use M80's .phase directive to have the assembler locate the output.
.phase 0D000h
If you want to build several source files and link them at the end, then you can still use M80 but you'll need DRI's linker LINK.COM, which can be found in http://www.cpm.z80.de/download/pli80_13.zip. The LINK command line to use would be
LINK result=module1,module2,module3[LD000
(The nearest L80 equivalent would, I think, be
L80 /P:D000,module1,module2,module3,result/N/E
but then you have to remove 0xCF00 bytes from the start of the resulting file).
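Removing those 0xCF00 bytes (0xD000 - 0x0100) is easy to script; a minimal sketch in Python, run on the resulting file outside CP/M (file names are placeholders):

```python
# L80 emits everything from 0x0100 up, so a ROM located at 0xD000
# is preceded by a 0xCF00-byte gap that must be stripped.
OFFSET = 0xD000 - 0x0100  # = 0xCF00 bytes

def trim_rom(in_path, out_path, offset=OFFSET):
    """Drop the leading gap so the output starts at the ROM base."""
    with open(in_path, "rb") as f:
        data = f.read()
    with open(out_path, "wb") as f:
        f.write(data[offset:])
```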
Old question, but this may work for those who are still looking. I checked this out on my Ampro Little Board running 1980 M80/L80 on CP/M 2.2.
You can use the ASEG (absolute) directive in your starting .MAC file, specify 0D000H as the org, and then reference external modules. As long as those external modules don't include DSEG or PSEG directives you should be able to link them all together with 0D000H as the starting address. E.g.
; TEST.MAC
ASEG
ORG 0D000H
public tstart
tstart:
...
call myfoo## ; call routine myfoo in external module foo.rel
...
end tstart
Assemble it:
M80 TEST,=TEST
Link it with foo.rel and use /X on the output to produce a .HEX file (TEST.HEX):
L80 TEST,FOO,TEST/N/X/E
If you examine the resulting .HEX file you should see the starting address is 0D000H.
BTW: If you don't use the /X option then L80 with /N/E will make a .COM with all the code linked using an offset of 0D000H unless you also include a .phase directive. E.g.:
; TEST.MAC
ASEG
ORG 100H
.phase 0D000H
public tstart
tstart:
...
call myfoo## ; call routine myfoo in external module foo.rel
...
end tstart
Link to make a .COM instead of a .HEX:
L80 TEST,FOO,TEST/N/E <== note no '/X'
You can't run it, but you can consider that the .COM file is really a .BIN padded to the nearest 128-byte boundary (assuming that your CP/M is using the typical approach of allocating 128-byte blocks). You can confirm the result by doing a DUMP of the .COM file. If the code was very short it may also include leftover pieces of L80 loader code that weren't overwritten by your code.
Note you can also use the ASEG approach with ORG 0100H to make a regular CP/M .COM. In that case you don't need .phase, assuming the start of your code is at 0100H.

Multi line replace

I'm trying to perform a search-replace in several JCLs, but I need multi-line capabilities: I need to replace one line with several.
Example:
//STEP1 EXEC PGM=DUMY,PARAM=XPTO
transform into
//STEP1 EXEC PGM=WORKS,PARAM=THAT
//SOMEDD DD DSN=DSN.WITH.SOMETHING
//SYSTIN
SOME MORE PARAMETERS
I looked into File-AID batch processing but it seems to only support STRING replacement without multi-line support.
I think REXX might do it, but I have no knowledge of it.
Any ideas?
There are commercial products that understand JCL syntax and can do this sort of thing. JOB/SCAN is one, I'm sure others in this product space can do it too.
Which is of no help if you don't have such a product, so we're back to your Rexx comment. Yes, you can do this with Rexx, but you're going to be parsing JCL. This can be non-trivial depending on your requirements. Rexx doesn't have regular expression matching, which is what one normally uses when parsing. It can be done, and if you aren't seeking to do anything much more complicated than what you've indicated, then it's probably not too difficult for a Rexx programmer - perhaps this is an opportunity to become one. One of Rexx's design goals was to make programming easier.
An alternative would be to use Perl, copying the PDS members to the Unix file system so you can process them, then copying them back when you're done. This presumes you're running a relatively current release of z/OS and your Systems Programmer(s) have installed the z/OS port of Perl, which is a no-cost item.
If you're willing to copy the affected members to the Unix file system, you may be able to do this with awk. I've only dabbled with awk, but it has the advantage of just being there by default, no one would have to install anything (Perl) that isn't already there by default.
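As an illustration of how simple the exact-match case is once the members are on the Unix file system, here is a minimal Python sketch keyed on the example from the question (it deliberately ignores the harder cases, such as continuation lines):

```python
# One-line-to-many replacement over a JCL member copied to a Unix
# file.  The trigger line and replacement block come from the
# example in the question; extend the table for further cases.
REPLACEMENTS = {
    "//STEP1 EXEC PGM=DUMY,PARAM=XPTO": [
        "//STEP1 EXEC PGM=WORKS,PARAM=THAT",
        "//SOMEDD DD DSN=DSN.WITH.SOMETHING",
        "//SYSTSIN DD *",
        "SOME MORE PARAMETERS",
    ],
}

def expand(lines):
    """Replace each matching line with its multi-line block."""
    out = []
    for line in lines:
        key = line.rstrip()
        if key in REPLACEMENTS:
            out.extend(REPLACEMENTS[key])
        else:
            out.append(key)
    return out
```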
Here are the possibilities that I have in mind:
You can write a simple COBOL program which would search for the required STRING and replace it with whatever you want/need to add.
You can also write a REXX EXEC to perform this, which may not need to parse the line of code that gets read. A simple IF condition would do, I suppose.
But here are some challenges you would face, which are of course avoidable.
What if some other parameters exist along with what you search for? Like:
//STEP1 EXEC PGM=DUMY,PARAM=XPTO,PARM1='X'
What if the search string spans more than one line? Like:
//STEP1 EXEC PGM=DUMY,
// PARAM=XPTO
Here is a simple TSO/ISPF edit macro that will implement your example. Of course this is very crude but serves as an example of how JCL might be edited.
ISREDIT MACRO ()
CONTROL NOFLUSH NOPROMPT LIST CONLIST SYMLIST MSG
ISREDIT CHANGE ' PGM=DUMY' ' PGM=WORKS'
ISREDIT CHANGE 'XPTO' 'THAT'
ISREDIT (ROW1,COL1) = CURSOR
ISREDIT LINE_AFTER &ROW1 = "//SOMEDD DD DSN=DSN.WITH.SOMETHING,DISP=SHR"
SET &ROW1 = &ROW1 + 1
ISREDIT LINE_AFTER &ROW1 = "//SYSTSIN DD *"
SET &ROW1 = &ROW1 + 1
ISREDIT LINE_AFTER &ROW1 = "SOME MORE PARAMETERS"
EXIT CODE(0)

Avoiding exploit in Perl variable interpolation from file

I am optimizing a very time/memory-consuming program by running it over a dataset and under multiple parameters. For each "run", I have a csv file, "setup.csv", set up with "runNumber","Command" for each run. I then import this into a Perl script to read the command for the run number I would like, interpolate the variables, then execute it on the system via the system command. Should I be worried about the potential for this to be exploited (I am worried right now)? If so, what can I do to protect our server? My plan now is to change the file permissions of "setup.csv" to read-only and ownership to root, then go in as root whenever I need to append another run to the list.
Thank you very much for your time.
Run your code in taint mode with -T. That will force you to carefully launder your data. Only pass through strings that are ones you are expecting. Do not launder with .*, but rather check against a list of good strings.
Ideally, there is a list of known acceptable values, and you validate against that.
Either way, you want to avoid the shell by using the multi-argument form of system or by using IPC::System::Simple's systemx.
If you can't avoid the shell, you must properly convert the text to pass to the command into shell literals.
Even then, you have to be careful of values that start with -. Lots of tools accept -- to denote the end of options, allowing other values to be passed safely.
Finally, you might want to make sure the args don't contain the NUL character (\0).
systemx('tool', '--', @args)
Note: Passing arbitrary strings is not possible in Windows. Extra validation is required.
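The same pattern translates to other languages; here is a Python sketch of the allow-list plus no-shell approach (the allow-list contents are placeholders):

```python
import subprocess

# Known-good commands only; this allow-list is a placeholder.
ALLOWED = {"true", "sort"}

def run_safely(cmd, args):
    """Run cmd with args without ever touching a shell."""
    # Validate against known-good values rather than laundering
    # with a permissive pattern like .*
    if cmd not in ALLOWED:
        raise ValueError("command not in allow-list: %r" % cmd)
    for a in args:
        # Reject NUL characters, which can truncate arguments in C code.
        if "\0" in a:
            raise ValueError("NUL character in argument")
    # The list form bypasses the shell entirely (the analogue of
    # Perl's multi-argument system / IPC::System::Simple's systemx);
    # '--' marks the end of options so a leading '-' in an argument
    # cannot be taken as a flag.
    return subprocess.run([cmd, "--", *args], check=True)
```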

ANTLR, C-styled macro definitions

What is the easiest (or the best) way to implement macro definitions in ANTLR?
What I want is a mechanism similar to the one that is present in C/C++ language:
#define FUNCTION(a, b) a+b
#define PI 3.1415
How and when should I perform replacement?
If you are doing a pre-processor in the style of C, then you will want to do a separate first pass for pre-processing (which is what this term means - a processing pass before your standard lex/parse pass).
Exactly how you want to do the pass is up to you - you can pass your input text to one grammar in antlr, take the result and hand that off to another grammar, etc.
Or you can create separate programs, which are able to take input on stdin and output to stdout, or pass text between pipes, etc.
Once you have that worked out, it's a simple matter of looking for your keywords. Check every token that you see against your table of #defines, and if it matches, replace it with the definition that you have. You will also have to be able to parse function parameters, but that shouldn't add too much effort.
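For the object-like case (PI above), the lookup-and-replace pass can be sketched in a few lines; this Python version is illustrative only, and function-like macros would need the parameter parsing just mentioned:

```python
import re

# An identifier token: letter or underscore, then word characters.
IDENT = re.compile(r"\b[A-Za-z_]\w*\b")

def expand_defines(text, defines, max_passes=10):
    """Expand object-like macros, e.g. {'PI': '3.1415'}.
    Repeats until stable so macros may expand to other macros;
    function-like macros such as FUNCTION(a, b) are not handled.
    """
    for _ in range(max_passes):
        new = IDENT.sub(lambda m: defines.get(m.group(0), m.group(0)), text)
        if new == text:
            return new
        text = new
    return text
```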

Remote Informix 11.5 Command Line Client

Does a command line tool ship with Informix 11.5 similar to SQLCMD for SQL Server?
If yes, how do I connect to a remote server and perform regular SELECT/INSERT/UPDATE queries using it?
As Michal Niklas says, the standard tool provided with IBM Informix Dynamic Server (colloquially IDS or even just Informix) is DB-Access. However, it is distributed only with IDS itself, not with the Informix Client SDK (CSDK) or Informix Connect (I-Connect) products.
If you want to access IDS from a machine that does not have IDS installed, you need either CSDK or I-Connect on the machine, and some other software - perhaps the original (pre-Microsoft by a decade and more) version of SQLCMD. This is what I use - and have used in various versions over the last (cough, splutter, ouch) twenty-two years or so; I wrote it because I didn't like the command line behaviour of a program called isql (part of the product Informix SQL), which was the precursor to DB-Access. (Lots of history - not too important to you.)
Usage - SQLCMD has more options than you know what to do with. The basics are simple, though:
sqlcmd -d dbname@dbserver -e 'select * from table' -x -f file.sql
This connects to a database called 'dbname' at the database server known as 'dbserver', as specified in the sqlhosts file (normally $INFORMIXDIR/etc/sqlhosts). The '-e' indicates an SQL expression - a select statement; the results will be printed to standard output in a strict format (Informix UNLOAD format), one logical line per record. The '-x' turns on trace mode; the '-f' option means read the named file for further commands. The '.sql' extension is not mandatory (beware: DB-Access requires the '.sql' extension and will add it for you). (Arguments not prefixed by either '-e' or '-f' are interpreted heuristically; if one contains spaces, it is SQL; if it does not, it is a filename.)
The '-H' option prints column headings (labels) before a result set; the '-T' option prints the column types (after the headings, before the results). The '-B' option runs in benchmark mode; it turns on trace, prints the statement and the time when the statement started, and times how long it took. (Knowing when the statement started is helpful if the SQL takes many minutes to run - as it can in benchmarking scenarios.)
There are controls over the output format (including CSV and even a variant of XML - but not an XML using namespaces) and date format, and so on. There are 'built-in' commands to redirect input and output and errors; most command line options can also be used in the interpreter, etc. SQLCMD also provides a history mechanism; it saves SQL statements and you can view, edit or rerun them. In conjunction with output redirection, you can save off a list of statements executed, etc.
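As I understand it, a record in UNLOAD format is pipe-delimited with a trailing delimiter, and a backslash escapes a literal '|' or '\' within a field; a minimal Python sketch of splitting one such record (embedded newlines are not handled):

```python
def parse_unload_line(line):
    """Split one Informix UNLOAD-format record into fields.

    Fields are '|'-delimited with a trailing '|', and a backslash
    escapes the next character (so '\\|' is a literal pipe).
    """
    fields, cur, i = [], [], 0
    s = line.rstrip("\n")
    while i < len(s):
        c = s[i]
        if c == "\\" and i + 1 < len(s):
            cur.append(s[i + 1])   # take escaped character literally
            i += 2
        elif c == "|":
            fields.append("".join(cur))  # delimiter closes a field
            cur = []
            i += 1
        else:
            cur.append(c)
            i += 1
    return fields
```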
The only gotcha with SQLCMD is that it is not currently ported to Windows. It did work on Windows once upon about 6 or 7 years ago. Since then, Microsoft's compilers have gotten antsy about non-MS API functions, insisting that even if I ask for them by name (by requesting POSIX functionality), the functions must be prefixed by an underscore, and by deprecating a bunch of functions that can be used safely if you pay attention to what you are doing (but, regrettably, can be abused by those who are not paying attention, and there are more inattentive than attentive coders around, it seems) - I mean functions like 'strcpy()' which can be used perfectly safely if you know the size of the source and destination strings before you call it. It is on my list of things to do - it just hasn't been done yet because it isn't my itch.
There is also another Open Source tool called SQSL that you can consider. It has some advantages over SQLCMD (conditional logic, etc) but I think SQLCMD has some advantages over SQSL.
You could also consider whether Perl + DBI + DBD::Informix + dbish would work for you.
Try DB-Access
...
DB-Access provides a user interface for entering, executing, and debugging Structured Query Language (SQL) statements and Stored Procedure Language (SPL) routines...