This question is about logical files (LFs) on the AS/400 and views in IBM i DB2.
I want to know whether there is a way to programmatically generate an SQL statement (or better, directly a view) from a logical file.
For now, I look up the LF description (with the WRKF command) and emulate the conditions of the LF by hand.
My knowledge so far is that logical files can always be converted into SQL SELECTs manually... but if there is a pitfall that makes a programmatic conversion impossible, I want to know why.
I'm searching for something like a built-in IBM i method for this, rather than scripting my own program.
Not a programmatic solution, but in System i Navigator go to the schema you are working with and open the Views section. For any of these that are logical files, you can right-click and select 'Generate SQL'. You can select multiple objects and it will generate the SQL for all of them. There is an option to write the output to a file.
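If you do want something programmatic, recent IBM i releases also expose this through SQL; a minimal sketch, assuming the QSYS2.GENERATE_SQL stored procedure is available on your release and that MYLF in MYLIB is your logical file (both names are placeholders):

-- Ask DB2 for i to reverse-engineer DDL for the logical file.
-- 'VIEW' suits a non-keyed LF; a keyed LF may generate as an index instead.
CALL QSYS2.GENERATE_SQL('MYLF', 'MYLIB', 'VIEW')

Check the procedure's optional parameters to control where the generated source is written.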
Is there a way to output my SQL result directly to wherever the screen cursor currently is? Example: if I execute "select 1", I want the result 1 to immediately populate where my cursor is (like a form fill). So if I place my cursor in an open Notepad file and execute the query, it populates the 1 directly, as though it were typed in. I need to achieve this entirely through an SQL query, without C# or anything else.
I'm pretty sure many won't approve (if it's even possible), but I do have my reasons for wanting to achieve this.
In short: no.
Longer: I am sure you can't do this without using other programming languages.
A database's purpose is to help application developers store and efficiently retrieve data. It is a kind of utility for developers that does nothing useful without a separate (let's call it "front-end") program that inserts data into it and requests data from it.
My guess is you need a standalone program that waits for a special key combination, asks the database for the data, detects the active window and UI element, and pastes the data in there.
I would like to change the properties of multiple diagrams together rather than clicking on them one by one. Does anyone know how this can be achieved?
You can use the scripting facility of Enterprise Architect to loop over the diagrams you would like to change and update them.
See this section of the manual to get help.
There are a bunch of example scripts included with EA, either in the local scripts or in the EAScriptLib MDG.
Another source of examples is my Github repository: https://github.com/GeertBellekens/Enterprise-Architect-VBScript-Library
You could write SQL to manipulate your database directly. t_diagram.PDATA holds a long, cryptic string, one part of which is ScalePI=0; (the default, meaning no scaling). You can alter that to ScalePI=1; (meaning scale to one page).
String manipulation functions vary from database to database, so you need to write your own statement, which you can execute in a script using
Repository.Execute("UPDATE t_diagram ...")
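As a concrete sketch, assuming a repository database whose REPLACE function works this way (e.g. SQL Server or MySQL), the statement passed to Repository.Execute could look like:

-- Switch every diagram from no scaling to scale-to-one-page.
UPDATE t_diagram
SET PDATA = REPLACE(PDATA, 'ScalePI=0;', 'ScalePI=1;')
WHERE PDATA LIKE '%ScalePI=0;%'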
Note that you should test this in a sandbox first, since invalid SQL can easily corrupt your whole repository.
Scenario: A computed property needs to be available for RAW methods. The IsComputed property set in the model will not work, as its value will not be available to RAW methods.
Attempted Solution: Create a computed column directly on the SQL table, as opposed to setting the IsComputed property in the model, and specify that CodeFluent Entities must not overwrite the computed column. I would then expect the BOM to read the computed SQL field no differently than a normal database field.
Problem: I can't figure out how to prevent CodeFluent Entities from overwriting the computed column. I attempted to use the production flags as well as setting produce="false" for the property in the .cfp. Neither worked.
Question: Is it possible to prevent CodeFluent Entities from overwriting my computed column, and if so, how?
The solution you're looking for is here.
You can execute whatever custom T-SQL scripts you like; the only requirement is to give the script a specific name so the producer knows when to execute it.
For example, if you want your custom script to execute after the tables are generated, name your script
after_[ProjectName]_tables.
Save your custom T-SQL file alongside the CodeFluent-generated files and build the project.
In my specific case, I had to enable a full-text index on one of my table columns, so I wrote the SQL script for that functionality and saved it as
`after_[ProjectName]_relations_add`.
Alternate Solution: An alternate solution is to execute the following T-SQL script after the SQL Producer finishes generating.
ALTER TABLE PunchCard DROP COLUMN PunchCard_CompanyCodeCalculated
GO
ALTER TABLE PunchCard
ADD PunchCard_CompanyCodeCalculated AS CASE
WHEN PunchCard_CompanyCodeAdjusted IS NOT NULL THEN PunchCard_CompanyCodeAdjusted
ELSE PunchCard_CompanyCode
END
GO
Additional Configuration Needed to Make Solution Work: For this solution to work, one must also configure the BOM so that it does not attempt to save the data associated with the computed column. This can be done in the model using the advanced properties: in my case I selected the CompanyCodeCalculated property, went to advanced settings, and set the Save setting to False.
Question: Somewhere in the Knowledge Center there is a passing reference on how to automate the execution of SQL scripts after the SQL Producer finishes, but I cannot find it. Anybody know how this is done?
Post Usage Comments: Just wanted to let people know I implemented this approach and am so far happy with the results.
Tableau is an excellent tool for visualizing data. However, it is designed to be the final stop in a data (ETL) pipeline.
My Tableau workbook uses a bunch of Table Calcs to generate a list of "recommended orders". Rather than view these, I want to automate and execute them. This would make Tableau the engine of a quasi-ML process.
In other words, I would like to make Tableau a part of my ETL pipeline and send data to another tier. How can I write a back-end program that executes my Tableau workbook and receives a results dataset?
See the end of this article for example data I want to automate:
http://robm26.blogspot.com/2015/10/keep-your-factory-humming-with-tableau.html
Any ideas?
You're not going to like the answer I'm going to give you: "Don't do this".
Tableau isn't meant to be a task in a larger ETL pipeline, and the reason you're having problems making it behave the way you want is that it's not meant to be done.
Above and beyond the fact that you've figured out how to get the result you want in Tableau ("the work is done"), Tableau isn't offering you any real value in the scenario you're describing. Use a tool (like Alteryx) that is really purpose-built for this sort of work.
Another answer here is correct that tabcmd is the way to pull the data out. We use a Python function to generate the tabcmd requests so that they can be batched.
import subprocess

run_tabcmd = 'yes'  # module-level flag; set to anything else to only print commands

def runTabCmd(cmd):
    # run a tabcmd command line and display its output
    print(cmd)
    if run_tabcmd == 'yes':
        p = subprocess.Popen(
            cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        for line in p.stdout.readlines():
            print(line.decode().rstrip())
You probably already knew that, but for us it was a way to completely automate pulling the data and loading it into another Python package, like scikit-learn, for a streamlined ML solution.
I'm editing this answer to agree with Russell's answer. Tableau is not an ETL tool and should not be used as such. If you absolutely have to do something, you can use what I provided. Otherwise, the best practice is to use a tool designed for the job.
You can easily use tabcmd to get the results of a view in CSV, which can be used later in your ETL process. If you need to automate it, you can write a script and execute it with a cron job. I, myself, have a few views that are exported to CSV and used later in my ETL stream to feed our CRM.
Just remember to create the view exactly as you want it exported to CSV, usually including the order of the fields. Another tip: I don't use the default "Measure Names" and "Measure Values"; to make sure everything comes out right in my CSV, I add the fields manually in the rows/columns section.
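For reference, a minimal sketch of such an export (the server URL, credentials, and Workbook/ViewName path are placeholders for your own):

tabcmd login -s https://your-tableau-server -u yourUser -p yourPassword
tabcmd export "YourWorkbook/YourView" --csv -f /path/to/results.csv

A cron entry can then run this script on whatever schedule your ETL stream needs.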
I am trying to create a trigger that sends an email based on a database event, specifically, when a record is INSERTed in a certain table, I want an email stating that fact to go to the SysAdmin.
I can successfully do the following from a SQL window in iSeries Navigator:
CL:SNDDST TYPE(*LMSG)
TOINTNET(('sysadmin@mycompany.com'))
DSTD('this is the Subject Line')
LONGMSG('This is an Email sent from iSeries box via Navigator')
...and an email gets sent. Which means that the necessary SMTP stuff is there and working.
So all I'm trying to do is encapsulate this code, perhaps with some data changes (e.g. "A record has been added to the XYZ table on whatever-the-sysdate-is"). Navigator has some tantalizing examples that call CL to do some plain-vanilla things, but no clue as to how to make it work in a trigger. I know how to write triggers that do "database stuff", but not this CL stuff. And this is iSeries DB2, so I don't have access to UTL_MAIL.
I know next to nothing about CL, DDS or other iSeries internals... I would prefer not to have to create an external Java program, but will do that as a last resort... but even then, I'm having a hard time finding straightforward examples.
Thanks in advance.
First off, note that SNDDST isn't the best choice for internet mail from the IBM i. Basically, SNDDST is a relic from the SNADS networking days that IBM hacked into supporting SMTP emails. There are free alternatives, or if you're reasonably current on fixes for 7.1 then you should have the Send SMTP E-mail (SNDSMTPEMM) command available.
The Run SQL Scripts window of iNav does indeed support CL commands using the CL: prefix. But that's not the same thing as having the query engine itself understand CL.
The CL: prefix isn't going to work inside an SQL trigger.
You could, however, use the QCMDEXC stored procedure to call a CL command. But I wouldn't necessarily call that the best option.
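For illustration, a minimal sketch of that route, assuming the single-argument QSYS2.QCMDEXC procedure is available on your release (the recipient and message text are placeholders):

-- Run a CL command string from SQL; note the doubled single quotes inside the literal.
CALL QSYS2.QCMDEXC('SNDSMTPEMM RCP((''sysadmin@mycompany.com'')) SUBJECT(''Record added'') NOTE(''A record has been added to the XYZ table'')')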
The IBM i supports "external" stored procedures and triggers. Theoretically, you could use a CL program that invokes the SNDSMTPEMM command directly. But given your desire to include data from the table, I wouldn't recommend that approach, as you'd be tied to the table structure.
Instead, create your own UTLMAILSND CL program that invokes SNDSMTPEMM. Then define the UTLMAILSND program as an external stored procedure (you can even give it a longer SQL name of UTIL_MAIL_SEND).
Now you can call your UTIL_MAIL_SEND() procedure from your SQL trigger.
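A minimal sketch of those two pieces, assuming MYLIB/UTLMAILSND is the CL program described above and XYZ is the monitored table (all names and the parameter list are placeholders):

-- Register the CL program as an external stored procedure.
CREATE PROCEDURE UTIL_MAIL_SEND (IN P_SUBJECT VARCHAR(256), IN P_BODY VARCHAR(4000))
LANGUAGE CL
PARAMETER STYLE GENERAL
EXTERNAL NAME 'MYLIB/UTLMAILSND'

-- Call it from an SQL trigger whenever a row is inserted.
CREATE TRIGGER XYZ_INSERT_NOTIFY
AFTER INSERT ON XYZ
FOR EACH ROW
BEGIN
  CALL UTIL_MAIL_SEND('Record added', 'A record has been added to the XYZ table');
END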
You need to try the SNDSMTPEMM command. It's like sliced bread compared to SNDDST TYPE(*LMSG). It supports HTML too, which makes for a lot of fun.
Yes, I used SNDSMTPEMM (skipping the HTML for now...).
One big note, however: using this command in a CL program doesn't work when the program is called from SQL. I had to change it to a CLLE program.
So the final answer is as follows: a) an INSERT trigger on the table in question, which calls: b) an (external) PROCEDURE created in the database, which in turn calls: c) the compiled CLLE program object. Works like a charm.
p.s. I create the whole body of the email in the INSERT trigger, and pass it along, eventually to the CLLE program. This allows me to have just this one CLLE program to report on any INSERT/UPDATE/DELETE anywhere in the database.