How to access Library, File, and Field descriptions in DB2? - db2

I would like to write a query against the IBM DB2 system catalog tables (e.g. SYSIBM) that exports the following:
LIBRARY_NAME, LIBRARY_DESC, FILE_NAME, FILE_DESC, FIELD_NAME, FIELD_DESC
I can access the descriptions via the UI, but wanted to generate a dynamic query.
Thanks.

Along with SYSTABLES and SYSCOLUMNS, there is also SYSSCHEMAS, which appears to contain the data you need. Please note that accessing this information through QSYS2 will restrict the rows returned to those objects to which you have some access - the SYSIBM schema appears to disregard this (check the reference - for V6R1 it's about page 1267).
You also shouldn't need to retrieve this with a dynamic query - static SQL with host variables (if necessary) will work just fine.
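For reference, here is a minimal sketch of what such a catalog query could look like on DB2 for i. The join keys are standard, but the description columns (SCHEMA_TEXT, TABLE_TEXT, COLUMN_TEXT) and the library filter are assumptions here - verify the exact names against the SQL Reference for your release.

SELECT s.SCHEMA_NAME  AS LIBRARY_NAME,
       s.SCHEMA_TEXT  AS LIBRARY_DESC,   -- assumed description column
       t.TABLE_NAME   AS FILE_NAME,
       t.TABLE_TEXT   AS FILE_DESC,      -- assumed description column
       c.COLUMN_NAME  AS FIELD_NAME,
       c.COLUMN_TEXT  AS FIELD_DESC      -- assumed description column
FROM QSYS2.SYSSCHEMAS s
JOIN QSYS2.SYSTABLES  t ON t.TABLE_SCHEMA = s.SCHEMA_NAME
JOIN QSYS2.SYSCOLUMNS c ON c.TABLE_SCHEMA = t.TABLE_SCHEMA
                       AND c.TABLE_NAME   = t.TABLE_NAME
WHERE s.SCHEMA_NAME = 'MYLIB'            -- hypothetical library filter
ORDER BY t.TABLE_NAME, c.ORDINAL_POSITION;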

Related

How to filter prometheus series based on the results of another query in grafana dashboard?

I am using Grafana 9.3.1 for monitoring of our system. Among other things, I am trying to monitor the remaining FUP of a phone number for each unit we operate.
Basically, we intend to use two data sources.
Database mapping of the unit ID to its phone number (e.g. unit_id=123, phone_number="00 123456789")
Prometheus time series remaining_fup{phone_number="00 123456789"}. However, remaining_fup is third-party data and does not include unit_id.
In my unit-detail dashboard I have a unit_id variable which indicates which unit's FUP should be displayed (among other things that depend on unit_id).
My original approach was this:
Create a mixed datasource dashboard
Add database datasource as data A. SELECT phone_number FROM units WHERE unit_id='$unit_id'
Add prometheus datasource remaining_fup and filter it based on A.phone_number: remaining_fup{phone_number="${A.phone_number}"}
Unfortunately, such use of A isn't supported. I hoped to apply some transformation like Merge or Join by field and then Filter, but with no success. After a lot of googling and trying I feel stuck.
Could you help please? Is such filter even possible? Thanks!
TL;DR: In a Grafana dashboard I want to query one datasource in order to obtain a value which I subsequently want to use in another datasource's query.
1.) Create a variable - name phone_number, type Query - and query your database datasource: SELECT phone_number FROM units WHERE unit_id='$unit_id'. You can hide this variable if you don't want it to be visible to dashboard users.
2.) The variable phone_number may have multiple values, so use advanced variable formatting to create valid regex query syntax for your Prometheus datasource, e.g.
remaining_fup{phone_number=~"${phone_number:pipe}"}
Of course these queries are just examples and they may need some (syntax) tweaking for your use case. Main idea: don't use two queries, but one variable and one query (where you use that variable).

How to set parameters in a SQL Server table from a Copy Data Activity - Source: XML / Sink: SQL Server Table / Mapping: XML column

I have a question; hopefully someone in the forum can give some help here. I am able to pull data from a SOAP API call into a SQL Server table (an xml data type field, actually) via a Copy Data Activity. The pipeline that runs this process is metadata driven, so how could I write other parameters into the same SQL Server table for the same run? I am using a Copy Data Activity to load XML data into the SQL Server table, but in the Mapping tab I am not able to select other parameters in order to point them to other SQL table columns.
In addition, I am using a ForEach Activity so that the Copy Data Activity iterates over several values of one column in the SQL Server table.
I would appreciate any advice on this.
Thanks
David
Thank you for your interest; I will try to be more explicit. Given the current scenario, how could I pass the StoreId and CustomerNumber parameters to the table Stage.XmlDataTable?
Take into account that in the mapping step I am only able to map the XML data from the current API call and then write it into the Stage.XmlDataTable XmlData column.
Thanks in advance David
You can add your parameters using Additional Columns in the Copy data activity Source.
When you import the schema in the Mapping tab, you can see the additional columns added to the source.
Refer to this MS document for more details on adding additional columns during the copy.
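For illustration, here is a sketch of what the sink table might look like (the table and column names come from the question; the data types are assumptions). The Additional Columns defined in the Source tab (e.g. StoreId and CustomerNumber) then show up in the Mapping tab and can be pointed at the matching sink columns alongside the XML payload:

CREATE TABLE Stage.XmlDataTable
(
    StoreId        INT           NULL,  -- fed by an Additional Column in the Source tab
    CustomerNumber NVARCHAR(50)  NULL,  -- fed by an Additional Column in the Source tab
    XmlData        XML           NULL   -- the API response mapped by the Copy Data activity
);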

Prioritise which identifier to use

My crystal report pulls data about books, including an identifier (isbn, issn order number etc.), author, and publisher.
The ID field stores multiple ways to identify the book. The report displays any of the identifiers for that record. If one book has two identifiers, say an ISSN and an order number, the report currently displays one apparently at random.
How can I make it prioritise which type to use based on a preset order? I figured some sort of filter on the field could work, but I haven't figured out how. I can't edit the table, but I can use SQL within the report.
If all the different types of ID are stored in a single field, your best bet is to use a SQL Command inside your report to separate them into multiple virtual fields.
Go to Database Fields / Database Expert, expand the connection you want to use, and pick Add Command. From here you can write a custom SQL statement to grab the information you're currently using and, at the same time, separate the ID field into multiple different fields (as far as the report is concerned, anyway; the table itself stays unchanged).
The trick is to figure out how to write your command to do the separation. We don't know what your data looks like, so you're on your own from here.
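Purely as an illustration, assuming a hypothetical layout with one row per identifier and columns book_id, id_type ('ISBN', 'ISSN', 'ORDER_NO') and id_value, the command could pivot the identifiers into separate virtual fields like this:

SELECT book_id,
       MAX(CASE WHEN id_type = 'ISBN'     THEN id_value END) AS isbn,
       MAX(CASE WHEN id_type = 'ISSN'     THEN id_value END) AS issn,
       MAX(CASE WHEN id_type = 'ORDER_NO' THEN id_value END) AS order_no
FROM book_identifiers            -- made-up table and column names
GROUP BY book_id;

The report (or a COALESCE(isbn, issn, order_no) in the same command) can then pick the identifiers in whatever priority order you want.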
Based on the very little information that you have provided, and if I were to make a guess, I suggest you make use of a formula field in your report and then use something like this to accomplish your goal:
IF ISNULL({first_priority_field_name}) OR {first_priority_field_name} = '' THEN
{second_priority_field_name}
ELSE
{first_priority_field_name}
Use nested IF statements in case there are more than two identifier fields.

Using RESTful Oracle APEX

I am building a mobile app using the Appery.io platform, which uses a MongoDB-based database. I need to link this DB to an Oracle database and use APEX to design an interface such that users can query and update the mobile app DB from Oracle, and the Oracle DB can be updated from the mobile app.
In APEX, I use the URI with GET method:
https://api.appery.io/rest/1/db/collections/Outlet_Details/
And I add the header:
X-Appery-Database-Id
When I run the query in APEX with the Database-Id inserted, APEX shows the table/collection Outlet_Details in JSON format. However, not the entire table is shown, due to, I think, the length of the CLOB type.
Now my main problem is that I need to query this table/collection called Outlet_Details by a column named _id. So when I use the following URI:
https://api.appery.io/rest/1/db/collections/Outlet_Details/1234
It returns the specific record that has _id = 1234. However, I do not want to hardcode it. Instead, I need something more like a where condition so that I can query based on any column value (e.g. userId instead of _id). The curl command is as follows:
curl -X GET \
  -H "X-Appery-Database-Id: 544a5cdfe4b03d005b6233b9" \
  -G --data-urlencode 'where={"userId": "1234"}' \
  https://api.appery.io/rest/1/db/collections/outlet_details/
My problem is how to insert such a command into APEX, especially the where part.
In this tutorial, an Oracle database is used, hence using a where condition such as =:DEP and then binding it to a variable is pretty straightforward. However, I need to replicate this tutorial with my MongoDB.
The other question, which I guess would clarify a lot for me: in the aforementioned tutorial, there is a prefix URI that is by default the APEX schema URI. Even when I insert a different URI template, the resulting URI will append the APEX one to the one I inserted. How can I build a service there using a different URI?
I found that APEX takes the where condition as an encoded parameter in the URL. Something like:
https://api.appery.io/rest/1/db/collections/Outlet_Details?where=%7B%22Oracle_Flag%22%3A%22Y%22%7D
The header is the same, and there are no input parameters.
This can be done from Application Builder > New Application > Database > Create Application > Shared Components > Create > REST, and then start inserting the header, URL, etc.
You can refer to this link as a reference for the encoded URL.
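If you would rather call the endpoint from PL/SQL (for example to test the where clause before wiring it into the REST service), a rough sketch with APEX_WEB_SERVICE could look like the following. The database ID and where value are the ones from the question; if your APEX version does not URL-encode the parameter automatically, pre-encode it with APEX_UTIL.URL_ENCODE and append it to p_url instead:

DECLARE
    l_response    CLOB;
    l_parm_names  apex_application_global.vc_arr2;
    l_parm_values apex_application_global.vc_arr2;
BEGIN
    -- header required by Appery.io (value taken from the question)
    apex_web_service.g_request_headers(1).name  := 'X-Appery-Database-Id';
    apex_web_service.g_request_headers(1).value := '544a5cdfe4b03d005b6233b9';

    l_parm_names(1)  := 'where';
    l_parm_values(1) := '{"userId": "1234"}';   -- the Appery.io where clause

    l_response := apex_web_service.make_rest_request(
        p_url         => 'https://api.appery.io/rest/1/db/collections/Outlet_Details',
        p_http_method => 'GET',
        p_parm_name   => l_parm_names,
        p_parm_value  => l_parm_values);

    dbms_output.put_line(dbms_lob.substr(l_response, 4000, 1));
END;
/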

coldfusion - bind a form to the database

I have a large table which inserts data into the database. The problem is when the user edits the table I have to:
run the query
use lots of lines like value="<cfoutput>#getData.firstname#</cfoutput>" in the input boxes.
Is there a way to bind the form input boxes to the database via a cfc or cfm file?
Many Thanks,
R
Query objects include the columnList, which is a comma-delimited list of returned columns.
If security and readability aren't an issue, you can always loop over this. However, it basically removes your opportunity to do things like locking certain columns, reduces your ability to do any validation, and means you either just label the form boxes with the column names or you find a way to store labels for each column.
You can then do an insert/update/whatever with them.
I don't recommend this, as it would be nearly impossible to secure, but it might get you where you are going.
If you are using CF 9 you can use the ORM (Object Relational Mapping) functionality (via CFCs)
as described in this online chapter
https://www.packtpub.com/sites/default/files/0249-chapter-4-ORM-Database-Interaction.pdf
(starting on page 6 of the pdf)
Take a look at <cfgrid>; it will be the easiest if you're editing a table, and it can fire one update per row.
For security against XSS, you should use <input value="#xmlFormat(getData.firstname)#"> and minimize the number of <cfoutput> tags. XmlFormat() is not needed if you use <cfinput>.
If you are looking for an easy way to avoid specifying all the column names in the insert query, cfinsert will try to map all the form field names you submit to the database column names.
http://help.adobe.com/en_US/ColdFusion/9.0/CFMLRef/WSc3ff6d0ea77859461172e0811cbec22c24-7c78.html
This is indeed a very good question. I have no doubt that the answers given so far are helpful. I was faced with the same problem, only my table does not have that many fields though.
Per the EntityNew() docs, the syntax shows that you can include the data when instantiating the object:
artistObj = entityNew("Artists",{FirstName="Tom",LastName="Ron"});
instead of having to instantiate and then add the data field by field. In my case, all I had to do was:
artistObj = entityNew( "Artists", FORM );
EntitySave( artistObj );
ORMFlush();
NOTE
It does appear from your question that you may be running insert or update queries. When using ORM you do not need to do that. But I may be mistaken.