How to capture an Error Message into a file in DataStage

Is it possible to capture the error message or the error field into a file in DataStage?
For example, if an error occurs in a Transformer stage, is it possible to capture the error and the field that caused it into a file? As of now I am able to capture the entire error record into a file, but not the error message or just the offending field.
Thanks!!!

Basically, no; there is certainly no generic solution. You could create a reject link from the Transformer stage, but even that is limited in what it can capture.
I suspect you would be better served by reading the error information from the job log and processing that.
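If your installation includes the dsjob command-line client, one way to process log information is to dump it to a file for downstream parsing (the project name, job name, and event id below are placeholders):

```sh
# Summarise recent warning events for a job (names are examples).
dsjob -logsum -type WARNING -max 50 MyProject MyJob > warnings.txt

# Dump the full text of a single log event by its event id.
dsjob -logdetail MyProject MyJob 123 >> warnings.txt
```

The resulting text file can then be parsed by a follow-on job or script to extract the message text you need.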

Related

Getting Error message while searching for product on Field service app Salesforce

We're installing FSL and have run into a problem with the FSL Mobile App when we go to consume products.
The error message is:
Product2.Material_Code__c like '%search term entered%' or Product2.Product_Image__c like '%search term entered%'
ERROR at Row:1:Column:368 field 'Product_Image__c' can not be filtered in a query call.
In the error message above, %search term entered% is whatever we searched on, not part of the error message.
Product_Image__c is a pre-existing rich text area custom field on the Product2 object.
Does anyone know how this field got into the SOQL, and how we can take it out or otherwise resolve this error?
[Screenshot: error message on product search]
Regards
Anushka

How to avoid Mongo DB NoSQL blind (sleep) injection

While scanning my application for vulnerabilities, I got one high-risk finding:
Blind MongoDB NoSQL Injection
I checked exactly what request the scanning tool sent to the database and found that it had appended the following to a GET request:
{"$where":"sleep(181000);return 1;"}
The scan received a "Time Out" response, which indicates that the injected "sleep" command succeeded.
I need help fixing this vulnerability. Can anyone help me out here? I just want to understand what I need to add to my code to check the input before it reaches the database.
Thanks,
Anshu
As with SQL injection, or any other type of code injection, don't copy untrusted content into a string that will be executed as a MongoDB query.
You apparently have some code in your app that naively accepts user input, or some other external content, and runs it as a MongoDB query.
Sorry, it's hard to give a more specific answer, because you haven't shown that code or described what you intended it to do.
But in general, everywhere you use external content, you have to imagine how it could be misused if it doesn't have the format you assume it does.
You must validate the content so that it can only be in the format you intend, and reject it if it isn't.
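As a minimal sketch (the function and field names are made up for illustration), one common defence in Python is to accept only plain scalar values and reject anything that could smuggle a query operator document, so that untrusted input is only ever used as data:

```python
def safe_filter(field, value):
    """Build a MongoDB equality filter from untrusted input.

    Rejects dict/list values (which could carry operators such as
    $where or $gt) and anything that looks like an operator, so the
    input can never be interpreted as query syntax or JavaScript.
    """
    if not isinstance(value, (str, int, float, bool)):
        raise ValueError("query value must be a plain scalar")
    if isinstance(value, str) and value.lstrip().startswith("$"):
        raise ValueError("operator-like value rejected")
    if not isinstance(field, str) or field.startswith("$") or "." in field:
        raise ValueError("invalid field name")
    # The value is used only as data, never interpolated into JavaScript.
    return {field: value}

# Accepted: a plain equality match.
print(safe_filter("username", "alice"))

# Rejected: an attempt to inject an operator document.
try:
    safe_filter("username", {"$where": "sleep(181000); return 1;"})
except ValueError as e:
    print("rejected:", e)
```

In addition, if your application never needs server-side JavaScript, running mongod with scripting disabled (the `--noscripting` option) removes the `$where` attack surface entirely.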

If a stage variable conversion fails, is there any possibility of capturing the data into a reject file?

We have a stage variable using DateFromDaysSince(date column) in a DataStage Transformer. Due to some invalid dates, the DataStage job fails. The source is Oracle.
When we check the dates in the table we don't find any issue, but the job fails while the transformation is happening:
Error: Invalid Date [:000-01-01] used for date_from_days_since type conversion
Is there any possibility of capturing those failing records into a reject file and making the parallel job run successfully?
Yes, it is possible.
You can use the IsValidDate or IsValidTimestamp function to check that - check out the details here.
These functions can be used in a Transformer constraint to route rows that do not have the expected type to a reject file (or a Peek stage).
When your data is retrieved from a database (as mentioned), the database already guarantees the data type - if the data is stored in the appropriate format. I suggest checking the retrieval method to avoid unnecessary checks or rejects. Different timestamp formats could be an issue.
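As a sketch (the link, column, and stage-variable names below are made up), the check can be wired into the Transformer with a stage variable and complementary link constraints, so invalid rows flow to the reject link instead of aborting the job:

```
Stage variable svIsValid:  IsValid("date", lnk_in.DATE_STR)
Output link constraint:    svIsValid
Reject link constraint:    Not(svIsValid)
```

Note that the validity check applies to the raw string form of the value; perform it before the DateFromDaysSince conversion, not after.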

Insert record into custom log table

I have a task to insert error messages into a custom log table when something goes wrong - invalid credentials, source data in an incorrect format, and so on. Accordingly, I have two questions:
how to catch error messages from components
how to insert a message into a specific table in the DB
thanks in advance
First part -
To catch error messages in Talend there are a few components to use, such as tDie and tLogCatcher. Briefly: when your job encounters an error at any component, connect that component to tDie through an "On Component Error" or "On Subjob Error" trigger; "Run if" can also be used if you want to specify a condition on which the job should error out. Then, in tLogCatcher, enable the "Catch tDie" option so that all those errors are caught there with the relevant log details. You can also select the "Catch Java Exception" option.
Second part -
Now connect tLogCatcher to your DB output component (e.g. tMSSqlOutput, tOracleOutput, etc.), and there, under "Basic settings", choose the table where you want the records from tLogCatcher to be written.

What does Template:${message.encodedData} mean in mirth?

I am trying to learn a Mirth system with a channel that pulls from a database for its source and outputs HL7 messages for its destination(s). The SQL query pulls the correct data from the source, but Mirth does not output all of the data in the right spots in the HL7 message. The destinations show that the output is Template:${message.encodedData}. What does that mean? Where can I see the template it is using? The destinations don't have any filters or transformers, so I am confused.
message.encodedData is the fully transformed message - after any transformation steps.
The transformer is also where you can specify the output template for how you want the data to look. Load a sample template message into the output template of the transformer (the message template tab in the transformer) and then create a series of message builder steps. Your output message will be in the variable tmp, and your SQL results will be in the variable msg.
So, if your first column is patientID (SELECT patientID AS patientID ...), you would create a message builder step along the lines of
mapped segment: tmp['PID']['PID.3']['PID.3.2']
mapping: msg['patientID'];
I don't have exact syntax in front of me right now, but that's the basic idea.
I think "transformed" is the status of the message right after the transformers are executed, and the "encoded" message is the status after the message coming out of the transformers is encoded into the channel's outbound data type. In some cases those messages will be the same, but not in all cases.
Also, it is very difficult to find updated and comprehensive Mirth documentation.