Insert record into custom log table - Talend

I have a task to insert error messages into a custom log table when something goes wrong - for example, invalid credentials or source data in an incorrect format. With that in mind, I have two questions:
how to catch error messages from components
how to insert the message into a specific table in the DB
Thanks in advance.

First part -
To catch error messages in Talend there are a few components you need, such as tDie and tLogCatcher. Briefly: when any component in your job encounters an error, connect that component to a tDie using an "On Component Error" or "On Subjob Error" trigger; a "Run if" trigger can also be used if you want to specify a condition on which the job should fail. Then, in tLogCatcher, enable the "Catch tDie" option so that all these errors are caught there with the relevant log details. You can also select the "Catch Java Exception" option.
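Roughly, the "Die message" expression of the tDie can carry the failing component's own error text via globalMap. This is only a sketch: the component name tFileInputDelimited_1 and its "..._ERROR_MESSAGE" key are examples to adapt (not every component publishes that key, so check in your Studio version), and in a real job globalMap is provided by Talend's generated code rather than built by hand as in the tiny main() below.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch of a tDie "Die message" expression. The component name
// (tFileInputDelimited_1) is only an example; substitute the component your
// OnComponentError trigger comes from. The main() fakes globalMap purely so
// the expression can be seen building the message.
public class DieMessageSketch {
    public static void main(String[] args) {
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("tFileInputDelimited_1_ERROR_MESSAGE", "File not found: input.csv");

        // This is the kind of expression you would paste into the tDie "Die message" field:
        String dieMessage = "Job failed in tFileInputDelimited_1: "
                + ((String) globalMap.get("tFileInputDelimited_1_ERROR_MESSAGE"));

        System.out.println(dieMessage);
    }
}
```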
Second part -
Now connect tLogCatcher to your DB output component (e.g. tMSSqlOutput, tOracleOutput, etc.); under its "Basic settings" you can choose/enter the table where the records from tLogCatcher should be written.
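For reference, this is a sketch of the columns tLogCatcher typically emits (from memory of Talend Open Studio, so verify against the component's schema in your version); the custom log table behind the DB output component needs compatible columns, or a tMap in between to rename or drop fields.

```java
// Hedged sketch: the standard tLogCatcher output columns as typically generated
// by Talend Open Studio. Names/order may differ between versions, so check the
// component's schema in your job before creating the target table.
public class TLogCatcherColumns {
    static final String[] COLUMNS = {
        "moment",     // timestamp of the event
        "pid",        // process id of the job execution
        "root_pid",   // pid of the root job (differs when child jobs are involved)
        "father_pid", // pid of the parent job
        "project",    // Talend project name
        "job",        // job name
        "context",    // context (environment) used for the run
        "priority",   // numeric priority of the message
        "type",       // origin type, e.g. tDie / tWarn / Java exception
        "origin",     // component that raised the message
        "message",    // the error/log message itself
        "code"        // error code
    };

    public static void main(String[] args) {
        for (String column : COLUMNS) {
            System.out.println(column);
        }
    }
}
```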

Related

How to capture an Error Message into a file in DataStage

Is it possible to capture the error message/error field into a file in DataStage?
For example, if some error occurs in a Transformer stage, is it possible to capture the error and the field which had the error into a file? As of now, I am able to capture the entire error record into a file, but not the error message or just the error field.
Thanks!!!
Basically, no. Certainly there is no generic solution. You could create a rejects link from the Transformer stage, but even that is limited in its capability.
I suspect you would be better served reading the error information from the job log, and processing that.
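If you go the job-log route, one option is to call the dsjob command-line client and process its output. The sketch below is only illustrative: it assumes dsjob is on the PATH of the machine running it, "MyProject" and "MyJob" are placeholder names, and the exact options (-logsum, -type, etc.) should be checked against your DataStage version; it just prints the raw log summary rather than doing any real parsing.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Hedged sketch: read a DataStage job's log summary via the dsjob CLI and print it.
// "MyProject" and "MyJob" are placeholders; verify the dsjob options available on
// your engine tier before relying on this.
public class ReadJobLog {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                "dsjob", "-logsum", "-type", "WARNING", "MyProject", "MyJob");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Each line holds an event id, timestamp, type and message text;
                // real processing would split these out and write them to a file or table.
                System.out.println(line);
            }
        }
        p.waitFor();
    }
}
```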

Weird "data has been changed" issue

I'm experiencing a very weird issue with "data has been changed" errors.
I use MS Access as a frontend and PostgreSQL as the backend. The backend used to be in MS Access and there were no issues; then it was moved to SQL Server and there were no issues there either. The problem started when I moved to PostgreSQL.
I have a table called Orders and a table called Job. Each order has multiple jobs. I have 2 forms: one parent form for the Order and one subform for the Jobs (continuous form). I put the subform in a separate tab; the first tab contains general order information and the second tab has the Job information. Job is connected to Orders using a foreign key called OrderID; the Id of Orders is equal to OrderID in Job.
Here is my problem:
I enter some information in the first tab (customer name, dates, etc.), then move to the second tab, do nothing there, go back to the first one and change a date. I get the "The data has been changed" error.
I'm confused as to why this is happening. Now, why do I call this weird?
First, if I put the subform on the first tab, I can change all fields of Orders just fine. It's only if I put it on the second tab, add some info, change tabs, then go back and change an already existing value that I get the error.
Second, if I make the subform on the second tab unbound (so no Id-OrderID connection), I get the SAME error.
Third, the "usual" id for "The data has been changed" error is Runtime Error 440. But what I get is Runtime Error: "-2147352567 (80020009)". Searching online for this error didn't help because it can mean a lot of different things, including "The value you entered isn't valid for this field" like here:
Access Run time error - '-2147352567 (80020009)': subform
or many different results for code 80020009, but none for "the data has been changed".
MS Access 2016, PostgreSQL 12.4.1
I'm guessing you are using ODBC to connect Access to PostgreSQL. If so, do you have timestamp fields in the data you are working with? I have seen the above happen because a PostgreSQL timestamp can have higher precision than Access supports. This means that when you go to UPDATE, Access uses a truncated version of the timestamp, can't find the record, and you get the error. For this and other possible causes see the "Microsoft Applications" section of the psqlODBC FAQ:
https://odbc.postgresql.org/faq.html#6.4
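To make the truncation effect concrete, here is a small standalone illustration; the values are made up (not taken from the question), and the Java only mimics the comparison that Access effectively performs in the WHERE clause of its UPDATE, not the actual Access/ODBC code.

```java
import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;

// Hedged illustration of the precision mismatch described above.
public class TimestampPrecisionDemo {
    public static void main(String[] args) {
        // Value as stored by PostgreSQL (microsecond precision) -- example value only
        LocalDateTime stored = LocalDateTime.parse("2021-03-05T10:15:30.123456");

        // Value the Access side round-trips after truncation (milliseconds here,
        // purely for illustration)
        LocalDateTime truncated = stored.truncatedTo(ChronoUnit.MILLIS);

        // The equality check behind "UPDATE ... WHERE ts = <truncated value>":
        // it fails, so the record is "not found" and the error is raised.
        System.out.println("values match: " + stored.equals(truncated)); // false
    }
}
```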

Talend - Stats and Logs - On database - error

I have a job that inserts data from SQL Server to MySQL. I have set the project settings as follows -
Checked the check boxes for Use statistics (tStatCatcher), Use logs (tLogCatcher), and Use volumetrics (tFlowMeterCatcher).
Selected 'On Databases' and put in the table names (stats_table, logs_table, flowmeter_table) as well. These tables were created beforehand; their schemas were determined using the tCreateTable component.
The problem is that when I run the job, data is inserted into stats_table but not into flowmeter_table.
My job is as follows
tMSSqlInput --> tMap --> tMysqlOutput.
I have not included tStatCatcher, tLogCatcher, or tFlowMeterCatcher; the stats and logs for this job are taken from the project settings.
My question - why is there no data entered in flowmeter_table? Should I include tStatCatcher, tLogCatcher, and tFlowMeterCatcher explicitly in the job for it to work?
I am using TOS
Thanks in advance
Rathi
Using the flow meter requires you to manually configure the flows you want to monitor.
On every flow you want to monitor, right-click on the row > Parameters > Advanced settings > Monitor this connection.
Then you should be able to see data in your flowmeter table.
If you are using the project settings, you don't need to add the *Catcher components to your job.
Alternatively, you can use the tStatCatcher, tLogCatcher, and tFlowMeterCatcher components in the job directly.
These components already have their schemas defined, so you just need a tMap to redirect the data into the table you want.
Moreover, in order to use tLogCatcher you need to put some tDie or tWarn components in your job.

Event-Based Drools Rule

I am new to Drools Fusion and I am unable to create a rule for the conditions below:
Read the server log file (with date, error message, etc.)
If an event with Event Type: ERROR and Event Message: "Memory Error" is found, trigger some event (as of now just a SOP)
Within 1 hour it should not trigger the event again for the same Event Message & Event Type (if it is found again in the log file)
After 1 hour, if the same is found, it should trigger the event again
Note: I have to use the same date & time specified in the log file
Any help would be appreciated.
I'm not sure exactly what you're looking for, so I'll respond conceptually. I'm going to assume you are trying to do everything within the Drools framework.
For Drools to be constantly aware of the server log, you will need to be running a stateful knowledge session and constantly inserting new facts into it. These facts would be derived from the server log.
It looks like you want to talk about Events in your model. Make an Event class. For this example, the class should probably have "type" and "message" fields. Presumably you would insert new event objects using code which is constantly getting information from the server log (either reading a file, through REST, or whatever).
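As a rough sketch of that model (not a definitive implementation): the fact class and the code feeding the session might look something like the following, assuming the Drools 6+ KIE API. The session name "logSession" is hypothetical and would have to be defined in your kmodule.xml, and the single hard-coded fact stands in for whatever log-reading code you already have.

```java
import java.util.Date;
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

// Hedged sketch: a minimal Event fact plus the code that feeds parsed log lines
// into a stateful session. "logSession" is a hypothetical name from kmodule.xml.
public class LogFeeder {

    public static class Event {
        private final String type;      // e.g. "ERROR"
        private final String message;   // e.g. "Memory Error"
        private final Date timestamp;   // date & time taken from the log file line

        public Event(String type, String message, Date timestamp) {
            this.type = type;
            this.message = message;
            this.timestamp = timestamp;
        }
        public String getType() { return type; }
        public String getMessage() { return message; }
        public Date getTimestamp() { return timestamp; }
    }

    public static void main(String[] args) {
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();
        KieSession session = container.newKieSession("logSession"); // hypothetical session name

        // In a real setup this would loop over the server log as it grows; a single
        // hard-coded fact just shows the shape of what gets inserted.
        session.insert(new Event("ERROR", "Memory Error", new Date()));
        session.fireAllRules();

        session.dispose();
    }
}
```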
In order to do time-based logic you can use cron expressions. You can also use Calendar in more recent versions of Drools. A brief sketch of doing it with cron follows.
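Everything in this sketch is illustrative (the rule name, the fact fields, the hypothetical package, and the hourly cron schedule): a cron timer only controls when the consequence runs while the rule's conditions hold, so by itself it is not a full implementation of the one-hour suppression, which Fusion temporal operators (e.g. a "not Event(...) over a 1h window" pattern) can express more directly.

```java
// Hedged sketch: a DRL rule held in a Java constant, firing on a cron schedule.
// Adapt the schedule and conditions, or replace the timer with Fusion temporal
// operators if event-time suppression is what you need.
public class MemoryErrorRule {
    public static final String DRL =
        "import your.pkg.Event\n" +                        // hypothetical package for the Event fact
        "rule \"memory error alert\"\n" +
        "    timer (cron: 0 0 * * * ?)\n" +                // at the top of every hour
        "when\n" +
        "    $e : Event( type == \"ERROR\", message == \"Memory Error\" )\n" +
        "then\n" +
        "    System.out.println(\"Memory Error seen at \" + $e.getTimestamp());\n" +
        "end\n";

    public static void main(String[] args) {
        // Print the rule text; in practice you would load it into a KieBase
        // (for example via KieBuilder) together with the session from the previous sketch.
        System.out.println(DRL);
    }
}
```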

SSRS: Dropdown is not populated in filter in Report Builder

Whenever I try to apply a filter to an attribute that has ValueSelection = Dropdown, the dropdown is not populated and the error message "The requested list could not be retrieved because the query is not valid or a connection could not be made to the data source" is shown instead.
If I set up ValueSelection=List I am getting a different error message:
An attempt has been made to use a semantic query extension associated with the data extension 'SQL' that is not registered for this report server.
(Microsoft.ReportingServices.SemanticQueryEngine)
This happens within the BIDS environment and was observed in both SQL 2005 and SQL 2008.
I've already studied articles discussing similar problems, but none of them applied to my case. The user account in the data source has all necessary rights, and data can be retrieved without any problem (for example, if I try "Explore data" in the data source view). SQL Profiler shows that no query is sent to SQL Server when there is an attempt to populate the dropdown. So nothing is wrong with the query; it is simply never executed.
Your connection is not working. Test your connection with a simple table and query output first.
This will let you verify the connection before trying anything advanced.
I got this problem, and in my case it was caused by a wrong connection string in the Data Source - instead of just having a SQL Server name like "SOMESQLSERVER_MACHINE", I had for some reason "SOMESQLSERVER_MACHINE.our.corp.domain". It should have been the same, but then I realized the domain was wrong; after removing it, everything worked like a charm again. That said, it's always a good idea to start with detailed checks of your basic settings.
Otherwise, this could be a problem with permissions on the folders in Report Manager.