Talend: configuration dimension time error in tOracleOutput

I still have this problem
Exception in component tOracleOutput_1
java.sql.SQLSyntaxErrorException: ORA-00904: : invalid identifier
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:396)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:951)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:513)

There may be some corrupt code in your job. What you can do first is check whether any code is generated for this job at all. If not, try disabling or removing components one at a time, run the job, and see whether the error persists.

I have had this as well. What usually helps is restarting Talend or restarting the computer.
If that doesn't help, there is something wrong with the job. Then I check every schema, every connection, every tMap and every other item in the job for an error which Talend doesn't show me.
To check if the code generation system works, you can always click on the Code tab and see if something comes up.
EDIT
The error ORA-00904 comes up. This suggests that a column is named incorrectly, as described here: https://dba.stackexchange.com/questions/129641/ora-00904-error-while-querying-the-oracle-database-table
To avoid ORA-00904, column names must
begin with a letter.
consist only of alphanumeric characters and the special characters $, _ and #; other characters need double quotation marks around them.
be less than or equal to thirty characters.
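As a quick hypothetical illustration (the table and column names below are made up): the first two statements use only valid unquoted names, while the last one references a column name that doesn't follow the rules and so raises ORA-00904:
CREATE TABLE demo_orders (order_id NUMBER, total$amount NUMBER);  -- valid unquoted names
SELECT order_id, total$amount FROM demo_orders;                   -- works
SELECT "order-id" FROM demo_orders;                               -- ORA-00904: "order-id": invalid identifier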

Related

Why is Access Dlookup Formula Giving Compile Error

so here's my problem. The below works:
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","WorkOrder=331091")
Unfortunately I need 331091 to be Combo4. Once I change the formula to:
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","WorkOrder"= [Combo4]") or
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","WorkOrder"= Combo4) or
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","[WorkOrder]"= [Combo4])
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","[WorkOrder]= [Combo4]")
I've been testing all the variations in the Immediate Window and all result in "Compile error: Expected: expression". I'm getting the same error in my other database, which is why I created this one: one table with one record, along with one unbound form. The table has WorkOrder and CylindersCompleted, which are both Number, and the form has one combo box and one text box, both Number. I'm putting the DLookup formula in the Control Source of the text box. I'm hoping someone can help me solve this issue so I can apply it to my other database, which is much more complicated. Thanks in advance.
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","WorkOrder"= [Combo4]")
Should be:
=DLookUp("[CylindersCompleted]","WO_User_Input_Save","WorkOrder=" & [Combo4])

Grafana (V7) adding variable in table name

I need to be able to use variables in table names - I basically have the same set of tables used for different types of data, so I would like to just have one dashboard and swapping between all types instead of always having to set up multiple identical dashboards.
My query is something like:
select * from table_$variable_name;
Where my list of possible variable values is something like cat, dog, bird.
I can't seem to make this work; if I only put the variable as shown above, I get the following error:
Error 1146: Table 'table_$variable_name' doesn't exist
If I enclose it in curly brackets, I get this error instead.
Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '{bird}' at line 1
(i.e. with the selected variable actually being visible this time)
I'm not sure if the issue is having underscores in the table names; I tried putting underscores around my variables too to check, but had no luck with that.
Another thing I tried was gradually adding on to the table name, so e.g.
select * from table_$variable;
Still returns an error, but I can see the table name starting to form correctly
Error 1146: Table 'table_bird_' doesn't exist
However, as soon as I add another underscore, the variable is not picked up anymore:
Error 1146: Table 'table_$variable_' doesn't exist
I'm sure it's something silly I am missing in the syntax of the query - does anyone have any suggestions?
Using this https://grafana.com/docs/grafana/latest/variables/templates-and-variables/ for reference
As @arturomp suggests, use
${var:raw}
At least in my case, this was the solution that worked.
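Applied to the query from the question (assuming the variable is actually named variable_name), that would look like:
select * from table_${variable_name:raw};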
I found that double square brackets work, e.g.
Rather than
select * from table_$variable_name;
use
select * from table_[[variable_name]];

Grok filter for a time counter HH:MM

I'm quite new to ELK and Grok-filtering, and I'm struggling with parsing this particular pattern in my grok filter.
I've used the grok debugger to try and solve this, but although I like the tool, I just get confused by the custom patterns.
Eventually, I hope to parse lots of log files sent by filebeat to logstash, then send the parsed logs to elasticsearch and display with kibana or some similar visualization tool.
The lines that I need to parse follow the following pattern:
1310 2017-01-01 16:48:54 [325:51] [326:49] [359:57] Some log info text
The first four digits is a log type identifier, and will be used for grouping. I've called the field "LogLineID".
The date is formatted YYYY-MM-DD HH:MM:SS, and is parsed ok. I called the field "LogDate".
But now the problem begins. Within the square brackets, I have counters, formatted as MM:SS if you like. I cannot for the life of me find a way to sort these out, but I need to compare these times, hence I want to store them as minutes and seconds, not just numbers.
The first is a counter "TimeSpent",
the second is a counter "TimeStarted" and
the third is a counter "TimeSinceDown".
Then, last, comes the info text, which I've managed to grok with simply applying %{GREEDYDATA:LogInfo}.
I notice that the amount of minutes could be far higher than the standard 60 minutes within an hour, so I may be barking up the wrong tree here trying to parse it with date patterns such as TIMESTAMP_ISO8601, but then, I don't really know how else to do this.
So, I came this far:
%{NUMBER:LogLineID} %{TIMESTAMP_ISO8601:LogDate}
and, as mentioned, was able (by cutting away the square bracket parts) to parse the log info text with
%{GREEDYDATA:LogInfo}
to create a field LogInfo.
But that's where I'm stuck. Could someone please help me figure out the rest?
Massive thanks in advance.
PS! I also found %{NUMBER:duration}, but as far as I could tell it only parses timestamps with a dot, not a colon.
A grok regex expression can help you solve the problem.
But first I want to make sure: do you mean that [325:51] [326:49] [359:57] are the three components you want to fetch, so that it would return a result like:
TimeSpent: 325:51
TimeStarted: 326:49
TimeSinceDown: 359:57
If I got that right, you can use one of the following suggestions (a rough sketch follows the list):
define your own custom pattern file and add the pattern there, or
just use the expression directly in the filter part of the logstash conf file.
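For the second suggestion, a rough sketch of the filter section might look like this (assuming a Logstash version whose grok filter supports inline pattern_definitions, and a made-up custom pattern name MINSEC for the MM:SS counters):
filter {
  grok {
    # MINSEC is a custom sub-pattern for the MM:SS counters (minutes may exceed 59)
    pattern_definitions => { "MINSEC" => "%{NUMBER}:%{NUMBER}" }
    match => {
      "message" => "%{NUMBER:LogLineID} %{TIMESTAMP_ISO8601:LogDate} \[%{MINSEC:TimeSpent}\] \[%{MINSEC:TimeStarted}\] \[%{MINSEC:TimeSinceDown}\] %{GREEDYDATA:LogInfo}"
    }
  }
}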
Hope this helps.
Ah, there was a space. Actually, I was misleading myself and everybody in my question, as it was not that log line that was causing problems. I had just taken the first one, not realizing where the problem really was; the line causing problems had a space within the brackets, like [ 42:31]. There are also some parts where there are two spaces, so the way I managed to solve this was to include a %{SPACE} between the \[ and the %{NUMBER}:
%{NUMBER:LogLineID} %{TIMESTAMP_ISO8601:LogDate} \[%{SPACE}%{NUMBER:TimeSpentMinutes}\:%{NUMBER:TimeSpentSeconds}\] \[%{SPACE}%{NUMBER:TimeStartedMinutes}\:%{NUMBER:TimeStartedSeconds}\] \[%{SPACE}%{NUMBER:TimeSinceDownMinutes}\:%{NUMBER:TimeSinceDownSeconds}\] %{GREEDYDATA:LogText}
I still haven't solved the merging of minutes and seconds, but this I can also handle in a later stage.
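If it helps anyone, one way to do that merge later (a sketch, assuming a Logstash version with the event.get/event.set Ruby filter API, and the field names from the pattern above) would be a ruby filter that turns each minutes/seconds pair into a single total-seconds number:
ruby {
  # combine the captured minutes and seconds into one comparable value
  code => "event.set('TimeSpentTotalSeconds', event.get('TimeSpentMinutes').to_i * 60 + event.get('TimeSpentSeconds').to_i)"
}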
Thanks to Lin Don for showing an interest in my problem, and sorry for not replying sooner.
Hope the solution will help others (or even myself) if they're stuck on the same kind of problem.
Note to myself: Read the logs more carefully before grok'ing.. :)

Zabbix Trigger Expression = NULL

I've a quick question regarding a Zabbix trigger expression. Our script finds out the certificate lifetime and writes the number into a file. I already created some triggers for specific time spans (like 10 days or 44 days).
For example:
{Certificate Lifetime:vfs.file.contents[C:\Zabbix_scripts\cert.txt].prev()}<=30 and {Certificate Lifetime:vfs.file.contents[C:\Zabbix_scripts\cert.txt].prev()}>10
I now want to have a trigger which causes a "Disaster" when the value in the .txt file is unavailable, for example when the file is empty.
I´ve tried
{Certificate Lifetime:vfs.file.contents[C:\Zabbix_scripts\cert.txt].prev()}=0
But this was not the right solution. Any ideas?
Moreover, we have some machines which are identical to others but don't show any value. Any ideas at this point?
Regards
You should be able to detect an empty item value with something like this:
strlen()=0
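Applied to the item key from the question, the full trigger expression would then look something like:
{Certificate Lifetime:vfs.file.contents[C:\Zabbix_scripts\cert.txt].strlen()}=0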

Workflow - Can not select right rows

I have created this workflow by copying a workflow (macro) I recorded from our company system, in the hope of relieving myself from heavy data entry work. However, when I play the workflow, it fails to select the right row in a list (grid). Instead, it always selects the first row, no matter what row I designate. I don't know which programming language the system uses; it was saved with a .P extension. Here is the part that causes the problem. The third-to-last statement (SELECT_ROW("2")) was not properly executed.
SELECT_TAB("po-proc-tabpage-1").
SET_FIELD("scr-po-num","02.6430").
SELECT_TAB("NONE").
CHOOSE_BUTTON("button-edit").
CHOOSE_BUTTON("button-items").
GET_DIALOG("poitment.p","poitment-004").
SELECT_GRID("br-item").
RESET_GRID_SORT().
SELECT_ROW("2").
CHOOSE_BUTTON("badd-charges").
GET_DIALOG("poitmchg.p","poitmchg-004").
Thanks in advance for any comments.