Functions are not appearing while using tFilterRow in Talend

I am using a tFilterRow to filter out empty rows. While trying to use it, I am only getting one function value, 'absolute value'.
I want to filter values with a length greater than 0.
Why am I not getting any other functions?

As mentioned in the comments, the length function is only available to schema columns that have the String data type.
To filter out any rows that have a null value in a column, you can use a tFilterRow configured so that the column being checked is not equal to null.
If you are dealing with the primitive int (rather than the Integer class), the primitive can never be null and instead defaults to 0, so you'll want to set it as not equal to 0 instead.
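If the built-in functions don't offer the check you need, tFilterRow's advanced mode accepts a plain Java boolean expression instead. A minimal sketch, assuming the incoming String column is named myColumn (the column name is just an illustration):

// Advanced-mode condition: keep only rows whose value is non-null and non-empty
input_row.myColumn != null && input_row.myColumn.trim().length() > 0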

Related

Laravel 8 migration: change ENUM to BOOLEAN

In an old project I have a gender column using an ENUM type ('0' and '1' values)... yes, I know, not my smartest idea.
I have to make a change to this table, so I want to simply use a boolean type instead of an enum (since I only need 0 and 1 values). But when I try to change my column this way:
Schema::table('users', function (Blueprint $table) {
    $table->boolean('gender')->nullable()->change();
});
The migration works, but it changes the values (0s are converted to 1 and 1s are converted to 2)...
I suppose there is some forced casting going on, but I don't understand why the values are converted this way.
Any idea how I can convert my enum type to a tinyint(1) without changing my values? I guess I could add a second column with the right type, copy my values into it, drop my first column and rename my newly created column, but that seems complex for something I thought would be simple.
Thank you for your help.
Edit: After reading some more posts about this, I guess it's using the enum value's index instead of casting the value itself, which is why my 0 is converted to 1, etc.
I guess I'll have to use a temp column to do it.
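For reference, a rough sketch of that temp-column approach, assuming MySQL, the users table and the gender enum column from above (names are illustrative); comparing against the enum label '1' avoids the index-based conversion entirely:

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;

// 1. Add a temporary boolean (tinyint(1)) column.
Schema::table('users', function (Blueprint $table) {
    $table->boolean('gender_tmp')->nullable();
});

// 2. Copy the values across, comparing against the enum label rather than
//    letting MySQL fall back to the enum index.
DB::statement("UPDATE users SET gender_tmp = (gender = '1')");

// 3. Drop the old column and rename the temporary one.
Schema::table('users', function (Blueprint $table) {
    $table->dropColumn('gender');
});
Schema::table('users', function (Blueprint $table) {
    $table->renameColumn('gender_tmp', 'gender');
});

Note that in Laravel 8, renameColumn (like change) generally requires the doctrine/dbal package.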

Azure Data Flow: convert null to whitespace/blank

I am using the expression builder in a Derived Column action of Azure Data Factory. I have an iif statement that adds objects to a single array of objects based on whether 5 columns are null. Within the iif statement, if a column is not null its object is added to the array, and I did not specify an action for when the column is null. So if 3 of the columns have a value there should be 3 objects in total in the array, but the issue is that the 2 empty columns show up as 2 "null" values within the array. I don't want that; I just want to cleanly have only the 3 objects in the array. How can I convert the null values to whitespace, or is there a better way to get this done?
I've made a test that converts the Null value to whitespace successfully.
My source data is a CSV file with 6 columns, and some columns may contain Null values.
In the data flow, I'm using a Derived Column to convert the Null value.
In the data preview, we can see the Null value was replaced with whitespace/blank.
Summary:
So we can use the expression iif(isNull(<Column_Name>), '\n', <Column_Name>) to replace the NULL value with whitespace.
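As a concrete (hypothetical) instance, if one of the nullable columns were named Address, the derived column expression for it would be:

iif(isNull(Address), '\n', Address)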

PostgreSQL query treats Int as string datatype

I store the following rows in my table ('DataScreen') under a JSONB column ('Results')
{"Id":11,"Product":"Google Chrome","Handle":3091,"Description":"Google Chrome"}
{"Id":111,"Product":"Microsoft Sql","Handle":3092,"Description":"Microsoft Sql"}
{"Id":22,"Product":"Microsoft OneNote","Handle":3093,"Description":"Microsoft OneNote"}
{"Id":222,"Product":"Microsoft OneDrive","Handle":3094,"Description":"Microsoft OneDrive"}
Here, in these JSON objects, "Id" and "Handle" are integer properties and the others are string properties.
When I query my table like below
Select Results->>'Id' From DataScreen
order by Results->>'Id' ASC
I get improper results because PostgreSQL treats everything as text and hence orders by the text value, not the integer value.
Hence it gives the result as
11,111,22,222
instead of
11,22,111,222.
I don't want to use explicit casting to retrieve like below
Select Results->>'Id' From DataScreen order by CAST(Results->>'Id' AS INT) ASC
because I cannot be sure of the datatype of the column: the JSON structure is dynamic, and the keys and values may change next time, so the same could happen with another JSON that has integer and string values.
I want integers in the JSON structure of the JSONB column to be treated as integers only and not as text (strings).
How do I write my query so that Id and Handle are retrieved as integer values and not as strings, without explicit casting?
I think your assumptions about the Id field don't make sense. You said:
(a) Either id contains integers only or
(b) it contains strings and integers.
I'd say,
If (a) then numerical ordering is correct.
If (b) then lexical ordering is correct.
But if (a) holds for some time and then (b), the correct order changes, too. And that doesn't make sense. Imagine:
For the current database you expect the order 11,22,111,222. Then you add a row
{"Id":"aa","Product":"Microsoft OneDrive","Handle":3095,"Description":"Microsoft OneDrive"}
and suddenly the correct order of the other rows changes to 11,111,22,222,aa. That sudden change is what bothers me.
So I would either expect a lexical ordering ab initio, or restrict my Id field to integers and use explicit casting.
Every other option I can think of is just not practical. You could, for example, create a custom < and > implementation for your Id field which results in 11,22,111,222,aa ("order all integers by numerical value and all strings lexically, and put all integers before the strings").
But that is a lot of work (it involves a custom data type, a custom cast function and a custom operator function) and yields some counterintuitive results, e.g. 11,22,111,222,0a,1a,2a,aa (note the position of 0a and so on: they come after 222).
Hope that helps ;)
If Id is always an integer, you can cast it in the select part and just use ORDER BY 1:
select (Results->>'Id')::int From DataScreen order by 1 ASC
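If the Id values are stored as JSON numbers (as in the sample rows above), one alternative worth sketching: the -> operator returns jsonb rather than text, and PostgreSQL compares jsonb numbers numerically (strings compare lexically and sort separately from numbers), so ordering on the jsonb value avoids the explicit ::int cast:

-- order by the jsonb value itself; numeric Ids sort numerically
Select Results->>'Id' From DataScreen
order by Results->'Id' ASC

With mixed types in Id this still will not interleave numbers and strings, which is consistent with the ordering concerns described above.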

Convert varchar parameter with CSV into column values in Postgres

I have a Postgres query with one input parameter of type varchar.
The value of that parameter is used in the WHERE clause.
Until now only a single value was sent to the query, but now we need to send multiple values so that they can be used with an IN clause.
Earlier
value='abc'.
where data=value.//current usage
now
value='abc,def,ghk'.
where data in (value)//intended usage
I tried many ways, i.e. providing the value as
value='abc','def','ghk'
Or
value="abc","def","ghk" etc.
But none of them work, and the query does not return any results even though there is matching data available. If I provide the values directly in the IN clause, I do see the data.
I think I should somehow split the parameter, which is a comma-separated string, into multiple values, but I am not sure how to do that.
Please note it's a Postgres DB.
You can try to split the input string into an array. Something like this:
where data = ANY(string_to_array('abc,def,ghk',','))
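A minimal usage sketch, assuming the comma-separated string is bound as a query parameter (here $1) and that the table and column names below are placeholders:

-- matches any row whose data column equals one of the comma-separated values
select * from my_table
where data = ANY(string_to_array($1, ','));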

xml2struct with missing/optional values

I am using xml2struct to transform an XML file into a struct. It can be the case that certain optional nodes do not exist in the XML file. I would like to detect this and insert 'NULL' fields into the struct. I understand that there is no NULL in MATLAB, so I wonder what experienced users tend to use to represent missing/optional values.
I am using these data types:
char
logical
double
I tend to have a preference for NaN or the empty string ''. Granted, using NaN for a logical will change the datatype to double and create a scenario in which you need to test for 1 or 0 instead of true or false.
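A minimal sketch of how that could look after xml2struct, assuming the parsed result is a struct s and the optional node is called OptionalNode (both names are illustrative):

% fill in a placeholder when the optional node is missing from the XML
if ~isfield(s, 'OptionalNode')
    s.OptionalNode = NaN;    % numeric placeholder for a missing double
    % or: s.OptionalNode = '';  % empty char for a missing string value
end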