How to set a string array parameter in a Vert.x query and execute it - vert.x

I have input that I receive from the client side (e.g. via Postman) as a string array, users[], and I need to bind it to the following query:
Query("select * from USER where users in ($1);")
How should I pass the users[] value in the tuple?
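A minimal sketch of one common approach, assuming the reactive PostgreSQL client (io.vertx:vertx-pg-client) and an already-created client/pool named client: PostgreSQL cannot expand an array bound to IN ($1), so the usual workaround is = ANY($1) with the whole array passed as a single tuple element.

import io.vertx.sqlclient.Tuple;

// Hypothetical values parsed from the incoming users[] request parameter.
String[] users = {"alice", "bob", "carol"};

// client is an assumed, pre-configured PgPool/SqlClient.
// The cast binds the whole array as one parameter rather than spreading it.
client.preparedQuery("SELECT * FROM USER WHERE users = ANY($1)")
    .execute(Tuple.of((Object) users))
    .onSuccess(rows -> rows.forEach(row -> System.out.println(row.toJson())))
    .onFailure(err -> err.printStackTrace());

The client maps a Java String[] to a PostgreSQL text[], so ANY($1) matches each element of users[].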

Related

Pass Column to Function where the column can be empty

I am trying to pass a column to a function where sometimes the column passed can be empty or blank.
For example
from pyspark.sql.functions import when, lit, col

def test(df, segment):
    score_df = df \
        .withColumn('model_segment', when(lit(segment) == '', lit('')).otherwise(col(segment)))
    return score_df
This works
test(df,'my_existing_column').show()
However this errors:
test(df,'').show()
failing with the message: cannot resolve '``' given input columns
I get why it is doing that, but how would I go about handling this kind of scenario?
You can get the list of dataframe fields with df.columns, then check whether the incoming parameter exists in that list. If it exists, execute the show action; otherwise raise a custom exception saying the field does not exist.
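A minimal sketch of that check (the wrapper name safe_test and the choice of ValueError are mine, not part of the original answer):

def safe_test(df, segment):
    # Validate the column name before col(segment) tries to resolve it.
    if segment == '' or segment not in df.columns:
        raise ValueError(f"Column '{segment}' does not exist in the dataframe")
    return test(df, segment)

safe_test(df, 'my_existing_column').show() still works as before, while safe_test(df, '').show() now fails fast with a clear message instead of the unresolvable '``' error.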

Is it possible to Group By a field extracted from JSON input in a Siddhi Query?

I currently have a stream with one string attribute that contains a JSON event.
This stream receives different events, to which I want to apply JSON path expressions so I can use those attributes in filters and functions.
JsonPath extractors work like a charm in filters and selectors; unfortunately, I am not able to use them in the 'Group By' part.
I am actually doing this in an embedded Siddhi app with the siddhi-execution-json extension added manually, but for the discussion, so everybody can
easily check and test it, I will paste an example app that works on WSO2 Stream Processor.
The objective looks like the following App:
@App:name("Group_by_json_attribute")
define stream JsonStream(json string);
@sink(type='log')
define stream LogStream(myField string, count long);
@info(name='query1')
from JsonStream#window.time(10 sec)
select json:getString(json, '$.myField') as myField, count() as count
group by myField having count > 1
insert into LogStream;
and it can accept the following events:
{"myField": "my_value"}
However, this query will raise the error:
Cannot find attribute type as 'myField' does not exist in 'JsonStream'; define stream JsonStream(json string)
I have also tried using the JSON extractor directly in the 'Group by':
group by json:getString(json, '$.myField') as myField having count > 1
However the error now is:
mismatched input ':' expecting {',', ORDER, LIMIT, OFFSET, HAVING, INSERT, DELETE, UPDATE, RETURN, OUTPUT}
which suggests that an extension call is not expected at this point.
I am just wondering if it is possible to group by attributes not directly defined in the input stream. In this case it is a field extracted from a JSON object, but it could be any other function that generates another attribute.
I am using the following versions from the Maven Central repository:
Siddhi: io.siddhi:siddhi-core:5.0.1
siddhi-execution-json: io.siddhi.extension.execution.json:siddhi-execution-json:2.0.1
(Edit) Clarification
The objective is to use attributes not directly defined in the stream in the Group By.
The reason is that I currently have an embedded app which defines the whole set of input streams coming from external sources formatted as JSON, along with a set of output streams to inform external components when a query matches.
This app allows users to create custom queries on this set of predefined streams, but they are not able to create streams of their own.
Many thanks!
It seems the group-by fields are expected to come from the query's input stream, in this case JsonStream. Use another query before this one for the extraction, and do the aggregation and filtering in the following query:
@App:name("Group_by_json_attribute")
define stream JsonStream(json string);
@sink(type='log')
define stream LogStream(myField string, count long);
@info(name='extract_stream')
from JsonStream
select json:getString(json, '$.myField') as myField
insert into ExtractedStream;
@info(name='query1')
from ExtractedStream#window.time(10 sec)
select myField, count() as count
group by myField
having count > 1
insert into LogStream;

Combine two fields of SOAP response to One in PARASOFT SOATest

I would like to combine two field values of the same SOAP response into one, so that I can assert that one SOAP field against a REST response field.
SOAP Response:
Field 1 = date
Field 2 = time
Combine SOAP fields 1 and 2 into Field 3.
Assert SOAP Field 3 against the REST response field.
How should I do this?
I think you found the answer on Parasoft's forum, but to answer it here, let me explain how you can achieve it:
When you extract values from a message using the XML DataBank, you store them in a writable column or variable (or an external data source).
Assuming Field 1 = date and you assigned it to a column or variable called date, you can then reference it by its variable/column name using the special syntax ${date}.
Likewise, Field 2 = time assigned to a column/variable called time can be used as ${time}.
You can then connect/join the two values like this: ${date}${time}

Binding parameters with sequelize fails if multiple

The database is Postgres (and Sequelize has support for bind parameters for Postgres).
Strange thing.
When running a raw query and binding parameters this way:
return models.sequelize.query(q, { bind: ['33', 'test'] }).then(function (data) {
    // ...
});
Sequelize seems to fail to bind the parameters.
The query itself is something like
select * from A where id = $1
As soon as I remove the second element in the array passed to bind, the binding works.
But when there is more than one element, the $1 is not replaced with its value. This is what I can see in the log.
When only one bind parameter is present, the logged query is
select * from A where id = 33
while when more than one bind parameter is added, it prints
select * from A where id = $1
My bad. I didn't look at the real error message sent to the client.
It had nothing to do with the number of parameters.
The problem was that using LIKE I had something like:
and name LIKE '%$2%'
This gives the error:
bind message supplies 2 parameters, but prepared statement requires 1
Which was the real issue.
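A minimal sketch of the fix (the column names are the ones from the question; the values are made up): a $2 inside a quoted string literal is not parsed as a placeholder, so Postgres sees one parameter in the SQL but receives two in the bind array. Pass the wildcards as part of the bound value instead:

const q = "select * from A where id = $1 and name LIKE $2";
// The % wildcards travel inside the bound value, not inside the SQL text.
models.sequelize.query(q, { bind: ['33', '%test%'] }).then(function (data) {
    console.log(data);
});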
I realize this is an old thread; however, this may be a workaround for those facing this issue with PostgreSQL.
let array = ['33', 'test'];
let query = `SELECT * FROM A WHERE id::text IN (SELECT UNNEST(STRING_TO_ARRAY($1, ','))::text)`;
// Join the array into one comma-separated string, then split it back apart in SQL.
models.sequelize.query(query, { bind: [array.toString()] });
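Note that the comma-joined string breaks if a value itself contains a comma. As an alternative sketch (an assumption on my part, relying on the underlying pg driver mapping a JavaScript array to a PostgreSQL array), you can bind the array directly and compare with ANY:

let array = ['33', 'test'];
let query = 'SELECT * FROM A WHERE id::text = ANY($1)';
// The whole JavaScript array is bound as a single text[] parameter.
models.sequelize.query(query, { bind: [array] });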

Map tRestClient input with its output in Talend

Is it possible to map the input of a tRestClient to its output? I would like to reuse the row fields of the request together with the row fields of the result and mix them.
You can use tReplicate to get a copy of the input flow, and tHashOutput and tHashInput to store and retrieve the response of the tRestClient.