Combine two fields of a SOAP response into one in Parasoft SOAtest / REST

I would like to combine the values of two fields of the same SOAP response into one, so that I can assert that single SOAP field against a REST response field.
SOAP Response:
Field 1 = date
Field 2 = time
Combine SOAP fields 1 and 2 into Field 3.
Assert SOAP Field 3 against the REST response field.
How should I do this?

I think you already found the answer on Parasoft's forum, but to answer it here, let me explain how you can achieve it:
When you extract values from a message using an XML DataBank, you store them in a writable column or variable (or an external data source).
Assuming Field 1 = date and you assigned it to a column or variable called date, you can use it later by referring to the variable/column name with the special syntax ${date}.
Field 2 = time, assigned to a column/variable called time, can likewise be used with ${time}.
Later you can connect/join the two in the following way: ${date}${time}
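For example, if the extracted values happened to be date = 2021-06-01 and time = 10:15:30 (made-up values, just to illustrate the substitution), ${date}${time} would resolve to 2021-06-0110:15:30. If the REST response field expects a separator, you can type it literally between the two references, e.g. ${date}T${time}.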

Related

Mapping Data Flows Dynamic Column Updates

I have a text input source. This has over 100 columns so I won't show all of them here - a cut-down view of the data would be:
CustomerNo   DOB          DOD    Status
01418495     01/02/1940   NULL   1
01418496     01/01/1930   NULL   1
The users want to be able to update/override any of these columns during processing by providing another input text file containing the PK (CustomerNo) and the key/value pairs of the columns to be updated, e.g.:
CustomerNo   Variable   New Value
01418495     DOB        01/12/1941
01418496     DOD        01/01/2021
01418496     Status     0
Can this data somehow be used to create dynamic columns that update the customer records regardless of which columns they want to update? In the example above this would result in:
CustomerNo   DOB          DOD          Status
01418495     01/02/1941   NULL         1
01418496     01/01/1930   01/01/2021   0
I have looked at the documentation but don't see any examples of how something like this could be achieved. Thanks in advance for any advice.
You would use a technique similar to what I describe in this video: https://www.youtube.com/watch?v=q7W6J-DUuJY. What I've done is create a file with rules that contain expressions, and then apply those rules dynamically inside of my data flow.
The key to making this work is using the expr() function to dynamically evaluate the expression from the external file.
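As a minimal sketch of that idea applied to the question's columns (the Lookup step and the exact expressions are my assumptions, and I've assumed the override file's third column is loaded as NewValue, without the space): join the override file to the customer stream with a Lookup on CustomerNo, then add a Derived Column per overridable column, e.g.
DOB    : iif(Variable == 'DOB', NewValue, DOB)
DOD    : iif(Variable == 'DOD', NewValue, DOD)
Status : iif(Variable == 'Status', NewValue, Status)
If the override file carries data flow expressions rather than literal values, wrap the value with expr(), e.g. iif(Variable == 'DOB', toString(expr(NewValue)), DOB). With several overrides for the same customer (as for 01418496 above), you would pivot the override file or repeat the Lookup/Derived Column pattern so every rule gets applied.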

How to validate data for a fixed-length file in Azure Data Factory

I am reading a fixed-width file in a mapping data flow and loading it into a table. I want to validate the fields, data types and lengths of the fields that I am extracting in the Derived Column using substring.
How can I achieve this in ADF?
Use a Conditional Split and add a condition for each property of the field that you wish to test for. For data type checking, we literally just landed new isInteger(), isString() ... functions today. The docs are still in the printing press, but you'll find them in the expression builder. For length, use length().
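For example (the field names here are hypothetical), a Conditional Split condition for the stream of valid rows could look like:
isInteger(customer_no) && length(customer_no) == 8
with further && clauses per field using the other type-check functions mentioned above. Rows that fail the condition fall through to the split's default output, where they can be routed to an error sink for inspection.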

Is it possible to Group By a field extracted from JSON input in a Siddhi Query?

I currently have a stream with one string attribute that contains a JSON event.
This stream receives different events, to which I want to apply JSON path expressions so I can use the extracted attributes in filters and functions.
JSON path extractors work like a charm in filters and selectors; unfortunately, I am not able to use them for the 'group by' part.
I am actually doing this in an embedded Siddhi app with the siddhi-execution-json extension added manually, but for the discussion, so everybody can
easily check and test it, I will paste an example app that works on WSO2 Stream Processor.
The objective looks like the following app:
#App:name("Group_by_json_attribute")
define stream JsonStream(json string);
#sink(type='log')
define stream LogStream(myField string, count long);
#info(name='query1')
from JsonStream#window.time(10 sec)
select json:getString(json, '$.myField') as myField, count() as count
group by myField having count > 1
insert into LogStream;
and it can accept the following events:
{"myField": "my_value"}
However, this query will raise the error:
Cannot find attribute type as 'myField' does not exist in 'JsonStream'; define stream JsonStream(json string)
I have also tried to use the JSON extractor directly in the 'group by':
group by json:getString(json, '$.myField') as myField having count > 1
However, the error now is:
mismatched input ':' expecting {',', ORDER, LIMIT, OFFSET, HAVING, INSERT, DELETE, UPDATE, RETURN, OUTPUT}
which suggests that an extension call is not expected at this point.
I am just wondering if it is possible to group by attributes that are not directly defined in the input stream. In this case it is a field extracted from a JSON object, but it could be any other function that generates another attribute.
I am also using the following versions from the Maven Central repository:
Siddhi: io.siddhi:siddhi-core:5.0.1
siddhi-execution-json: io.siddhi.extension.execution.json:siddhi-execution-json:2.0.1
(Edit) Clarification
The objective is to use attributes that are not directly defined in the stream in the group by.
The reason is that I currently have an embedded app which defines the whole set of input streams coming from external sources formatted as JSON, and there is also a set of output streams to inform external components when a query matches.
This app allows users to create custom queries on this set of predefined streams, but they are not able to create streams of their own.
Many thanks!
It seems we expect the group by fields to come from the query's input stream, in this case JsonStream. Use another query before this one for the extraction, and do the aggregation and filtering in the following query:
#App:name("Group_by_json_attribute")
define stream JsonStream(json string);
#sink(type='log')
define stream LogStream(myField string, count long);
#info(name='extract_stream')
from JsonStream
select json:getString(json, '$.myField') as myField
insert into ExtractedStream;
#info(name='query1')
from ExtractedStream#window.time(10 sec)
select myField, count() as count
group by myField
having count > 1
insert into LogStream;
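Since the question mentions running this as an embedded Siddhi app, here is a minimal sketch of feeding events into the two-query app above (assumptions: Siddhi 5.x as in the listed Maven coordinates, the siddhi-execution-json jar on the classpath so the json: namespace resolves, and a made-up class name):
import io.siddhi.core.SiddhiAppRuntime;
import io.siddhi.core.SiddhiManager;
import io.siddhi.core.stream.input.InputHandler;

public class GroupByJsonExample {
    public static void main(String[] args) throws InterruptedException {
        // the same two-query app as above, condensed into one string
        String app =
            "@App:name('Group_by_json_attribute') " +
            "define stream JsonStream(json string); " +
            "@sink(type='log') define stream LogStream(myField string, count long); " +
            "from JsonStream select json:getString(json, '$.myField') as myField insert into ExtractedStream; " +
            "from ExtractedStream#window.time(10 sec) select myField, count() as count " +
            "group by myField having count > 1 insert into LogStream;";

        SiddhiManager siddhiManager = new SiddhiManager();
        SiddhiAppRuntime runtime = siddhiManager.createSiddhiAppRuntime(app);
        InputHandler input = runtime.getInputHandler("JsonStream");
        runtime.start();

        // send the same JSON value twice so count > 1 inside the 10 second window
        input.send(new Object[]{"{\"myField\": \"my_value\"}"});
        input.send(new Object[]{"{\"myField\": \"my_value\"}"});

        Thread.sleep(500); // give the log sink a moment to print the LogStream event
        runtime.shutdown();
        siddhiManager.shutdown();
    }
}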

Creating a JSON structure in PDI without blocks

I'm trying to get a simple JSON output value in PDI from a field that was defined in an earlier step.
The field is id_trans, and I want the result to look like {"id_trans":"1A"} when id_trans value is 1A.
However, when using the JSON Output step and setting the JSON block name to empty, I get this: {"":[{"id_trans":"1A"}]}, which is normal given that the JSON Output step generates JSON blocks, as specified in the doc.
How can I get rid of the block (i.e. [...]) structure in a simple manner? I thought of using an external Python script, but I would rather use steps in PDI.
You can easily do that with another JSON Input step. Just specify the output value from the JSON Output step as the 'Select field', and under the Fields tab specify a field name and data[0] as Path.
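To illustrate with the value from the question (and assuming the JSON block in the JSON Output step is named data instead of being left empty): the intermediate value is {"data":[{"id_trans":"1A"}]}, and a JSON Input field with the path $.data[0] should then return the inner object {"id_trans":"1A"}, i.e. the desired result without the surrounding array.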

How do I prevent SQL injection if I want to build a query in parts within the Fat-Free Framework?

I am using the Fat-Free Framework, and on the front end I am using the jQuery DataTables plugin with server-side processing. My server-side controller may therefore receive a variable amount of information, for example a variable number of columns to sort on, a variable number of filtering options, and so forth. If I don't receive any request for sorting, I don't need an ORDER BY portion in my query. So I want to generate the query string in parts according to certain conditions and join them at the end to get the final query for execution. But if I do it this way, I won't have any data sanitization, which is really bad.
Is there a way I can use the framework's internal sanitization methods to build the query string in parts? Also, is there a better/safer way to do this than how I am approaching it?
Just use parameterized queries. They are here to prevent SQL injection.
Two possible syntaxes are allowed:
with question mark placeholders:
$db->exec('SELECT * FROM mytable WHERE username=? AND category=?',
array(1=>'John',2=>34));
with named placeholders:
$db->exec('SELECT * FROM mytable WHERE username=:name AND category=:cat',
array(':name'=>'John',':cat'=>34));
EDIT:
The parameters are there to filter the field values, not the column names, so to answer your question more specifically:
you must pass filtering values through parameters to avoid SQL injection
you can check if column names are valid by testing them against an array
Here's a quick example:
$columns=array('category','age','weight'); // columns available for filtering/sorting
$sql='SELECT * FROM mytable';
$params=array();
// filtering
$ctr=0;
if (isset($_GET['filter']) && is_array($_GET['filter']))
    foreach($_GET['filter'] as $col=>$val)
        if (in_array($col,$columns,TRUE)) { // test for column name validity
            $sql.=($ctr?' AND ':' WHERE ')."$col=?";
            $params[$ctr+1]=$val; // positional parameters are 1-indexed
            $ctr++;
        }
// sorting
$ctr=0;
if (isset($_GET['sort']) && is_array($_GET['sort']))
    foreach($_GET['sort'] as $col=>$asc)
        if (in_array($col,$columns,TRUE)) { // test for column name validity
            $sql.=($ctr?',':' ORDER BY ')."$col ".($asc?'ASC':'DESC');
            $ctr++;
        }
// execution
$db->exec($sql,$params);
NB: if column names contain weird characters or spaces, they must be quoted: $db->quote($col)
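For example, with a request such as ?filter[category]=books&sort[age]=1 (made-up input), the snippet above builds SELECT * FROM mytable WHERE category=? ORDER BY age ASC and executes it with array(1=>'books'), so the user-supplied value is only ever passed as a bound parameter and never concatenated into the SQL string.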