Gatling Time Extraction From Response Body - scala

I am completely new to Gatling/Scala.
I have a scenario to execute. Here it goes:
--> Change the shift timings of the employees.
I am able to script/code the flow for the above. However, I have a challenge:
--> I need to extract the "new" time values from the response and check whether they match the "new" time values passed in through the parameter (csv) file.
Approach/logic: extract the date values from the response body and compare them with the date values provided in the csv file.
Sample Response:
{
  "employeeId" : "xxxxxx",
  "schedules" :
  [
    {
      "date" : "2019-11-25",
      "schedules" :
      [
        {
          "employeeId" : "xxxxxx",
          "laborWeekStartDate" : "2019-11-25", // New edited time
          "laborWeekEndDate" : "2019-12-01",   // New edited time
          "schedules" :
          {
            "startTime" : "2019-11-25T18:15:00.000Z",
            "endTime" : "2019-11-25T23:45:00.000Z",
            "departmentId" : xxxxx,
            "departmentName" : "abc",
            "lastModifiedTimestamp" : "2019-12-11T09:22:44.000Z",
            "breakDetails" :
            [
              {
                "startTime" : "2019-11-25T21:00:00.000Z",
                "endTime" : "2019-11-25T21:15:00.000Z",
                "type" : "break"
              }
            ]
          }
        }
      ]
    }
  ]
}
Here, the right-hand side values below need to be extracted and compared with the values provided in the csv file.
"startTime":"2019-11-25T18:15:00.000Z",
"endTime":"2019-11-25T23:45:00.000Z",
Please help me in performing the above. A detailed step-wise explanation would be much appreciated, considering I am totally new to this.
Thanks!

Disclaimer: I will provide some useful links that should help you achieve the task. If you encounter any problems along the way, just post a new question.
In order to get a value from a JSON response, you can use a jsonPath check on the HTTP response body. There is an example here of how a value can be extracted and saved using this method: JSON Path Usage for Gatling Tests
Reading values from a CSV file is possible using the built-in feeder functionality: CSV feeders. Once you have the feeder added, you can reference a value using ${columnName}. There is an example here: Step 03: Use dynamic data with Feeders and Checks.
After this step you have both values in the session. Then, using Scala, you should be able to compare those values. Getting a value out of the session is done with session("variableName").as[String]
For example, you could do a String comparison if you first take a substring of the csv value: Scala String comparison. Another option is described here, and it is really close to your requirement: How to compare responses from http calls in gatling?
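Putting the three pieces together, here is a minimal sketch of what the simulation could look like. It assumes Gatling 3, a hypothetical GET /schedules/{employeeId} endpoint, and a shifts.csv whose header row is employeeId,newStartTime,newEndTime; adjust the jsonPath expressions to the real response structure.
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ShiftTimingSimulation extends Simulation {

  // Hypothetical base URL; replace with the real application under test.
  val httpProtocol = http.baseUrl("https://example.com")

  // Assumed csv header row: employeeId,newStartTime,newEndTime
  val shiftFeeder = csv("shifts.csv").circular

  val scn = scenario("Verify edited shift timings")
    .feed(shiftFeeder)
    .exec(
      http("Get schedule")
        .get("/schedules/${employeeId}") // hypothetical endpoint
        .check(
          // extract the first startTime/endTime and keep them in the session
          jsonPath("$..schedules.startTime").saveAs("actualStartTime"),
          jsonPath("$..schedules.endTime").saveAs("actualEndTime")
        )
    )
    .exec { session =>
      // compare the value fed from the csv with the one extracted above
      val expected = session("newStartTime").as[String]
      val actual   = session("actualStartTime").as[String]
      if (expected != actual)
        println(s"startTime mismatch: expected $expected, got $actual")
      session
    }

  setUp(scn.inject(atOnceUsers(1)).protocols(httpProtocol))
}
If you don't need a custom message on mismatch, you can skip the session step and let the check fail the request directly, e.g. jsonPath("$..schedules.startTime").is("${newStartTime}").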
Good luck! :)

Related

how to calculate time difference based on 2 fields in mongodb

I am familiar with simple MongoDB queries, but this one is a bit complex for me. What I am trying to achieve: on the basis of the jsonObject.callID and jsonObject.mobile fields, I have to calculate the time difference between the jsonObject.timestamp values. In the sample documents below, jsonObject.callID and jsonObject.mobile stay the same for the jsonObject.action values "start" and "end", so based on jsonObject.callID and jsonObject.mobile I have to subtract the jsonObject.timestamp values. jsonObject.callID is the same for the two actions of an interval, i.e. "start" and "end" with the same jsonObject.mobile number.
{
    "_id" : ObjectId("5df9bc5ee5e7251030535df5"),
    "_class" : "com.abc.mongo.docs.IvrMongoLog",
    "jsonObject" : {
        "mode" : "ivr",
        "callID" : "33333",
        "callee" : "128",
        "action" : "end",
        "mobile" : "218924535466",
        "timestamp" : "2019-12-18 16:18:12"
    }
}
{
    "_id" : ObjectId("5df9bc3de5e7251030535df4"),
    "_class" : "com.abc.mongo.docs.IvrMongoLog",
    "jsonObject" : {
        "mode" : "ivr",
        "callID" : "33333",
        "callee" : "128",
        "action" : "start",
        "mobile" : "218924535466",
        "timestamp" : "2019-12-18 16:12:11"
    }
}
So I am trying to achieve an output like below:
{
    "callee" : "128",
    "mobile" : "218924535466",
    "callID" : "33333",
    "minutes_of_call" : "6" // difference of "2019-12-18 16:18:12" - "2019-12-18 16:12:11"
}
Subsequently, I need such results for the next documents...
Kindly assist.
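For reference, one way this can be expressed as a single aggregation pipeline. This is a sketch rather than a tested answer: it assumes MongoDB 4.0+ (for the format option of $dateFromString), a hypothetical collection name ivrLogs, and exactly one "start" and one "end" document per callID/mobile pair.
db.ivrLogs.aggregate([
    { $addFields: {
        // parse the string timestamp into a real date
        ts: { $dateFromString: {
            dateString: "$jsonObject.timestamp",
            format: "%Y-%m-%d %H:%M:%S"
        } }
    } },
    { $group: {
        _id:    { callID: "$jsonObject.callID", mobile: "$jsonObject.mobile" },
        callee: { $first: "$jsonObject.callee" },
        start:  { $min: "$ts" },   // the "start" action has the earlier timestamp
        end:    { $max: "$ts" }    // the "end" action has the later timestamp
    } },
    { $project: {
        _id: 0,
        callID: "$_id.callID",
        mobile: "$_id.mobile",
        callee: 1,
        // date subtraction yields milliseconds; convert to whole minutes
        minutes_of_call: {
            $floor: { $divide: [ { $subtract: ["$end", "$start"] }, 1000 * 60 ] }
        }
    } }
])
The $group keyed on callID and mobile pairs the two documents, $min/$max pick the start and end timestamps, and the $project turns the millisecond difference into whole minutes (6 for the sample above).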

And Operator in Criteria not working as expected for nested documents inside aggregation Spring Data Mongo

I am trying to fetch the total number of replies where the read value of a reply is true, using the Aggregation support in Spring Data Mongo. I am getting a count of 3, but the expected value is 2 (since only two replies have read set to true). Below is the code I wrote:
Aggregation sumOfRepliesAgg = newAggregation(
        match(new Criteria().andOperator(
                Criteria.where("replies.repliedUserId").is(userProfileId),
                Criteria.where("replies.read").is(true))),
        unwind("replies"),
        group("replies").count().as("repliesCount"),
        project("repliesCount"));
AggregationResults<Comments> totalRepliesCount = mongoOps.aggregate(sumOfRepliesAgg, "COMMENTS", Comments.class);
return totalRepliesCount.getMappedResults().size();
I am using the AND operator inside the Criteria query and passing two criteria conditions, but it is not working as expected. Below is the sample data set:
{
    "_id" : ObjectId("5c4ca7c94807e220ac5f7ec2"),
    "_class" : "com.forum.api.domain.Comments",
    "comment_data" : "logged by karthe99",
    "totalReplies" : 2,
    "replies" : [
        {
            "_id" : "b33a429f-b201-449b-962b-d589b7979cf0",
            "content" : "dasdsa",
            "createdDate" : ISODate("2019-01-26T18:33:10.674Z"),
            "repliedToUser" : "#karthe99",
            "repliedUserId" : "5bbc305950a1051dac1b1c96",
            "read" : false
        },
        {
            "_id" : "b886f8da-2643-4eca-9d8a-53f90777f492",
            "content" : "dasda",
            "createdDate" : ISODate("2019-01-26T18:33:15.461Z"),
            "repliedToUser" : "#karthe50",
            "repliedUserId" : "5c4bd8914807e208b8a4212b",
            "read" : true
        },
        {
            "_id" : "b56hy4rt-2343-8tgr-988a-c4f90598h492",
            "content" : "dasda",
            "createdDate" : ISODate("2019-01-26T18:33:15.461Z"),
            "repliedToUser" : "#karthe50",
            "repliedUserId" : "5c4bd8914807e208b8a4212b",
            "read" : true
        }
    ],
    "last_modified_by" : "karthe99",
    "last_modified_date" : ISODate("2019-01-26T18:32:41.394Z")
}
What is the mistake in the query that I wrote?
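For what it's worth, the likely culprit: a match stage that runs before unwind selects whole documents in which any array element satisfies the criteria, so the subsequent group-by-reply produces one bucket per unwound reply, i.e. 3. Unwinding first and then matching filters the individual replies. A sketch of that reordering, with names taken from the question (count() assumes a Spring Data MongoDB version that supports the $count stage):
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

// Unwind first so each reply becomes its own document, then match on the
// reply fields, then count the surviving documents.
Aggregation sumOfRepliesAgg = newAggregation(
        unwind("replies"),
        match(new Criteria().andOperator(
                Criteria.where("replies.repliedUserId").is(userProfileId),
                Criteria.where("replies.read").is(true))),
        count().as("repliesCount"));

AggregationResults<org.bson.Document> result =
        mongoOps.aggregate(sumOfRepliesAgg, "COMMENTS", org.bson.Document.class);
// getUniqueMappedResult() is null when nothing matched; 2 for the sample above
Integer repliesCount = result.getUniqueMappedResult().getInteger("repliesCount");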

Matlab - Convert char to cell

I have this output from the urlread function:
// [
{
"id": "22144"
,"t" : "AAPL"
,"e" : "NASDAQ"
,"l" : "148.59"
,"l_fix" : "148.59"
,"l_cur" : "148.59"
,"s": "0"
,"ltt":"1:13PM EDT"
,"lt" : "May 5, 1:13PM EDT"
,"lt_dts" : "2017-05-05T13:13:23Z"
,"c" : "+2.06"
,"c_fix" : "2.06"
,"cp" : "1.41"
,"cp_fix" : "1.41"
,"ccol" : "chg"
,"pcls_fix" : "146.53"
,"eo" : ""
,"delay": ""
,"op" : "146.76"
,"hi" : "148.91"
,"lo" : "146.76"
,"vo" : "-"
,"avvo" : "-"
,"hi52" : "148.91"
,"lo52" : "89.47"
,"mc" : "771.93B"
,"pe" : "17.38"
,"fwpe" : ""
,"beta" : "1.21"
,"eps" : "8.55"
,"shares" : "5.21B"
,"inst_own" : "63%"
,"name" : "Apple Inc."
,"type" : "Company"
}
]
My question is: how can I convert this to a two-column cell? Or, even better, create a structure called AAPL so that, for example, AAPL.l gives me the price?
Use the jsondecode function to convert JSON-formatted text into a MATLAB struct. Typically the text starts with a '[' or '{'. You can try the code on a simpler subset, as below.
jsondecode('{"id": "22144","t" : "AAPL","e" : "NASDAQ","l" : "148.59"}')
This produces a struct with the following fields.
id: '22144'
t: 'AAPL'
e: 'NASDAQ'
l: '148.59'
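As a follow-up, a sketch of the end-to-end flow for the full urlread output, assuming the variable url holds the quote URL from the question (jsondecode requires R2016b or newer):
txt  = urlread(url);                  % raw text, starts with "// ["
txt  = regexprep(txt, '^\s*//', ''); % strip the leading comment marker
data = jsondecode(txt);               % struct array, one element per ticker
AAPL = data(1);                       % fields id, t, e, l, ...
price = str2double(AAPL.l)            % 148.59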

How to loop and insert data into different tables from JSON

I have a JSON document and I need to write its values into different tables. I can get the data out of the JSON, but I need to insert the data accordingly. It's like a form: the form has n sections, each section has n steps, and each step can have n questions. How can I loop over this and write into the different tables? Basically I need to know how to find how many sections, steps and questions there are in the JSON. I tried array_length, but it did not work.
Here is a small sample of my JSON.
{ "functionId" : "2","subFunctionId" : "6","groupId" : "11","formId" : "","formName":"BladeInseption","submittedBy" : "200021669","createdDate" : "2015-08-06",
"updatedBy" : "","updatedDate" : "","comments" : "","formStatusId" :"11","formStatus" :"Draft","formLanguage" : "English","isFormConfigured" : "N","formChange":"Yes",
"sectionLevelChange":"Yes","isActive" : "Y","formVersionNo" : "1.0","formFooterDetails" : "","formHeaderDetails" : "","images" : [
{"imageId" : "","imageTempId" : "","imageTempUrl" : "","imageName" : "","imageUrl" : "","isDeleted" : "","imagesDesc" : ""} ],
"imagesDescLevel" : "","sectionElements" : [{"sectionElement":[{"sectionId" : "","sectionTempId":"sectionId+DDMMHHSSSS","sectionName":"section1",
"sectionChange":"Yes","stepLevelChange":"Yes","sectionLabel" : "","sectionOrder" : "1","outOfScopeSection" : "false",
"punchListSection" : "false","images" : [{"imageId" : "","imageTempId" : "","imageTempUrl" : "","imageName" : "","imageUrl" : "","isDeleted" : "",
"imagesDesc" : ""}],"imagesDescLevel" : "","isDeleted" : "","stepElements" : [{"stepElement":[{"stepId" : "","stepTempId":"stepId+DDMMHHSSSS",
"stepName":"section1step1","stepLabel" : "","stepOrder" : "1","stepChange":"Yes","questionLevelChange":"Yes","images" : [{"imageId" : "",
"imageTempId" : "","imageTempUrl" : "","imageName" : "","imageUrl" : "","isDeleted" : "","imagesDesc" : ""}],"imagesDescLevel" : "","isDeleted" : "",
"questionAnswerElements" : [{"questionAnswerElement":[{"questionId" : "","questionClientUid" : "","questionDescription" : "step1question1",
"questionAccessibility" : "","isPunchListQuestion" : "","questionChange":"Yes","questionOrder" : "1","isDeleted" : "","images" : [{
"imageId" : "","imageTempId" : "","imageTempUrl" : "","imageName" : "","imageUrl" : "","isDeleted" : "","imagesDesc" : ""}],"imagesDescLevel" : "",
"answerId" : "","answerClientUid" : "","elements" :[{"element" :[{"elementId": "2","elementMapId" : "12","clientUid" : "","clientClass" : "","imageTempId" : "",
"imageTempUrl" : "","elementType":"Question","elementOrder" : "1","elementArributuesProp": [{"attributeId" : "1","attributeName" : "","defaultValue" : ""}],
"elementArributuesVal":[{"value1" : "item1"}],"rule" : [{"ruleId" : "1","ruleName" : "Mandatory","formula" : "i>a","formulaData" : "i>50","isDeleted" : "",
...
}
If you know all the paths to the JSON arrays in your code, you can use some special functions that appeared in 9.4, such as
SELECT json_array_length('{"array":[{"a":1},{"b":2},{"c":3}]}'::json->'array')
If you need to iterate through a JSON array, there is another useful function:
SELECT json_array_elements('{"array":[{"a":1},{"b":2},{"c":3}]}'::json->'array')
SELECT json_array_elements('[{"a":1},{"b":2},{"c":3}]'::json)
or, if the JSON is stored in a table:
SELECT json_array_elements(tbl.json_value->'array') FROM jsontable AS tbl
It returns a set of JSON values unwrapped from the array, ready for processing.
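Applied to a structure like yours, a sketch (the table and column names jsontable / json_value are placeholders): the nested sectionElements/stepElements/questionAnswerElements arrays can be unwrapped with LATERAL calls, and counting the resulting rows tells you how many sections, steps and questions there are.
-- one row per question, carrying its section and step names
SELECT sec.elem ->> 'sectionName'         AS section_name,
       stp.elem ->> 'stepName'            AS step_name,
       qst.elem ->> 'questionDescription' AS question
FROM jsontable t,
     LATERAL json_array_elements(t.json_value -> 'sectionElements')    AS se(obj),
     LATERAL json_array_elements(se.obj -> 'sectionElement')           AS sec(elem),
     LATERAL json_array_elements(sec.elem -> 'stepElements')           AS ste(obj),
     LATERAL json_array_elements(ste.obj -> 'stepElement')             AS stp(elem),
     LATERAL json_array_elements(stp.elem -> 'questionAnswerElements') AS qe(obj),
     LATERAL json_array_elements(qe.obj -> 'questionAnswerElement')    AS qst(elem);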
http://www.postgresql.org/docs/9.4/static/functions-json.html
More information about JSON parsing can be found here
How do I query using fields inside the new PostgreSQL JSON datatype?

How to run multi condition check in mongodb update query?

I have the following scenario.
I need to check multiple conditions in a single update query.
Sample Data :
"host" : "zigwheels.com",
"lastAccessDate" : "20140819",
"sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"
Here is my requirement :
obj = db.userFrequency.find({"host" : "xyz.com", "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"});
if (obj != null) {
    if (obj.get("lastAccessDate") != todayDate) {
        // If the page has not been visited today, increment count by 1 and set lastAccessDate = todayDate
        db.userFrequency.update(
            {"host" : "xyz.com", "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"},
            {$set : {"lastAccessDate" : todayDate}, $inc : {"count" : 1}});
    } else {
        // If the page was visited today (lastAccessDate == todayDate), no need to update count
    }
} else {
    // Insert the new entry
    db.userFrequency.update(
        {"host" : "xyz.com", "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"},
        {$set : {"lastAccessDate" : todayDate}, $inc : {"count" : 1}},
        {upsert : true});
}
I need to do the above operation in a single query, but I have GBs of data.
I tried the above like below:
db.userFrequency.update({"host" : "xyz.com", "lastAccessDate" != todayDate, "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"}, {$set : {"lastAccessDate" : todayDate}, $inc : {"count" : 1}});
but it inserts a new entry, because the above condition requires three checks.
Please suggest a solution for the same.
It's awkward if you store your date as a string. Store dates as date type fields and this is easy:
> db.userFrequency.insert({
"host" : "zigwheels.com",
"lastAccessDate" : ISODate("2014-08-19"),
"sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"
})
> var today = ISODate("2014-08-25")
> db.userFrequency.update({
"host" : "zigwheels.com",
"lastAccessDate" : { "$lt" : today },
"sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39"
},
{
$set : { "lastAccessDate" : today},
$inc : { "count" : 1 }
})
> db.userFrequency.findOne()
{
"_id" : ObjectId("53fbd08...398"),
"host" : "zigwheels.com",
"lastAccessDate" : ISODate("2014-08-25T00:00:00Z"),
"sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39",
"count" : 1
}
I used the $lt operator for the date test because users won't access the website from the future... hopefully. Using proper dates also surfaces another important issue: what timezone defines "today"? Right now, for me, it's tomorrow in Australia...
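One caveat, assuming you still need the upsert behaviour from the original post: adding {upsert: true} to the update above would insert a duplicate document whenever the session exists but was already counted today (the $lt filter would not match). A sketch of one way around it, assuming a shell version that returns a WriteResult:
> var res = db.userFrequency.update(
    { "host" : "zigwheels.com",
      "lastAccessDate" : { "$lt" : today },
      "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39" },
    { $set : { "lastAccessDate" : today }, $inc : { "count" : 1 } })
> if (res.nMatched === 0 &&
      db.userFrequency.count({ "host" : "zigwheels.com",
                               "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39" }) === 0) {
      // first visit ever: create the entry
      db.userFrequency.insert({ "host" : "zigwheels.com",
                                "lastAccessDate" : today,
                                "sessionId" : "ff8378ed-ccda-4a75-b24b-4a4bb1153e39",
                                "count" : 1 })
  }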