Gatling: JSON response map with JSON file - Scala

Hi, I want to validate my JSON file saved in resources against the JSON response from a GET request.
My saved JSON file looks like this (note that it is an array):
[{
  "id": "123"
}]
I am unable to match the file against the response.
object Products {
  val jsonFetchProductIDs = ElFileBody("abc.json")

  val fetchProductIds: HttpRequestBuilder = http("fetch")
    .get("endpoint")
    .body(jsonFetchProductIDs)

  val products = http("Products")
    .get("endpoint")
}
class ProductsTest extends Simulation {
  val productIdInfo = exec(
    Products.products
      .check(
        status.is(200),
        jsonPath("$.id").ofType[Map[String, Any]].findAll.saveAs("productsID")))

  val productIdResult = exec(session => {
    val id = session("id").asOption[String]
    foreach("${productsID}", "id") {
      exec(session => {
        val idMap = session("id").as[Map[String, Any]]
        val allId = idMap("allId")
        session.set("allId", allId)
      })
    }
    session
  })

  val getproductidscenario1 = scenario("Products ID")
    .exec(Login.login)
    .exec(EventBus.openSSE)
    .exec(Accounts.fetchInitialAccounts)
    .pause(10)
    .exec(productIdInfo)
    .exec(productIdResult)

  setUp(
    getproductidscenario1.inject(atOnceUsers(1)).protocols(HttpConf.httpConf)
  )
}

You can use the check below to compare the response body against the file content:
bodyString.is(ElFileBody("expected_response.json"))
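In context, this is a sketch against the question's request (endpoint and file names taken from the question); note that ElFileBody also resolves any Gatling EL placeholders in the file before comparing:

```scala
// Sketch: GET the endpoint and assert the whole response body
// equals the (placeholder-resolved) content of the file.
val fetchProductIds: HttpRequestBuilder = http("fetch")
  .get("endpoint")
  .check(
    status.is(200),
    bodyString.is(ElFileBody("expected_response.json")))
```

If the response is large or the field order may differ, a jsonPath check on individual fields is usually more robust than comparing the whole body string.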

Related

Gatling -- create and add key,value pairs in map dynamically in one scenario and pass that map to next scenario

I have a requirement where I need to create a map and add key-value pairs to it dynamically in one scenario, then pass that map to another scenario where I can retrieve the values by key. Can someone please let me know how to implement this?
def createData() = {
  feed(customFeeder).exec(http("create dataset").post("/datasets")
    .header("content-type", "application/json")
    .body(StringBody("""{ "name": "${name}","description": "create dataset"}"""))
    .asJson.check(jsonPath("$.id").saveAs("userId")))
    .exec(session => {
      val name = session("name").asOption[String]
      println("Dataset name ::: " + name.getOrElse("COULD NOT FIND NAME"))
      val datasetId = session("userId").as[String].trim
      println("Dataset ID retrieved from createDataSet Response ::: " + datasetId)
      val datasetIdList = session("datasetIdList").asOption[List[String]].getOrElse(Nil)
      println("Upload Start Time :::" + Calendar.getInstance().getTime)
      // add the above datasetId and upload start time to the map here
      session.set("datasetIdList", datasetId :: datasetIdList)
    })
}
// File upload for the datasets in the datasetIdList
def fileUpload() = foreach("${datasetIdList}", "datasetId") {
  // println("File Upload Start Time::::" + Calendar.getInstance().getTime + " for datasetId ::: ${datasetId}")
  exec(http("file upload").post("/compute-metaservice/datasets/${datasetId}/uploadFile")
    .formUpload("File", "./src/test/resources/data/Scan_good.csv")
    .header("content-type", "multipart/form-data")
    .check(status is 200).check(status.saveAs("uploadStatus")))
    .exec(session => {
      // retrieve the upload time by datasetId from the map here
      session
    })
}
val scn1 = scenario("create multiple datasets and upload").exec(createData()).exec(fileUpload())
setUp(scn1.inject(atOnceUsers(3))).protocols(httpConf)
First step: define a variable to store the session attributes:
import scala.collection.mutable.{Map => MutableMap}
val sessionVariable: MutableMap[String, Any] = MutableMap()
Then create two methods which save and restore the session attributes across scenarios:
def passAnotherSession(currentSession: Session, transferVariable: MutableMap[String, Any]) = {
  currentSession.setAll(transferVariable)
}

def saveSessionToVariable(currentSession: Session, transferVariable: MutableMap[String, Any]) = {
  val deleteSessionKeys = List("gatling.http.ssl.sslContexts", "gatling.http.cache.contentCache")
  transferVariable ++= currentSession.removeAll(deleteSessionKeys: _*).attributes
  currentSession
}
And use it:
val setUpScenario = scenario("Set-up scenario")
  .exec(...)
  .exec(saveSessionToVariable(_, sessionVariable))

val mainScenario = scenario("Main scenario")
  .exec(...)
  .exec(passAnotherSession(_, sessionVariable))

setUp(
  setUpScenario.inject(atOnceUsers(1))
    .andThen(
      mainScenario.inject(atOnceUsers(1))
    ).protocols(protocol)
)
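The hand-off pattern above can be sketched in plain Scala, with an immutable `Map[String, Any]` standing in for Gatling's `Session` (the function and key names here are illustrative, not Gatling API):

```scala
import scala.collection.mutable.{Map => MutableMap}

// Illustrative stand-in for Gatling's Session attributes.
type Attributes = Map[String, Any]

// Shared storage that survives across "scenarios".
val transfer: MutableMap[String, Any] = MutableMap()

// "Scenario 1": copy the attributes into the shared map, dropping internal keys.
def saveToVariable(session: Attributes, skip: Set[String]): Attributes = {
  transfer ++= session.filterNot { case (k, _) => skip(k) }
  session
}

// "Scenario 2": merge the shared map into the fresh session of the next scenario.
def passToSession(session: Attributes): Attributes = session ++ transfer
```

Note the shared mutable map works here only because the set-up scenario finishes before the main one starts (`andThen` with a single user); with concurrent users you would need a thread-safe structure.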

Function without var in scala

I have a function that takes a string and a case class as input and returns a string as output.
Different case classes get appended to a list, and the final case class, which holds the list, is returned.
I want to do this without using var, but a val list would be immutable and no data could be added to it. Is there another way of doing it, the Scala way?
def getResult(eventName: Option[String], content: Content): String = {
  var list = List.empty[Json]
  val device = Device(
    DEVICE_SCHEMA,
    data = content.data.device
  )
  list = list :+ device.asJson
  val parser = Parser(
    PARSER_SCHEMA,
    data = content.data.parser
  )
  list = list :+ parser.asJson
  val res = Result(
    RESULT_SCHMEA,
    data = list
  )
  res.asJson.noSpaces
}
Try inlining the list creation, like so:
def getResult(eventName: Option[String], content: Content): String = {
  val device = Device(
    DEVICE_SCHEMA,
    data = content.data.device
  )
  val parser = Parser(
    PARSER_SCHEMA,
    data = content.data.parser
  )
  Result(
    RESULT_SCHMEA,
    data = List(device.asJson, parser.asJson) // <== inline list creation
  ).asJson.noSpaces
}
Just a small change from the previous answer: you don't need val res, and it's preferable to create the list outside Result for easier reading and later debugging:
def getResult(eventName: Option[String], content: Content): String = {
  val device = Device(
    DEVICE_SCHEMA,
    data = content.data.device
  )
  val parser = Parser(
    PARSER_SCHEMA,
    data = content.data.parser
  )
  val jsons = List(device.asJson, parser.asJson)
  Result(
    RESULT_SCHMEA,
    data = jsons
  ).asJson.noSpaces
}
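The difference can be sketched without the JSON machinery, with plain strings standing in for the Device/Parser values (purely illustrative names):

```scala
// var-based accumulation, as in the question
def buildWithVar(a: String, b: String): List[String] = {
  var list = List.empty[String]
  list = list :+ a
  list = list :+ b
  list
}

// the same result built as a single immutable expression
def buildImmutable(a: String, b: String): List[String] = List(a, b)
```

Both produce the same list; the immutable version simply constructs it in one step instead of mutating a reference.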

How can I write test cases for file uploading through extractRequestContext in Akka HTTP services?

Please suggest: I have an upload service in an Akka HTTP microservice and it's working fine. Now I need to write test cases for the code below.
path("file-upload") {
  extractClientIP { ip =>
    optionalHeaderValueByName(Constants.AUTH) { auth =>
      (post & extractRequestContext) { request =>
        extractRequestContext { ctx =>
          implicit val materializer = ctx.materializer
          implicit val ec = ctx.executionContext
          val currentTime = TimeObject.getCurrentTime()
          fileUpload("fileUpload") {
            case (fileInfo, fileStream) =>
              val localPath = Configuration.excelFilePath
              val uniqueidString = "12345"
              val filename = uniqueidString + fileInfo.fileName
              val sink = FileIO.toPath(Paths.get(localPath) resolve filename)
              val writeResult = fileStream.runWith(sink)
              onSuccess(writeResult) { result =>
                result.status match {
                  case Success(_) =>
                    val excelPath = localPath + File.separator + uniqueidString + fileInfo.fileName
                    val doc_count = itemsExcelParse(excelPath, companyCode, subCompanyId, userId)
                    val currentTime2 = TimeObject.getCurrentTime()
                    val upload_time = currentTime2 - currentTime
                    val resp: JsValue = Json.toJson(doc_count)
                    complete {
                      val json: JsValue = Json.obj(
                        "status" -> Constants.SUCCESS,
                        "status_details" -> "null",
                        "upload_details" -> resp)
                      HttpResponse(status = StatusCodes.OK, entity = HttpEntity(ContentType(MediaTypes.`application/json`), json.toString))
                    }
                  case Failure(e) =>
                    complete {
                      val json: JsValue = Json.obj("status" -> Constants.ERROR, "status_details" -> Constants.ERROR_445)
                      HttpResponse(status = StatusCodes.BandwidthLimitExceeded, entity = HttpEntity(ContentType(MediaTypes.`application/json`), json.toString))
                    }
                }
              }
          }
        }
      }
    }
  }
}
I have tried the test case below, but it's not working:
" File upload " should "be able to upload file" in {
  val p: Path = Paths.get("E:\\Excel\\Tables.xlsx")
  val formData = Multipart.FormData.fromPath("fileUpload", ContentTypes.NoContentType, p, 1000)
  Post(s"/file-upload", formData) ~> route ~> check {
    println(" File upload - file uploaded successfully")
    status shouldBe StatusCodes.OK
    responseAs[String] contains "File successfully uploaded"
  }
}
I have also changed the content type to application/octet-stream, but the file is not uploaded to the server. Please suggest how I can write the test case for file uploading.
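I can't verify this against your full route, but one common cause is the multipart part carrying no usable content type. A sketch (the CSV content here is illustrative) that builds the part in memory with an explicit content type and a field name matching fileUpload("fileUpload"):

```scala
// Build the multipart body in memory: the part name must match
// fileUpload("fileUpload") and the entity carries a concrete content type.
val formData = Multipart.FormData(
  Multipart.FormData.BodyPart.Strict(
    "fileUpload",
    HttpEntity(ContentTypes.`text/plain(UTF-8)`, "id,name\n1,test"),
    Map("filename" -> "Scan_good.csv")))

Post("/file-upload", formData) ~> route ~> check {
  status shouldBe StatusCodes.OK
}
```

Also note that the route writes to Configuration.excelFilePath and calls itemsExcelParse, so the test environment needs that path to exist and the parse step to succeed before the OK branch is reached.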

Looping through Map Spark Scala

In this code we have two files: athletes.csv, which contains names, and twitter.test, which contains the tweet messages. We want to find, for every single line in twitter.test, a name that matches a name in athletes.csv. We applied a map function to store the names from athletes.csv and want to check every name against every line in the test file.
object twitterAthlete {

  def loadAthleteNames(): Map[String, String] = {
    // Handle character encoding issues:
    implicit val codec = Codec("UTF-8")
    codec.onMalformedInput(CodingErrorAction.REPLACE)
    codec.onUnmappableCharacter(CodingErrorAction.REPLACE)

    // Create a Map of names to nationalities and populate it from athletes.csv.
    var athleteInfo: Map[String, String] = Map()
    val lines = Source.fromFile("../athletes.csv").getLines()
    for (line <- lines) {
      val fields = line.split(',')
      if (fields.length > 1) {
        athleteInfo += (fields(1) -> fields(7))
      }
    }
    athleteInfo
  }

  def parseLine(line: String): String = {
    val athleteInfo = loadAthleteNames()
    var hello = new String
    for ((k, v) <- athleteInfo) {
      if (line.contains(k)) {
        hello = k
      }
    }
    hello
  }

  def main(args: Array[String]) {
    Logger.getLogger("org").setLevel(Level.ERROR)
    val sc = new SparkContext("local[*]", "twitterAthlete")
    val lines = sc.textFile("../twitter.test")
    val athleteInfo = loadAthleteNames()
    val splitting = lines.map(x => x.split(";")).map(x => if (x.length == 4 && x(2).length <= 140) x(2))
    val container = splitting.map(x => for ((key, value) <- athleteInfo) if (x.toString().contains(key)) { key }).cache
    container.collect().foreach(println)
    // val mapping = container.map(x => (x, 1)).reduceByKey(_ + _)
    // mapping.collect().foreach(println)
  }
}
The first file looks like:
id,name,nationality,sex,height........
001,Michael,USA,male,1.96 ...
002,Json,GBR,male,1.76 ....
003,Martin,female,1.73 . ...
The second file looks like:
time, id , tweet .....
12:00, 03043, some message that contain some athletes names , .....
02:00, 03023, some message that contain some athletes names , .....
something like this ...
But I got an empty result after running this code; any suggestions are much appreciated.
The result I got is empty:
()....
()...
()...
But the result I expected is something like:
(name,1)
(other name,1)
You need to use yield to return a value from the for comprehension inside your map:
val container = splitting.map(x => for ((key, value) <- athleteInfo if x.toString().contains(key)) yield (key, 1)).cache
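A plain-Scala sketch (toy data, no Spark) of why the original printed "()": a for comprehension without yield is a foreach and evaluates to Unit, so the mapped RDD contained only Unit values.

```scala
val athleteInfo = Map("Michael" -> "USA", "Json" -> "GBR")
val tweet = "12:00, 03043, some message that contains Michael"

// Without yield the body runs for its side effects and the whole
// expression is Unit, which prints as ().
val withoutYield = for ((key, _) <- athleteInfo if tweet.contains(key)) { (key, 1) }

// With yield each matching entry produces a value.
val withYield = for ((key, _) <- athleteInfo if tweet.contains(key)) yield (key, 1)
```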
I think you should just start with the simplest option first.
I would use DataFrames so you can use the built-in CSV parsing and leverage Catalyst, Tungsten, etc.
Then you can use the built-in Tokenizer to split the tweets into words, explode, and do a simple join. Depending on how big or small the athlete-name data is, you'll end up with a more optimized broadcast join and avoid a shuffle.
import org.apache.spark.sql.functions._
import org.apache.spark.ml.feature.Tokenizer
val tweets = spark.read.format("csv").load(...)
val athletes = spark.read.format("csv").load(...)
val tokenizer = new Tokenizer()
tokenizer.setInputCol("tweet")
tokenizer.setOutputCol("words")
val tokenized = tokenizer.transform(tweets)
val exploded = tokenized.withColumn("word", explode('words))
val withAthlete = exploded.join(athletes, 'word === 'name)
withAthlete.select(exploded("id"), 'name).show()

Parse the response string of a request into another method in gatling

I'm trying to pass a response header value (which is a string) from one request into another method or function in Gatling. Here is what I tried:
val scn = scenario("DeviceAuth")
  .feed(csvFeeeder)
  .exec(http("Request1")
    .post("endpoint")
    .headers(headers_0)
    .formParam("key", "value")
    .check(headerRegex("header", "pattern.*)").saveAs("value"))
    .check(status.is(401)))

object getHeader {
  def authenticationHeader: String = {
    val header: String = "${value}"
    val s = header.split("")
    // --so on and so forth--
  }
}
So when I tried to print the header value, it just printed "${value}".
How can I pass that value into my function?
The "${value}" syntax is Gatling Expression Language and is only resolved inside the Gatling DSL; in plain Scala code it is just a literal string, which is why that is what got printed. Retrieve the saved attribute through the Session API inside an exec block instead:
val scn = scenario("DeviceAuth")
  .feed(csvFeeeder)
  .exec(http("Request1")
    .post("endpoint")
    .headers(headers_0)
    .formParam("key", "value")
    .check(headerRegex("header", "pattern.*").saveAs("value"))
    .check(status.is(401)))
  .exec { session =>
    // The saved check result is only accessible through the Session
    val header = session("value").as[String]
    val s = header.split("")
    // --so on and so forth--
    session
  }