Cannot implement .extraInfoExtractor in Gatling Script - scala

I am trying to log extra response information in the Gatling results via http.extraInfoExtractor. Below is my code; it fails to compile, and the error is shown below it. Please help.
Code:
package cloudnative

import scala.concurrent.duration._

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.jdbc.Predef._

class cloudnativems extends Simulation {

  val nbUsers = Integer.getInteger("users", 1)
  val myRamp = java.lang.Long.getLong("ramp", 0L)
  val varPipelineId = sys.env.get("CI_PIPELINE_ID")
  println(varPipelineId)

  val httpProtocol = http
    .baseUrl("https://lXXXXXXXXXX")
    .inferHtmlResources()
    .contentTypeHeader("application/json")

  val scn = scenario("cloudnativems")
    .exec(http("account_movement_post")
      .post("/XXXXXXX/gatling-poc/demo/movement/account")
      .extraInfoExtractor(extraInfo => List(extraInfo.response.statusCode.get))
      .body(StringBody("""{
        "movementId": "m0001",
        "accountId": "a0001",
        "amount": 2000,
        "movementDate": "2019-02-26T09:34:50.301Z",
        "counterparty": "c0001"
      }""")))
    .exec(http("account_movement_get")
      .get("/XXXXXXX/gatling-poc/demo/movement/account/m0001")
    )

  setUp(scn.inject(atOnceUsers(1)).protocols(httpProtocol))
}
Error:
C:\Sopra Project\Tools\gatling-charts-highcharts-bundle-3.0.3\bin>gatling.bat -s cloudnative.cloudnativems
GATLING_HOME is set to "C:\Sopra Project\Tools\gatling-charts-highcharts-bundle-3.0.3"
JAVA = ""C:\Program Files\Java\jdk1.8.0_201\bin\java.exe""
11:56:14.849 [ERROR] i.g.c.ZincCompiler$ - C:\Sopra Project\Tools\gatling-charts-highcharts-bundle-3.0.3\user-files\simulations\cloudnative\cloudnativems.scala:25:8: value extraInfoExtractor is not a member of io.gatling.http.request.builder.HttpRequestBuilder
possible cause: maybe a semicolon is missing before `value extraInfoExtractor'?
.extraInfoExtractor(extraInfo => List(extraInfo.response.statusCode.get))
^
11:56:15.190 [ERROR] i.g.c.ZincCompiler$ - one error found
11:56:15.193 [ERROR] i.g.c.ZincCompiler$ - Compilation crashed
sbt.internal.inc.CompileFailed: null

extraInfoExtractor was dropped in version 3.0, according to https://gatling.io/docs/current/migration_guides/2.3-to-3.0/:
"extraInfoExtractor was dropped as it wasn't used in any Gatling component"
I am not aware of a replacement construct in Gatling 3.0.
Greetings,
Matthias
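One possible workaround in Gatling 3 (not a drop-in replacement, just a sketch using the standard check and session APIs; the "httpStatus" key is an arbitrary name) is to save the value you want to log with a check and print it from the session:

val scn = scenario("cloudnativems")
  .exec(http("account_movement_post")
    .post("/XXXXXXX/gatling-poc/demo/movement/account")
    .body(StringBody("""{ "movementId": "m0001", "accountId": "a0001" }"""))
    .check(status.saveAs("httpStatus")))  // save the response status into the session
  .exec { session =>
    // log the saved status; any other checked value can be logged the same way
    println(s"account_movement_post returned status ${session("httpStatus").as[Int]}")
    session
  }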

Related

found java.util.Date but required java.sql.Date?

I'm trying to create a function to check if a string is a date. However, the following function gives the error shown below.
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.sql._
import scala.util.{Success, Try}
def validateDate(date: String): Boolean = {
  val df = new java.text.SimpleDateFormat("yyyyMMdd")
  val test = Try[Date](df.parse(date))
  test match {
    case Success(_) => true
    case _ => false
  }
}
Error:
[error] C:\Users\user1\IdeaProjects\sqlServer\src\main\scala\main.scala:14: type mismatch;
[error] found : java.util.Date
[error] required: java.sql.Date
[error] val test = Try[Date](df.parse(date))
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 17, 2017 1:19:33 PM
Is there a simpler way to validate if a string is a date without creating a function?
The function is used to validate the command line argument.
if (args.length != 2 || validateDate(args(0))) { .... }
Try[Date](df.parse(date)): you are not interested in the type here because you ignore it, so simply omit the type parameter: Try(df.parse(date)).
Your function could also be shorter: use Try(df.parse(date)).isSuccess instead of pattern matching.
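A sketch of the shortened function with both suggestions applied (same format string as in the question):

import scala.util.Try

def validateDate(date: String): Boolean = {
  val df = new java.text.SimpleDateFormat("yyyyMMdd")
  Try(df.parse(date)).isSuccess  // no type parameter and no pattern match needed
}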
If your environment has Java 8, prefer the java.time package:
import scala.util.Try
import java.time.LocalDate
import java.time.format.DateTimeFormatter
// Move the creation of the formatter out of the function to reduce allocation of short-lived objects.
val df = DateTimeFormatter.ofPattern("yyyy MM dd")
def datebleStr(s: String): Boolean = Try(LocalDate.parse(s,df)).isSuccess
use this: import java.util.Date

Gatling Get web service

I have tried to create the simple Gatling script shown below,
package computerdatabase.advanced

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.jdbc.Predef._
import scala.util.matching.Regex
import scala.concurrent.duration._

class getSampleTest extends Simulation {

  val httpProtocol = http
    .baseURL("https://xyz.com")
    .header("Content-Type", "application/json")
    .header("Accept", " application/json ")
    .header("Accept-Charset", "utf-8n")
    .acceptLanguageHeader("en-us", "en;q=0.5")
    .acceptEncodingHeader("gzip", "deflate")
    .connection("keep-alive")

  val scn = scenario("XYZ")
    .group("XYZ Group") {
      exec(http("XYZ-PAge").get("/profile/services").check(status.is(200)))
    }

  setUp(scn.inject(
    rampUsersPerSec(1) to (10) during (5),
    constantUsersPerSec(10) during (5)
  ).protocols(httpProtocol))
}
but I am getting an error saying:
value header is not a member of io.gatling.http.config.httpProtocolBuilder
maybe a semicolon is missing before 'value header'
.header("Content-Type","application/json")
No, this is not the compiler error message you get with such code (this is the error you got with the first attempt you posted on the Gatling mailing list).
Here, you get "too many arguments for method acceptLanguageHeader" (and acceptEncodingHeader), as those take only one parameter:
.acceptLanguageHeader("en-us, en;q=0.5")
.acceptEncodingHeader("gzip, deflate")
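For reference, here is a sketch of the protocol definition with those two calls corrected (assuming the other builder calls from the question compile in your Gatling version; only the accept*Header arguments change):

val httpProtocol = http
  .baseURL("https://xyz.com")
  .header("Content-Type", "application/json")
  .header("Accept", "application/json")
  .acceptLanguageHeader("en-us, en;q=0.5")  // one string instead of two arguments
  .acceptEncodingHeader("gzip, deflate")    // one string instead of two arguments
  .connection("keep-alive")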

Get request with Rapture Http

I'm building an API with Rapture in Scala and having trouble resolving an issue with an implicit not being in scope. Here is the output from the error that I'm receiving.
[error] /Users/Petesta/Documents/scala-project/src/main/scala/scala-project/main.scala:35: an implicit TimeSystem is required; please import timeSystems.numeric or timeSystems.javaUtil
[error] Error occurred in an application involving default arguments.
[error] val response = h.get()
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
[error] Total time: 5 s, completed Oct 16, 2014 3:36:10 PM
Here is the code that it is failing on.
def getUser(userName: String) = {
  val h = Http / "some_url" / "user" / userName /? Map('key -> "value")
  val response = h.get()
}
I'm not sure what to do because I've tried importing both libraries separately and the error is still the same.
I've also added the -Xlog-implicits flag to see if something else is causing the error but no additional information is outputted.
Is there a good resource anywhere on using the rapture-net library for HTTP requests? I couldn't find one except for Jon Pretty's slides at Scala By The Bay, and I couldn't figure out how to pass a URL with query strings to rapture-uri, since it expects an invocation like uri"url_dot_domain_with_query_strings".slurp[Char].
Any ideas?
The compilation error is not entirely correct in this case. You need one of the two imports AND you need to specify a timeout value.
def getUser(userName: String) = {
  import timeSystems.numeric
  val h = Http / "some_url" / "user" / userName /? Map('key -> "value")
  val response = h.get(timeout = 5000L)
}
I don't really know of a good resource on it, but your basic single line of code is correct. The biggest problem with the library is really the documentation about the required imports. This is what I found works for me:
def getGoogle() = {
  import rapture.codec._
  import rapture.io._
  import rapture.uri._
  import rapture.net._
  import encodings.`UTF-8`

  uri"http://google.com".slurp[Char]
}

File writing iteratee does not receive EOF for WS.get

I have created a simple iteratee to download a file using WS as explained in this link.
Consider the following snippet:
import java.nio.ByteBuffer
import java.nio.channels.FileChannel

import org.specs2.mutable.Specification
import org.specs2.time.NoTimeConversions

import play.api.libs.Files.TemporaryFile
import play.api.libs.iteratee.{Done, Input, Cont, Iteratee}
import play.api.libs.ws.WS

import scala.concurrent.Await
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

class DummySpec extends Specification with NoTimeConversions {

  def fileChannelIteratee(channel: FileChannel): Iteratee[Array[Byte], Unit] = Cont {
    case Input.EOF =>
      println("Input.EOF")
      channel.close()
      Done(Unit, Input.EOF)
    case Input.El(bytes) =>
      println("Input.El")
      val buf = ByteBuffer.wrap(bytes)
      channel.write(buf)
      fileChannelIteratee(channel)
    case Input.Empty =>
      println("Input.Empty")
      fileChannelIteratee(channel)
  }

  "fileChannelIteratee" should {
    "work" in {
      val file = TemporaryFile.apply("test").file
      val channel = FileChannel.open(file.toPath)
      val future = WS.url("http://www.example.com").get(_ => fileChannelIteratee(channel)).map(_.run)
      Await.result(future, 10.seconds)
      file.length !== 0
    }
  }
}
Calling .map(_.run) after WS.get seems to have no effect here as the iteratee does not seem to receive Input.EOF. It prevents me from being able to close the channel. This is the output I get:
Input.El
[info] DummySpec
[info]
[info] fileChannelIteratee should
[info] x work (941 ms)
[error] '0' is equal to '0' (DummySpec.scala:37)
[info]
[info]
[info] Total for specification DummySpec
[info] Finished in 948 ms
[info] 1 example, 1 failure, 0 error
What am I doing wrong?
I am using Play Framework 2.2.2.
Thanks in advance.
I was opening the FileChannel in the wrong way. According to this link, it defaults to read mode when no open options are given.
The exception thrown from channel.write was being swallowed by the map operation, because the return type of the whole operation is Future[Future[Unit]]: the outer Future is in a successful state even if the inner one fails. flatMap should be used instead.
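A minimal sketch of those two fixes inside the original test (same Play 2.2 APIs as in the question; only the open options and the flatMap change):

import java.nio.file.StandardOpenOption

// Open the channel for writing (FileChannel.open defaults to read-only),
// and flatten the nested Future with flatMap so inner failures are not swallowed.
val channel = FileChannel.open(file.toPath, StandardOpenOption.WRITE)
val future = WS.url("http://www.example.com")
  .get(_ => fileChannelIteratee(channel))
  .flatMap(_.run)
Await.result(future, 10.seconds)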

Modularising scenarios to run in sequence using Gatling

I'm trying to modularise a series of performance tests in Gatling.
Several of the tests execute the same initial path through the pages, so I thought I could break them down into a series of scenarios, each scenario being a series of shared actions defined in its own file, with a final Simulation definition that simply executes the specified scenarios one after the other.
What I then need is for my Simulation to run those scenarios in sequence, but I can only find how to run them either concurrently or with a specified delay between each. Is there any Simulation setup option to run the defined scenarios one after the other, without specifying an arbitrary delay?
EDIT
Currently, I have the following set of files:
homepageHeaders.scala
package advanced

object homepageHeaders {
  val homepage_headers_1 = Map(
    "Accept" -> """text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8""",
    "If-Modified-Since" -> """Wed, 20 Mar 2013 15:36:31 +0000""",
    "If-None-Match" -> """"1363793791""""
  )
}
homepageChain.scala
package advanced

import com.excilys.ebi.gatling.core.Predef._
import com.excilys.ebi.gatling.http.Predef._
import com.excilys.ebi.gatling.jdbc.Predef._
import akka.util.duration._
import homepageHeaders._

object homepageChain {
  val homepageChain =
    // Homepage
    exec(http("homepage")
      .get("/")
      .headers(homepageHeaders.homepage_headers_1)
    )
}
pageHeaders.scala
package advanced

object pageHeaders {
  val page_headers_1 = Map(
    "Accept" -> """text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"""
  )
}
pageChain.scala
package advanced

import com.excilys.ebi.gatling.core.Predef._
import com.excilys.ebi.gatling.http.Predef._
import com.excilys.ebi.gatling.jdbc.Predef._
import akka.util.duration._
import pageHeaders._

object pageChain {
  val pageChain =
    // Page Menu
    exec(http("page request")
      .get("/page1")
      .headers(pageHeaders.page_headers_1)
    )
}
pageSimulation.scala
package advanced

import com.excilys.ebi.gatling.core.Predef._
import com.excilys.ebi.gatling.http.Predef._
import com.excilys.ebi.gatling.jdbc.Predef._
import homepageChain._
import pageChain._

class pageSimulation extends Simulation {

  val urlBase = "http://www.mytestsite.com"

  val httpConf = httpConfig
    .baseURL(urlBase)
    .acceptHeader("image/png,image/*;q=0.8,*/*;q=0.5")
    .acceptEncodingHeader("gzip, deflate")
    .acceptLanguageHeader("en-gb,en;q=0.5")
    .userAgentHeader("Mozilla/5.0 (Windows NT 6.1; WOW64; rv:17.0) Gecko/20100101 Firefox/17.0")

  val pageScenario = scenario("Bodycare Scenario")
    .exec(homepageChain.homepageChain)
    .exec(pageChain.pageChain)

  setUp(
    homepageScenario.users(1).protocolConfig(httpConf)
  )
}
The error that I'm getting is:
14:40:50.800 [ERROR] c.e.e.g.a.ZincCompiler$ - /Gatling/user-files/simulations/advanced/pageChain.scala:13: not found: value exec
14:40:50.807 [ERROR] c.e.e.g.a.ZincCompiler$ - exec(http("page request")
14:40:50.808 [ERROR] c.e.e.g.a.ZincCompiler$ - ^
14:40:53.988 [ERROR] c.e.e.g.a.ZincCompiler$ - /Gatling/user-files/simulations/advanced/homepageChain.scala:13: not found: value exec
14:40:53.989 [ERROR] c.e.e.g.a.ZincCompiler$ - exec(http("homepage")
14:40:53.989 [ERROR] c.e.e.g.a.ZincCompiler$ - ^
14:41:17.274 [ERROR] c.e.e.g.a.ZincCompiler$ - two errors found
Exception in thread "main" Compilation failed
Clearly I'm missing something in my definition, but I just don't understand what it is.
You can compose chains, not scenarios.
For example:
val login = exec(...)...
val foo = exec(...)...
val bar = exec(...)...
val scn1 = scenario("Scenario1").exec(login).exec(foo)
val scn2 = scenario("Scenario2").exec(login).exec(bar)
Clear?
You can cascade scenarios so that they execute in sequence as follows:
val allScenarios = scenario1.exec(scenario2).exec(scenario3)
Another option can be like this:
object GetAllRunDetails {
  val getAllRunDetails = exec( ....)
}

object GetRunIdDetails {
  val getRunIdDetails = exec( .... )
}

val scn1 = scenario("Scenario 1")
  .exec(GetAllRunDetails.getAllRunDetails)

val scn2 = scenario("Scenario 2")
  .exec(GetRunIdDetails.getRunIdDetails)

setUp(scn1.inject(atOnceUsers(1)),
      scn2.inject(atOnceUsers(1)));
Thanks to Stephane, who also gave me a solution: create an object holding multiple chains and pass it to a scenario.
val login = exec(...)...
val foo = exec(...)...
val bar = exec(...)...
val scn_inpute = Seq(login, foo, bar)
val scn1 = scenario("Scenario1").exec(scn_inpute)
Since Gatling 3.4, scenarios in the same simulation can be executed sequentially with andThen.
setUp(
  parent.inject(injectionProfile)
    // child1 and child2 will start at the same time, when the last parent user terminates
    .andThen(
      child1.inject(injectionProfile)
        // grandChild will start when the last child1 user terminates
        .andThen(grandChild.inject(injectionProfile)),
      child2.inject(injectionProfile)
    )
)
See the official documentation.