Gatling: Dynamically assemble HttpCheck for multiple CSS selectors - Scala

I am working on a Gatling test framework that can be parameterized through external config objects. One use case I have is that there may be zero or more CSS selector checks that need to be saved to variables. In my config object, I've implemented that as a Map[String, (String, String)], where the key is the variable name and the value is the two-part CSS selector.
I am struggling with how to dynamically assemble the check. Here's what I have so far:
val captureMap: Map[String, (String, String)] = config.capture

httpRequestBuilder.check(
  captureMap.map((mapping) => {
    val varName = mapping._1
    val cssSel = mapping._2
    css(cssSel._1, cssSel._2).saveAs(varName)
  }).toArray: _* // compilation error here
)
The error I'm getting is:
Error:(41, 10) type mismatch;
 found   : Array[io.gatling.core.check.CheckBuilder[io.gatling.core.check.css.CssCheckType,jodd.lagarto.dom.NodeSelector,String]]
 required: Array[_ <: io.gatling.http.check.HttpCheck]
  }).toArray: _*
Apparently, I need to turn my CheckBuilder into an HttpCheck. How do I do that?
Update:
I managed to get it to work by introducing a variable of type HttpCheck and returning it in the next line:
httpRequestBuilder.check(
  captureMap.map((mapping) => {
    val varName = mapping._1
    val cssSel = mapping._2
    val check: HttpCheck = css(cssSel._1, cssSel._2).saveAs(varName)
    check
  }).toArray: _*
)
While this works, it's ugly as hell. Can this be improved?

I had the same issue.
I had the following imports:
import io.gatling.core.Predef._
import io.gatling.http.Predef.http
I changed these imports to:
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.http.request.builder.HttpRequestBuilder.toActionBuilder
which made it work.
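With the wildcard io.gatling.http.Predef._ import in scope, Gatling's implicit conversion from CheckBuilder to HttpCheck is available, so the intermediate variable from the update above should no longer be needed. A sketch of a tidier assembly (same captureMap as in the question; not verified against every Gatling version):

import io.gatling.core.Predef._
import io.gatling.http.Predef._ // includes the CheckBuilder -> HttpCheck implicit

val captureMap: Map[String, (String, String)] = config.capture

httpRequestBuilder.check(
  captureMap.map { case (varName, (selector, attribute)) =>
    css(selector, attribute).saveAs(varName)
  }.toSeq: _*
)

Destructuring each entry with case (varName, (selector, attribute)) also removes the ._1/._2 noise from the original attempt.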

Related

Creating a DataFrame containing instances of a case class with numeric attribute names

Spark fails to convert a sequence of such instances to a DataFrame/Dataset, which is not a great developer experience.
Is this a bug or an expected feature?
Consider an example:
import spark.implicits._
case class Test(`9`: Double)
This fails:
val failingTest = Seq(Test(9.0)).toDF()
This works fine:
val successfulTest = Seq(9.0).toDF("9").as[Test]
successfulTest.show()
If you look at the rules for naming variables in Scala, the way you are specifying the variable name is not allowed:
The variable name must start with a letter and cannot begin with a number or other characters.
Just change your case class to use a plain variable name without the backticked number, as below, and it should work.
import spark.implicits._
case class Test(Id: Double)
val failingTest = Seq(Test(9.0)).toDS()
The above line of code gives you a Dataset of Test, as below:
failingTest: org.apache.spark.sql.Dataset[Test] = [Id: double]
val failingTest1 = Seq(Test(9.0)).toDF()
The above line gives you a Dataset of Row, which is a DataFrame, as below:
failingTest1: org.apache.spark.sql.DataFrame = [Id: double]
It gives you an error because you are not providing the name in the proper format.
You could also consider this a bug: no error is raised when you declare the case class itself, and I think there should be one.
Your code also works if you just replace the 9 with any letter, as below:
import spark.implicits._
case class Test1(`i`: Double)
val failingTest = Seq(Test1(9.0)).toDF()

How to use TypeInformation in a generic method using Scala

I'm trying to create a generic method in Apache Flink to parse a DataSet[String] (JSON strings) using case classes. I tried to use TypeInformation as described here: https://ci.apache.org/projects/flink/flink-docs-stable/dev/types_serialization.html#generic-methods
I'm using liftweb to parse the JSON strings; this is my code:
import net.liftweb.json._
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._

class Loader(settings: Map[String, String])(implicit environment: ExecutionEnvironment) {
  val env: ExecutionEnvironment = environment

  def load[T: TypeInformation](): DataSet[T] = {
    val data: DataSet[String] = env.fromElements(
      """{"name": "name1"}""",
      """{"name": "name2"}"""
    )
    implicit val formats = DefaultFormats
    data.map(item => parse(item).extract[T])
  }
}
But I got the error:
No Manifest available for T
data.map(item => parse(item).extract[T])
Then I tried to add a Manifest and remove the TypeInformation, like this:
def load[T: Manifest](): DataSet[T] = { ...
And I got this error:
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[T]
I'm very confused about this; I'd really appreciate your help.
Thanks.
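The two errors point at two separate implicit requirements: lift-json's extract[T] needs a Manifest[T], while Flink's map needs a TypeInformation[T] (plus a ClassTag, which a Manifest also satisfies). A sketch, under the assumption that requiring both context bounds works at your call sites:

def load[T: TypeInformation: Manifest](): DataSet[T] = {
  val data: DataSet[String] = env.fromElements(
    """{"name": "name1"}""",
    """{"name": "name2"}"""
  )
  implicit val formats = DefaultFormats
  // extract[T] picks up the Manifest; Flink's map picks up the TypeInformation
  data.map(item => parse(item).extract[T])
}

Callers would then name a concrete type, e.g. loader.load[MyCaseClass](), so both implicits can be materialized.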

Generating import statements with Scala macros

I have the following code:
@mymacro @imports
val _1 = { import scala.collection.mutable.ListBuffer }

@mymacro
val _2 = { val yy: ListBuffer[Int] = ListBuffer.empty }
@mymacro is a Scala macro annotation that checks whether it has also been annotated with the @imports annotation. Part of the implementation is as follows:
case (cc @ q"${mods: Modifiers} val $tname: ${tpt: Tree} = ${expr: Tree}") :: Nil =>
  if (tname.toString().startsWith("_"))
    if (checkImports(mods, expr)) {
      q"import scala.collection.mutable.ListBuffer"
    } else {
      q"{$expr}"
    }
Currently the macro is able to transform the whole val _1 = ... statement into import scala.collection.mutable.ListBuffer (without the {} brackets!), but when compilation continues, I keep getting the not found: type ListBuffer compilation error. Now I wonder whether this error can be fixed without having to define the import statement at the top of the file.
I am using the Scala 2.10 macro paradise plugin.
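As an aside for readers unfamiliar with the q"..." interpolator used in the match above: quasiquotes both construct and pattern-match ASTs. A tiny standalone sketch (assuming Scala 2.11+, where quasiquotes ship with scala-reflect; the post itself is on 2.10 with the paradise plugin):

import scala.reflect.runtime.universe._

object QuasiquoteDemo extends App {
  // Build the same tree the macro generates for the annotated val
  val importTree = q"import scala.collection.mutable.ListBuffer"
  println(show(importTree))    // prints: import scala.collection.mutable.ListBuffer
  println(showRaw(importTree)) // prints the raw Import(...) tree
}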

Converting to net.liftweb.json.JsonAST.JObject with Lift in Scala

I am trying to construct a JSON object from a list, where the key is "products" and the value is a List[Product], where Product is a case class. But I am getting an error that says: type mismatch; found: (String, List[com.mycompnay.ws.client.Product]), required: net.liftweb.json.JObject (which expands to net.liftweb.json.JsonAST.JObject).
What I have done so far is as below:
val resultJson:JObject = "products" -> resultList
println(compact(render(resultJson)))
You're looking for decompose (doc). See this answer.
I tested the following code and it worked fine:
import net.liftweb.json._
import net.liftweb.json.JsonDSL._
import net.liftweb.json.Extraction._
implicit val formats = net.liftweb.json.DefaultFormats
case class Product(foo: String)
val resultList: List[Product] = List(Product("bar"), Product("baz"))
val resultJson: JObject = ("products" -> decompose(resultList))
println(compact(render(resultJson)))
Result:
{"products":[{"foo":"bar"},{"foo":"baz"}]}

Struggling with Play.current.configuration.getStringList("mongodb.replicaSetSeeds") Option handling

I have a conf/application.conf setting like
mongodb.replicaSetSeeds = ["bobk-mbp.local:27017","bobk-mbp.local:27018"]
I'm pulling it out in my code like this (the actual extraction is a little different, but this is the gist of it):
val replicaSetSeeds = Play.current.configuration.getStringList("mongodb.replicaSetSeeds")
val listOfString: List[String] = replicaSetSeeds.getOrElse(List("localhost"))
but the compiler hates me:
type mismatch; found: Object, required: List[String]
The signature of getStringList is
def getStringList(path: String): Option[java.util.List[String]]
How do I handle the None case here, or is my problem that List[String] is not the same as java.util.List[String]?
Give this a shot:
import collection.JavaConversions._
val optList: Option[List[String]] = Play.current.configuration.getStringList("mongodb.replicaSetSeeds").map(_.toList)
val list = optList.getOrElse(List("localhost"))
There are multiple things going on here. First, you need to import the JavaConversions implicits, because what's being returned is an Option[java.util.List[String]] and we want a Scala List instead. By calling map(_.toList), I'm forcing the implicit conversion to kick in, giving me an Option[List[String]], and from there things are pretty straightforward.
In Play 2.5, you need to use dependency injection; the following works well for me:
1) In your class, inject Configuration:
class Application @Inject()(
  configuration: play.api.Configuration
) ...
2) In your method:
import scala.collection.JavaConversions._
val optList = configuration.getStringList("mongodb.replicaSetSeeds").map(_.toList)
val list = optList.getOrElse(List("localhost"))
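A self-contained sketch of the same pattern using scala.collection.JavaConverters (the explicit .asScala style) instead of the implicit JavaConversions; the class and config key are carried over from the answer above:

import javax.inject.Inject
import scala.collection.JavaConverters._

class Application @Inject()(configuration: play.api.Configuration) {
  // getStringList returns Option[java.util.List[String]];
  // .asScala converts the Java list, .toList makes it an immutable scala.List
  val replicaSetSeeds: List[String] =
    configuration.getStringList("mongodb.replicaSetSeeds")
      .map(_.asScala.toList)
      .getOrElse(List("localhost"))
}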