Makumba - MDD functions

I was looking at the MDD function documentation and wondered whether there is some way to have control flow inside MDD functions.
In my type definition there is a type that has a pointer to other entities of its own type, which might be null. I'd like to be able to iterate through them, extracting data from another field, until that pointer is null.
Any idea how to achieve this?

Since MDD functions are based on HQL, you can use most of the expressions available in it.
For control flow there is a SQL-like CASE statement.
For example:
getSomeData() { CASE WHEN (pointer1 <> nil)
                THEN CASE WHEN (pointer1.pointer2 <> nil)
                          THEN pointer1.pointer2.field3
                          ELSE pointer1.field2 END
                ELSE field1 END }

Related

Can't use Gatling Expression Language inside my own code/functions

I'm creating one report for all simulations in Scala. In some of the simulations I don't have all the attributes, and I need to substitute a default value because I'm getting a "no attribute defined" error.
I tried the code below and many, many variations:
val dbc_communityID = if ("${communityID.exists()}" == true) "${communityID}" else "N/A"
I always get "N/A". I also tried
== 1
== "true"
I've definitely spent too much time on this.
Please help.
Quoting the official Gatling documentation:
This Expression Language only works on String values being passed to Gatling DSL
methods. Such Strings are parsed only once, when the Gatling
simulation is being instantiated.
For example queryParam("latitude", session => "${latitude}") wouldn’t
work because the parameter is not a String, but a function that
returns a String.
Also, queryParam("latitude", "${latitude}".toInt) wouldn't work because the
toInt would happen before passing the parameter to the queryParam
method.
The solution here would be to pass a function:
session => session("latitude").validate[Int].
In short: you can't mix Gatling EL and functions/custom code. If you want to implement some custom logic that deals with Session data, you have to use the Session API, e.g.:
session => session("communityID").asOption[String].getOrElse("N/A")
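For context, here is a minimal sketch of how such an expression might be plugged into a scenario step with the Session API; the exec wrapper and the dbc_communityID attribute name are assumptions for illustration, not part of the original answer:
import io.gatling.core.Predef._

// Resolve the optional attribute once and store the result under a new key,
// so later EL references like "${dbc_communityID}" always resolve.
val resolveCommunity = exec { session =>
  session.set("dbc_communityID",
    session("communityID").asOption[String].getOrElse("N/A"))
}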
if ("${communityID.exists()}" == true)
You are comparing a fixed String with a Boolean so the result will always be false. There is probably a compiler warning about this.
It is possible that you want this, but it's hard to say without knowing more about the rest of the code:
if (s"${communityID.exists()}" == "true")
But in that case it is simpler to write
if (communityID.exists())

Anorm: WHERE condition, conditionally

Consider a repository/DAO method like this, which works great:
def countReports(customerId: Long, createdSince: ZonedDateTime) =
  DB.withConnection { implicit c =>
    SQL"""SELECT COUNT(*)
          FROM report
          WHERE customer_id = $customerId
            AND created >= $createdSince
       """.as(scalar[Int].single)
  }
But what if the method is defined with optional parameters:
def countReports(customerId: Option[Long], createdSince: Option[ZonedDateTime])
Point being, if either optional argument is present, use it in filtering the results (as shown above), and otherwise (in case it is None) simply leave out the corresponding WHERE condition.
What's the simplest way to write this method with optional WHERE conditions? As Anorm newbie I was struggling to find an example of this, but I suppose there must be some sensible way to do it (that is, without duplicating the SQL for each combination of present/missing arguments).
Note that the java.time.ZonedDateTime instance maps perfectly and automatically into Postgres timestamptz when used inside the Anorm SQL call. (Trying to extract the WHERE condition as a string, outside SQL, created with normal string interpolation did not work; toString produces a representation not understood by the database.)
Play 2.4.4
One approach is to set up filter clauses such as
val customerClause =
  if (customerId.isEmpty) ""
  else " and customer_id={customerId}"
then substitute these into your SQL:
SQL(s"""
select count(*)
from report
where true
$customerClause
$createdClause
""")
.on('customerId -> customerId,
'createdSince -> createdSince)
.as(scalar[Int].singleOpt).getOrElse(0)
Using {variable} as opposed to $variable is I think preferable as it reduces the risk of SQL injection attacks where someone potentially calls your method with a malicious string. Anorm doesn't mind if you have additional symbols that aren't referenced in the SQL (i.e. if a clause string is empty). Lastly, depending on the database(?), a count might return no rows, so I use singleOpt rather than single.
I'm curious as to what other answers you receive.
Edit: Anorm interpolation (i.e. SQL"...", an interpolation implementation beyond Scala's s"...", f"..." and raw"...") was introduced to allow the use of $variable as equivalent to {variable} with .on. And from Play 2.4, Scala and Anorm interpolation can be mixed, using $ for Anorm (SQL parameter/variable) and #$ for Scala (plain string). And indeed this works well, as long as the Scala-interpolated string does not contain references to an SQL parameter. The only way, in 2.4.4, that I could find to use a variable in a Scala-interpolated string when using Anorm interpolation was:
val limitClause = if (nameFilter == "") "" else s"where name>'$nameFilter'"
SQL"select * from tab #$limitClause order by name"
But this is vulnerable to SQL injection (e.g. a string like it's will cause a runtime syntax exception). So, in the case of variables inside interpolated strings, it seems it is necessary to use the "traditional" .on approach with only Scala interpolation:
val limitClause = if (nameFilter == "") "" else "where name>{nameFilter}"
SQL(s"select * from tab $limitClause order by name").on('nameFilter -> nameFilter)
Perhaps in the future Anorm interpolation could be extended to parse the interpolated string for variables?
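As an illustration of the #$ caveat above, here is a hedged sketch of how the raw fragment can still be used safely when it comes from code you control rather than from user input (the table and variable names are assumptions):
import anorm._

// #$ splices raw text into the statement, so it must never carry user input;
// here the sort column is hard-coded (in real code, pick it from a whitelist).
val nameFilter = "A"
val sortColumn = "created"
val query = SQL"select * from tab where name > $nameFilter order by #$sortColumn"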
Edit2: I'm finding there are some tables where the number of attributes that might or might not be included in the query changes from time to time. For these cases I'm defining a context class, e.g. CustomerContext. In this case class there are lazy vals for the different clauses that affect the sql. Callers of the sql method must supply a CustomerContext, and the sql will then have inclusions such as ${context.createdClause} and so on. This helps give a consistency, as I end up using the context in other places (such as total record count for paging, etc.).
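A minimal sketch of what such a context class might look like (the class and clause names are illustrative, not taken from the actual code):
import java.time.ZonedDateTime

// Each optional filter contributes its clause only when a value is present;
// the clauses keep {placeholders} so the values are still bound with .on.
case class CustomerContext(customerId: Option[Long], createdSince: Option[ZonedDateTime]) {
  lazy val customerClause: String =
    if (customerId.isDefined) " and customer_id = {customerId}" else ""
  lazy val createdClause: String =
    if (createdSince.isDefined) " and created >= {createdSince}" else ""
}
The SQL string is then built with Scala interpolation, e.g. s"... ${context.customerClause} ${context.createdClause}", and the values are bound with .on as in the earlier snippet.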
Finally got this simpler approach posted by Joel Arnold to work in my example case, also with ZonedDateTime!
def countReports(customerId: Option[Long], createdSince: Option[ZonedDateTime]) =
  DB.withConnection { implicit c =>
    SQL("""
        SELECT count(*) FROM report
        WHERE ({customerId} is null or customer_id = {customerId})
          AND ({created}::timestamptz is null or created >= {created})
        """)
      .on('customerId -> customerId, 'created -> createdSince)
      .as(scalar[Int].singleOpt).getOrElse(0)
  }
The tricky part is having to use {created}::timestamptz in the null check. As Joel commented, this is needed to work around a PostgreSQL driver issue.
Apparently the cast is needed only for timestamp types, and the simpler way ({customerId} is null) works with everything else. Also, comment if you know whether other databases require something like this, or if this is a Postgres-only peculiarity.
(While wwkudu's approach also works fine, this definitely is cleaner, as you can see comparing them side to side in a full example.)

Best way to implement behavior based on type in Scala

In a friendly chat I was having with a friend during a code review, we noticed that the code contained a lot of:
unknownTypeVal match {
  case asStr: String => //DO SOMETHING FOR STRING
  case asInt: Integer => //DO SOMETHING FOR Integer
  case asMyOwnClass: MyOwnClass => //DO SOMETHING FOR MyOwnClass
}
a problem originally caused by methods that return Any or Option, and there is no way to avoid it because we are using libraries such as XPath and JSONPath, which return instances of Any or Option for a provided path.
I don't want to get into discussions of "preference"; this is not an opinion question. I want to know whether there is a standard, preferably defined by Scala or some other organization of impact, for doing this kind of "type checking" in a more organized way. We think this functionality could be reduced to a single call to a method holding a map of functions that, based on "something" (the name of the class, or something else I don't know right now), determines how to process the parameter:
def process(myAnonymousVal: Any) = myMapOfFunct(myAnonymousVal.getClass) // look up the function for this class and apply it to myAnonymousVal
What is encouraged by Scala developers or the Scala community?
In principle, match is the cleanest way to execute code conditionally on what an Any value actually is. Any chain of if-else and isInstanceOf checks is bound to turn out even more cumbersome and less elegant. A possible exception is a case where you already know what the actual type is and can act accordingly; there a direct cast might be permissible.
That said, if you find yourself making the same matches many times, you might as well encapsulate the match in order to avoid code repetition. A partial function might be exactly what you have in mind here.
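As a hedged illustration of that suggestion, here is one way the repeated match could be captured as a partial function (MyOwnClass and the handler bodies are placeholders):
// Placeholder type standing in for the domain class from the question.
case class MyOwnClass(name: String)

// The repeated match, written once as a partial function.
val handle: PartialFunction[Any, String] = {
  case asStr: String => s"got a String: $asStr"
  case asInt: Integer => s"got an Integer: $asInt"
  case asMyOwnClass: MyOwnClass => s"got a MyOwnClass: ${asMyOwnClass.name}"
}

// applyOrElse provides a fallback for types the partial function doesn't cover.
def process(value: Any): String =
  handle.applyOrElse(value, (other: Any) => s"unhandled type: ${other.getClass.getName}")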

How to use Javascript's for (attr in this) with Coffeescript

In Javascript, the "for (attr in this)" is often dangerous to use... I agree. That's one reason I like Coffeescript. However, I'm programming in Coffeescript and have a case where I need Javascript's "for (attr in this)". Is there a good way to do this in Coffeescript?
What I am doing now is writing a bunch of logic in embedded raw Javascript, such as:
...coffeescript here...
for (attr in this) {
  if (stuff here) {
    etc
  }
}
It'd be nice to use as little Javascript as possible... any suggestions for how I can achieve this and maximize my use of Coffeescript?
Instead of for item in items, which iterates through arrays, you can use for attr, value of object, which works more like JavaScript's for...in.
for own attr, value of this
  if attr == 'foo' && value == 'bar'
    console.log 'Found a foobar!'
Compiled: https://gist.github.com/62860f0c07d60320151c
It accepts both the key and the value in the loop, which is very handy. And you can insert the own keyword right after for, as shown above, to enforce an object.hasOwnProperty(attr) check, which filters out anything inherited from the prototype that you don't want in there.
Squeegy's answer is correct. Let me just amend it by adding that the usual solution to JavaScript's for...in being "dangerous" (by including prototype properties) is to add a hasOwnProperty check. CoffeeScript can do this automatically using the special own keyword:
for own attr of this
...
is equivalent to the JavaScript
for (attr in this) {
  if (!Object.prototype.hasOwnProperty.call(this, attr)) continue;
  ...
}
When in doubt about whether you should use for...of or for own...of, it's generally safer to use own.
You can use for x in y or for x of y depending on how you want to interpret the elements. The newest version of CoffeeScript aims to solve this problem; you can read about it in an issue (which has since been implemented and closed) on GitHub.

workaround for final == and != (equals and not equals) methods in scala DSL

So I'm wrapping bits of the Mechanical Turk API, and you need to specify qualification requirements such as:
Worker_Locale == "US"
Worker_PercentAssignmentsApproved > 95
...
In my code, I'd like to allow the syntax above and have these translated into something like:
QualificationRequirement("00000000000000000071", "LocaleValue.Country", "EqualTo", "US")
QualificationRequirement("000000000000000000L0", "IntegerValue", "GreaterThan", 95)
I can achieve most of what I want by declaring an object like:
object Worker_PercentAssignmentsApproved {
  def >(x: Int) = {
    QualificationRequirement("000000000000000000L0", "IntegerValue", "GreaterThan", x)
  }
}
But I can't do the same thing for the "==" (equals) or "!=" (not equals) methods since they're declared final in AnyRef. Is there a standard workaround for this? Perhaps I should just use "===" and "!==" instead?
(I guess one good answer might be a summary of how a few different scala DSLs have chosen to work around this issue and then I can just do whatever the majority of those do.)
Edit: Note that I'm not trying to actually perform an equality comparison. Instead, I'm trying to observe the comparison operator the user indicated in scala code, save an object based description of that comparison, and give that description to the server. Specifically, the following scala code:
Worker_Locale == "US"
will result in the following parameters being added to my request:
&QualificationRequirement.1.QualificationTypeId=000000000000000000L0
&QualificationRequirement.1.Comparator=EqualTo
&QualificationRequirement.1.LocaleValue.Country=US
So I can't override equals since it returns a Boolean, and I need to return a structure that represents all these parameters.
If you look at the definition of == and != in the Scala reference (§ 12.1), you'll find that they are defined in terms of eq and equals.
eq is reference equality and is also final (in that definition it is only used to check for null), but you should be able to override equals.
Note that you’ll probably also need to write the hashCode method to ensure
∀ o1, o2 with o1.equals(o2) ⇒ (o1.hashCode.equals(o2.hashCode)).
However, if you need some other return type for your DSL than Boolean or more flexibility in general, you should maybe use ===, as has been done in Squeryl for example.
Here's a little survey of what various DSLs use for this kind of thing.
Liftweb uses === in Javascript expressions:
JsIf(ValById("username") === value.toLowerCase, ...)
Squeryl uses === for SQL expressions:
authors.where(a=> a.lastName === "Pouchkine")
querydsl uses $eq for SQL expressions:
person.firstName $eq "Ben"
Prolog-in-Scala uses === for Prolog expressions:
'Z === 'A
Scalatest uses === to get an Option instead of a Boolean:
assert("hello" === "world")
So I think the consensus is mostly to use ===.
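For completeness, a hedged sketch of what the === / !== variant might look like for the qualification DSL in the question (the QualificationRequirement shape and the type ids are taken from the question; the NotEqualTo comparator name is an assumption):
// === and !== are ordinary methods, so they are free to return the
// request-description object instead of a Boolean.
case class QualificationRequirement(qualificationTypeId: String, valueKind: String, comparator: String, value: Any)

object Worker_Locale {
  def ===(country: String): QualificationRequirement =
    QualificationRequirement("00000000000000000071", "LocaleValue.Country", "EqualTo", country)
  def !==(country: String): QualificationRequirement =
    QualificationRequirement("00000000000000000071", "LocaleValue.Country", "NotEqualTo", country)
}

// Usage: Worker_Locale === "US" yields QualificationRequirement(..., "EqualTo", "US")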
I've been considering a similar problem. I was thinking of creating a DSL for writing domain-specific formulas. The trouble is that users might want to do string manipulation too, and you end up with expressions like
"some string" + <someDslConstruct>
No matter what you do, it's going to lex this as something like
stringToLiteralString("some string" + <someDslConstruct>)
I think the only potential way out of this pit would be to try using macros. In your example, perhaps you could have a macro that wraps a Scala expression and converts the raw AST into a query? Doing this for arbitrary expressions wouldn't be feasible, but if your domain is sufficiently well constrained it might be a workable alternative solution.