I'm currently doing that most noble of programming endeavors: writing tests for JSON encoding/decoding. I'm using Argonaut.io for JSON and ScalaTest as my testing framework. Under ScalaTest, using === in an assertion provides additional information if a failure occurs, while == simply gives an unhelpful "org.scalatest.exceptions.TestFailedException was thrown". However, the Scala compiler is not happy. Here's the code:
val default = new Broadcast("default", "default", "default")
test("Should parse out network when present") {
val hcursor = testHCursor(jsonPath + "complete-broadcast.json")
val actualNetwork = Parser.BroadcastDecodeJson(hcursor)
.getOr(default)
.network
assert(actualNetwork === "ESPNU")
}
That spews out this:
[info] Compiling 1 Scala source to /home/vagrant/waltercamp/waltercamp-dataservice/target/scala-2.10/test-classes...
[error] /home/vagrant/waltercamp/waltercamp-dataservice/src/test/scala/io/ptx/waltercamp/schedules/BroadcastParserSuite.scala:16: type mismatch;
[error] found : actualNetwork.type (with underlying type String)
[error] required: ?{def ===(x$1: ? >: String("ESPNU")): ?}
[error] Note that implicit conversions are not applicable because they are ambiguous:
[error] both method ToEqualOps in trait ToEqualOps of type [F](v: F)(implicit F0: scalaz.Equal[F])scalaz.syntax.EqualOps[F]
[error] and method convertToEqualizer in trait Assertions of type (left: Any)BroadcastParserSuite.this.Equalizer
[error] are possible conversion functions from actualNetwork.type to ?{def ===(x$1: ? >: String("ESPNU")): ?}
[error] assert(actualNetwork === "ESPNU")
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
Using ==, however, compiles cleanly and the test passes. Is there a way to give the compiler a hint as to which conversion, or which conversion order, to use?
I'd go with ScalaTest's version here. One approach would be to apply the conversion explicitly:
assert(convertToEqualizer(actualNetwork) === "ESPNU")
That's kind of unpleasant, though, and involves a lot of repetitious boilerplate if you're using === many times in a file. Another way would be to exclude the Scalaz conversion from the general import:
import scalaz._, Scalaz.{ ToEqualOps => _, _ }
You could also switch to à la carte imports for Scalaz and just be sure you don't pull in ToEqualOps via scalaz.syntax.equal._. I'll admit I find à la carte imports a pain to maintain sometimes, but if you're not doing much with Scalaz in the test this wouldn't be too bad.
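For instance, a minimal sketch of the à la carte style (assuming Scalaz 7's std package layout), importing only the instances you need so ScalaTest's === stays unambiguous:

import scalaz.std.string._   // Equal[String] and other String instances
import scalaz.std.option._   // instances for Option
// crucially, no scalaz.syntax.equal._, so ToEqualOps never enters scope
// and ScalaTest's convertToEqualizer is the only === conversion in the file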
Related
I was wondering why ScalaTest behaves differently from Specs2.
Specs2
"" must be equalTo 3
TestSpec.scala:11:26: type mismatch;
[error] found : Int(3)
[error] required: String
[error] "" must be equalTo 3
ScalaTest
"" should === (3)
[info] Done compiling.
[info] TestTest:
[info] Dummy test
[info] - should fail *** FAILED ***
[info] "" did not equal 3 (PersistentTaskRuntimeTest.scala:21)
ScalaTest by default only fails at runtime; everything is compared as Any-to-Any.
There is the SuperSafe plugin to get better checks (or TypeCheckedTripleEquals), but these feel like hacks, as Specs2 just uses the Scala compiler to require the types of the two compared values to be in a subtype/supertype relationship.
For reference, this is the output when using TypeCheckedTripleEquals (mind the hacky CanEqual):
TestTest.scala:21:7: types String and Int do not adhere to the type constraint selected for the === and !== operators; the missing implicit parameter is of type org.scalactic.CanEqual[String,Int]
[error] "" should === (3)
[error] ^
So what is the rationale behind this?
Less work for the Scala compiler?
Less code to write for ScalaTest?
Less implicit magic (cryptic error messages)?
Pushing a commercial compiler plugin?
TypeCheckedTripleEquals uses generalised type constraints, for example B <:< A in:
implicit override def typeCheckedConstraint[A, B](implicit equivalenceOfA: Equivalence[A], ev: B <:< A): A CanEqual B
This is standard Scala functionality to enforce compile-time safety, and is used in many widespread Scala libraries. For example,
import org.scalactic.TypeCheckedTripleEquals
import org.scalatest._
class CompileTimeSafetySpec extends FlatSpec with Matchers with TypeCheckedTripleEquals {
  "TypeCheckedTripleEquals" should "provide compile-time safety" in {
    3 should === ("r")
  }
}
gives the compiler error:
types Int and String do not adhere to the type constraint selected for the === and !== operators; the missing implicit parameter is of type org.scalactic.CanEqual[Int,String]
[error] 3 should === ("r")
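The same mechanism can be sketched in isolation; the method below is hypothetical, but it shows how a <:< evidence parameter makes an ill-typed comparison fail at compile time:

// compiles only when the compiler can prove B is a subtype of A;
// ev is the compile-time witness and can be applied as a function B => A
def strictEq[A, B](a: A, b: B)(implicit ev: B <:< A): Boolean = a == ev(b)

strictEq("x", "y")   // compiles: String <:< String holds
// strictEq("x", 3)  // rejected at compile time: no Int <:< String exists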
I have been trying to create a generic DAO over Slick 3.1.1, including a generic filter comparable to JPA's findByExample; see the following files:
GenericDaoImpl.scala Generic level reusable across all Models
UserDao.scala Generic plus customizations for the User model
UserService.scala Wraps the UserDao into more services level functionality
In this last file I try to use the generic filter function to find a user by its registered email, like this:
// this will implicitly exec and wait indefinitely for the
// db.run Future to complete
import dao.ExecHelper._
def findByEmail(email: String): Option[UserRow] = {
  userDao.filter(_.email === email).headOption
}
but this produces the compiler error:
[error] /home/bravegag/code/play-authenticate-usage-scala/app/services/UserService.scala:35: value === is not a member of String
[error] userDao.filter(email === _.email).headOption
[error] ^
[error] /home/bravegag/code/play-authenticate-usage-scala/app/services/UserService.scala:35: ambiguous implicit values:
[error] both value BooleanOptionColumnCanBeQueryCondition in object CanBeQueryCondition of type => slick.lifted.CanBeQueryCondition[slick.lifted.Rep[Option[Boolean]]]
[error] and value BooleanCanBeQueryCondition in object CanBeQueryCondition of type => slick.lifted.CanBeQueryCondition[Boolean]
[error] match expected type slick.lifted.CanBeQueryCondition[Nothing]
[error] userDao.filter(email === _.email).headOption
[error] ^
Can anyone advise on how the implicit declaration of the filter function below can be improved to solve this compiler error?
The implementation of the filter function (found in GenericDaoImpl.scala) is:
// T is defined above as T <: Table[E] with IdentifyableTable[PK]
override def filter[C <: Rep[_]](expr: T => C)
                                (implicit wt: CanBeQueryCondition[C]): Future[Seq[E]] =
  db.run(tableQuery.filter(expr).result)
As far as I can see you are simply lacking your profile API import in UserService.
Just add this import there: import profile.api._ and it should work.
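A sketch of findByEmail with the import in place (assuming the Slick profile is reachable as profile inside UserService, as in the question's setup):

// this will implicitly exec and wait indefinitely for the db.run Future
import dao.ExecHelper._
// brings the lifted === on Rep[String] into scope, resolving the error
import profile.api._

def findByEmail(email: String): Option[UserRow] =
  userDao.filter(_.email === email).headOption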
EDIT: BTW, I see many people building their own versions of base CRUDs for Slick. Did you try any of the existing thin libraries that do just that, e.g. https://github.com/VirtusLab/unicorn? It's not really related to this question, but it may be worth a look.
I'm cutting my teeth on Akka HTTP by working through this example. For the purposes of learning, I converted it to a Maven project. However, I'm getting the following compilation errors using Akka v2.3.12 and Akka Stream v1.0. The POST DSL fails with similar errors, which I'm omitting for brevity. How can I get the example to run?
pathPrefix("ip") {
(get & path(Segment)) { ip =>
complete {
fetchIpInfo(ip).map[ToResponseMarshallable] {
case Right(ipInfo) => ipInfo
case Left(errorMessage) => BadRequest -> errorMessage
}
}
}
[ERROR] found : akka.http.scaladsl.server.Directive[(String,)]
[ERROR] required: ?{def apply: ?}
[ERROR] Note that implicit conversions are not applicable because they are ambiguous:
[ERROR] both method addDirectiveApply in object Directive of type [L](directive: akka.http.scaladsl.server.Directive[L])(implicit hac: akka.http.scaladsl.server.util.ApplyConverter[L])hac.In => akka.http.scaladsl.server.Route
[ERROR] and method fromDirective in object ConjunctionMagnet of type [L, R](other: akka.http.scaladsl.server.Directive[R])(implicit join: akka.http.scaladsl.server.util.TupleOps.Join[L,R])akka.http.scaladsl.server.ConjunctionMagnet[L]{type Out = akka.http.scaladsl.server.Directive[join.Out]}
[ERROR] are possible conversion functions from akka.http.scaladsl.server.Directive[(String,)] to ?{def apply: ?}
[ERROR] (get & path(Segment)) { ip =>
error: akka.http.scaladsl.server.Directive[(String,)] does not take parameters
[ERROR] (get & path(Segment)) { ip =>
It turns out this is due to the deep implicit chain that Spray (and hence akka-http) uses, so getting the imports right is crucial. Very few examples show the imports, and those that do use old libraries.
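For reference, a sketch of the imports that typically make this route compile (package paths inferred from the error output; the global execution context is an assumption, needed for the Future's map):

import akka.http.scaladsl.marshalling.ToResponseMarshallable
import akka.http.scaladsl.model.StatusCodes.BadRequest
import akka.http.scaladsl.server.Directives._              // pathPrefix, get, path, complete, &
import scala.concurrent.ExecutionContext.Implicits.global  // for fetchIpInfo(ip).map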
Right now I'm trying to instantiate a new JSONConverter to register Jackson's Scala module.
private def getConverter(implicit m: ClassTag[T]) = {
  val converter = new JSONConverter[T](classTag[T].runtimeClass, bucketName)
  JSONConverter.registerJacksonModule(DefaultScalaModule)
  converter
}
The above code sits in a standard Scala trait that looks like trait Writeable[T] { }.
The problem with the above code is that Scala seems to be having a difficult time with the types. The compiler error is:
[error] found : Class[_$1] where type _$1
[error] required: Class[T]
[error] val converter = new JSONConverter[T](classTag[T].runtimeClass, bucketName(clientId))
[error] ^
[error] one error found
Anyone know the source or easy fix of this issue? Thanks!
Update
Although @wingedsubmariner had an answer that made this compile originally, as soon as I went to write more code the issue cascaded further. I'll show an example:
val o = bucketLookup(clientId)
  .fetch(id, classTag[T].runtimeClass)
  .withConverter(converter)
  .withRetrier(DB.retrier)
  .r(DB.N_READ)
  .execute()
At withConverter the compiler throws the same error:
[error] found : com.basho.riak.client.convert.JSONConverter[T]
[error] required: com.basho.riak.client.convert.Converter[_$1] where type _$1
[error] val o = bucketLookup(clientId).fetch(id, classTag[T].runtimeClass).withConverter(converter).withRetrier(DB.retrier).r(DB.N_READ).execute()
I even tried the same type cast using converter.asInstanceOf[JSONConverter[T]], but the inheritance (JSONConverter<T> extends Converter<T>) seems to cascade the issue. Any ideas here?
runtimeClass is returning a Class with the wrong type parameter. Try:
new JSONConverter(classTag[T].runtimeClass.asInstanceOf[Class[T]], bucketName(clientId))
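For the cascading error in the update, the same cast can be applied once and reused so that fetch and withConverter agree on Class[T]; a sketch using the question's own names (untested against the Riak client API):

val klass = classTag[T].runtimeClass.asInstanceOf[Class[T]]  // single cast, reused
val converter = new JSONConverter[T](klass, bucketName(clientId))
val o = bucketLookup(clientId)
  .fetch(id, klass)            // now typed with T rather than the wildcard _$1
  .withConverter(converter)
  .withRetrier(DB.retrier)
  .r(DB.N_READ)
  .execute()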
import scala.slick.driver.MySQLDriver.simple._
class RichTable[T](tag: Tag, name: String) extends Table[T](tag, name) {
  case class QueryExt[B](q: Query[RichTable.this.type, B]) {
    def whereEq[C](col: RichTable.this.type => Column[C], c: C) = {
      q.filter { fields =>
        col(fields) === c
      }
    }
  }
}
Then it complains
[error] /home/jilen/workspace/play-slick/src/main/scala/play/slick/SlickQueryExtension.scala:10: value === is not a member of slick.driver.MySQLDriver.simple.Column[C]
[error] col(fields) === c
[error] ^
[error] /home/jilen/workspace/play-slick/src/main/scala/play/slick/SlickQueryExtension.scala:9: ambiguous implicit values:
[error] both value BooleanColumnCanBeQueryCondition in object CanBeQueryCondition of type => scala.slick.lifted.CanBeQueryCondition[scala.slick.lifted.Column[Boolean]]
[error] and value BooleanOptionColumnCanBeQueryCondition in object CanBeQueryCondition of type => scala.slick.lifted.CanBeQueryCondition[scala.slick.lifted.Column[Option[Boolean]]]
[error] match expected type scala.slick.lifted.CanBeQueryCondition[Nothing]
[error] q.filter { fields =>
[error] ^
[error] two errors found
[error] (compile:compile) Compilation failed
[error] Total time: 0 s, completed Mar 6, 2014 1:21:48 AM
There have been questions about this, but the answers did not work for Slick 2.0:
How to parametrize Scala Slick queries by WHERE clause conditions?
Slick doesn't have any information about C, so it doesn't know whether and how it should map it to a database value, or whether it can use === on it, so you get a type error. You have to use Scala's type system to restrict the type to one that Slick knows how to map. You can do this by providing a so-called context bound, in this case :BaseColumnType.
def whereEq[C: BaseColumnType](col: RichTable.this.type => Column[C], c: C) = {
  q.filter { fields =>
    col(fields) === c
  }
}
BaseColumnType is provided by Slick, and using it this way tells the Scala compiler to look for an implicit value of type BaseColumnType[C] in scope where you call whereEq, because at the call site it is usually known what C actually is. Slick comes with BaseColumnType[Int], BaseColumnType[String], etc., so when your C is really an Int or a String in a particular call, the compiler can find one and pass that information on to Slick.
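For example, at a hypothetical call site (query and column names invented for illustration) where the column is a Column[String], C is inferred as String and the compiler supplies Slick's implicit BaseColumnType[String]:

// inside the RichTable subclass, where QueryExt is visible and
// name is a Column[String] on this table:
val byName = QueryExt(someQuery).whereEq(_.name, "jilen")  // C = String, implicit found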
The same applies to LiuTiger's question: abstract class Crud[..., PK: BaseColumnType] should do the trick; a trait doesn't work with context bounds. When implementing an abstract DAO, be prepared to face a lot of challenges, get to the edges of your Scala type-system skills, and learn quite a bit about type inference order, implicit parameters, etc.