I would like to have java.sql.Date and Option[java.sql.Date] as query parameters in my Play Scala project, which don't come with the Play framework by default. The Play version I'm using is 2.4.3. I have the following (rough) object.
import java.sql.Date
import org.joda.time.format.ISODateTimeFormat
import play.api.mvc.QueryStringBindable

object CustomBinders {

  val dateFormat = ISODateTimeFormat.date()

  implicit def dateBinder: QueryStringBindable[Date] = new QueryStringBindable[Date] {
    def bind(key: String, params: Map[String, Seq[String]]): Option[Either[String, Date]] = {
      // Returning None when the key is absent lets Play treat the parameter as missing
      params.get(key).flatMap(_.headOption).map { dateString =>
        try {
          Right(new Date(dateFormat.parseDateTime(dateString).getMillis))
        } catch {
          case e: IllegalArgumentException => Left(s"Cannot parse parameter '$key' as Date: '$dateString'")
        }
      }
    }

    def unbind(key: String, value: Date): String = {
      key + "=" + dateFormat.print(value.getTime)
    }
  }
}
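The Build.scala snippet below also imports an optionDateBinder that the post doesn't show. A minimal sketch of it (which would live inside CustomBinders alongside dateBinder, assuming it simply delegates to it; Play can also derive Option binders automatically once the base binder is in scope):

implicit def optionDateBinder: QueryStringBindable[Option[Date]] =
  new QueryStringBindable[Option[Date]] {
    def bind(key: String, params: Map[String, Seq[String]]): Option[Either[String, Option[Date]]] =
      Some(dateBinder.bind(key, params) match {
        case Some(Right(date)) => Right(Some(date))
        case Some(Left(error)) => Left(error)
        case None              => Right(None) // an absent parameter is valid for an Option
      })

    def unbind(key: String, value: Option[Date]): String =
      value.map(v => dateBinder.unbind(key, v)).getOrElse("")
  }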
Then in Build.scala I have
import play.sbt.routes.RoutesKeys

object Build extends Build {
  RoutesKeys.routesImport += "binders.CustomBinders.dateBinder"
  RoutesKeys.routesImport += "binders.CustomBinders.optionDateBinder"
However, if I define a query parameter with Option[Date], for example, I get an error:
No QueryString binder found for type Option[java.sql.Date]. Try to implement an implicit QueryStringBindable for this type.
So it obviously isn't in scope. How should I define the binders so that they are in scope? I can't find the 2.4 documentation for this, and the 2.5 documentation doesn't say anything about needing to add them to Build.scala.
So apparently Build.scala wasn't the right place, even though some documentation says to put it there. With the following in build.sbt:
routesImport += "binders.CustomBinders._"
the project compiles just fine. I fixed some faults in the original post's binder as well.
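For reference, a route using the binder could then look like this (path and controller name are hypothetical):

GET     /bars     controllers.Bars.list(from: Option[java.sql.Date])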
I'm looking for a way to retain a generic type at runtime in Scala 3. In Scala 2 there was TypeTag for this, but it has been removed, and the suggested alternative is to use macros (https://contributors.scala-lang.org/t/scala-3-and-reflection/3627).
The documentation, however, is somewhat cryptic...
This is what I'm trying to do:
Here's a macro implementation:
package macros

import scala.quoted._

object TestMacro {
  def getClassMacro[T](using Quotes)(using t: Type[T]): Expr[Class[T]] = '{
    classOf[${t}]
  }
}
Here's the macro definition and its use site:
import macros.TestMacro.getClassMacro

class TypedBox[T] {
  val staticClass: Class[T] = TypedBox.getStaticClass[T]
}

object TypedBox {
  inline def getStaticClass[T]: Class[T] = ${ getClassMacro[T] }
}
Test:
object Test {
  def main(args: Array[String]): Unit = {
    val stringBox = TypedBox[String]()
    println(stringBox.staticClass)
  }
}
I would expect this to resolve to val staticClass = classOf[String].
But this does not compile; I'm getting:
/workspace/macros-test/src/main/scala/macros/TestMacro.scala:7:13
t.Underlying is not a class type
classOf[${t}]
What am I missing?
I'm not really sure why, but I don't think you can reliably get an Expr[Class[T]] out of a macro (from what I understand, the Class may not exist yet at the time of macro execution).
Plus, a Class[T] does not retain the type parameters: classOf[Map[String, String]] == classOf[Map[Int, Int]], for instance.
If you don't care about those, I'd use a ClassTag instead of a TypeTag; ClassTag is still available in Scala 3, and no macros are needed.
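For instance, a minimal sketch of the TypedBox from the question rewritten with ClassTag (note the erasure caveat above):

import scala.reflect.ClassTag

// No macro needed: the compiler materializes the ClassTag at the call site.
class TypedBox[T](using ct: ClassTag[T]) {
  // runtimeClass is erased: TypedBox[Map[String, String]] and
  // TypedBox[Map[Int, Int]] both observe the same Map class.
  val staticClass: Class[_] = ct.runtimeClass
}

@main def demo(): Unit =
  println(TypedBox[String]().staticClass) // prints: class java.lang.String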
By the way, in macros, you can write something like the following to get an Expr[ClassTag[T]]:
import scala.quoted._
import scala.reflect.ClassTag

private def getClassTag[T](using Type[T], Quotes): Expr[ClassTag[T]] = {
  import quotes.reflect._
  Expr.summon[ClassTag[T]] match {
    case Some(ct) =>
      ct
    case None =>
      report.error(
        s"Unable to find a ClassTag for type ${Type.show[T]}",
        Position.ofMacroExpansion
      )
      throw new Exception("Error when applying macro")
  }
}
Finally, you might find some useful things at https://github.com/gaeljw/typetrees/blob/main/src/main/scala/io/github/gaeljw/typetrees/TypeTreeTagMacros.scala#L8 (disclaimer: I wrote it for personal projects).
I am trying to write a custom codec to convert Cassandra columns of type timestamp to org.joda.time.DateTime.
I am building my project with sbt version 0.13.13.
I wrote a test that serializes and deserializes a DateTime object. When I run the test via the command line with sbt "test:testOnly *DateTimeCodecTest", the project builds and the test passes.
However, if I try to build the project inside IntelliJ, I receive the following error:
Error:(17, 22) overloaded method constructor TypeCodec with alternatives:
(x$1: com.datastax.driver.core.DataType,x$2: shade.com.datastax.spark.connector.google.common.reflect.TypeToken[org.joda.time.DateTime])com.datastax.driver.core.TypeCodec[org.joda.time.DateTime] <and>
(x$1: com.datastax.driver.core.DataType,x$2: Class[org.joda.time.DateTime])com.datastax.driver.core.TypeCodec[org.joda.time.DateTime]
cannot be applied to (com.datastax.driver.core.DataType, com.google.common.reflect.TypeToken[org.joda.time.DateTime])
object DateTimeCodec extends TypeCodec[DateTime](DataType.timestamp(), TypeToken.of(classOf[DateTime]).wrap()) {
Here is the codec:
import java.nio.ByteBuffer

import com.datastax.driver.core.exceptions.InvalidTypeException
import com.datastax.driver.core.{ DataType, ProtocolVersion, TypeCodec }
import com.google.common.reflect.TypeToken
import org.joda.time.{ DateTime, DateTimeZone }

/**
 * Provides serialization between Cassandra types and org.joda.time.DateTime
 *
 * Reference for writing custom codecs in Scala:
 * https://www.datastax.com/dev/blog/writing-scala-codecs-for-the-java-driver
 */
object DateTimeCodec extends TypeCodec[DateTime](DataType.timestamp(), TypeToken.of(classOf[DateTime]).wrap()) {

  override def serialize(value: DateTime, protocolVersion: ProtocolVersion): ByteBuffer = {
    if (value == null) return null
    val millis: Long = value.getMillis
    TypeCodec.bigint().serializeNoBoxing(millis, protocolVersion)
  }

  override def deserialize(bytes: ByteBuffer, protocolVersion: ProtocolVersion): DateTime = {
    val millis: Long = TypeCodec.bigint().deserializeNoBoxing(bytes, protocolVersion)
    new DateTime(millis).withZone(DateTimeZone.UTC)
  }

  // Do we need a formatter?
  override def format(value: DateTime): String = value.getMillis.toString

  // Do we need a formatter?
  override def parse(value: String): DateTime = {
    try {
      if (value == null ||
        value.isEmpty ||
        value.equalsIgnoreCase("NULL")) throw new Exception("Cannot produce a DateTime object from empty value")
      // Do we need a formatter?
      else new DateTime(value)
    } catch {
      // TODO: Determine the more specific exception that would be thrown in this case
      case e: Exception =>
        throw new InvalidTypeException(s"""Cannot parse DateTime from "$value"""", e)
    }
  }
}
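For context, a codec like this only takes effect once it's registered with the driver. A sketch, assuming a Java driver 3.x Cluster (the contact point is a placeholder):

import com.datastax.driver.core.Cluster

// Register the codec so the driver maps timestamp columns to DateTime.
val cluster = Cluster.builder().addContactPoint("127.0.0.1").build()
cluster.getConfiguration.getCodecRegistry.register(DateTimeCodec)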
and here is the test:
import com.datastax.driver.core.ProtocolVersion
import org.joda.time.{ DateTime, DateTimeZone }
import org.scalatest.FunSpec

class DateTimeCodecTest extends FunSpec {

  describe("Serialization") {
    it("should serialize between Cassandra types and org.joda.time.DateTime") {
      val now = new DateTime().withZone(DateTimeZone.UTC)
      val result = DateTimeCodec.deserialize(
        // TODO: Verify correct ProtocolVersion for DSE 5.0
        DateTimeCodec.serialize(now, ProtocolVersion.V4), ProtocolVersion.V4
      )
      assertResult(now)(result)
    }
  }
}
I make extensive use of the debugger within IntelliJ, as well as the ability to quickly run a single test with hotkeys. Losing the ability to compile within the IDE is almost as bad as losing the ability to compile at all. Any help would be appreciated, and I'm more than happy to provide additional information about my project/environment if anyone needs it.
Edit, update:
The project compiles within IntelliJ if I provide an instance of com.google.common.reflect.TypeToken as opposed to shade.com.datastax.spark.connector.google.common.reflect.TypeToken.
However, this breaks the build within sbt.
You must create a default constructor for DateTimeCodec.
I resolved the issue. It stemmed from conflicting versions of spark-cassandra-connector on the classpath: both shaded and unshaded versions of the dependency were present, and removing the shaded one fixed the problem.
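For illustration, the fix amounts to keeping a single variant of the connector in build.sbt (the artifact name and version below are hypothetical; check your own dependency tree for the actual coordinates):

// Keep exactly one variant of the connector on the classpath, e.g. the
// unshaded one that depends on the plain Guava TypeToken...
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0"

// ...and remove whichever dependency pulls in the shaded variant that
// relocates Guava under shade.com.datastax.spark.connector.google.common.reflect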
I'm trying to properly stub the EhCache API used by the Play Framework, in particular its getOrElse function, which has this signature:
def getOrElse[A: ClassTag](key: String, expiration: Duration)(orElse: => A)
Within my Specs2 code, I have:
val mockCache = mock[EhCacheApi]
mockCache.getOrElse[???](anyString, anyObject[Duration])(???) returns
  [Object I'd like returned]
The question is whether it's possible to use matchers for the ??? parts, especially for the curried argument list.
The return type of the CacheApi function should be Future[Seq[Object]].
Public git repo link: Github
This works:
import scala.concurrent.Future
import scala.concurrent.duration._

import org.specs2.concurrent.ExecutionEnv
import org.specs2.mock.Mockito
import play.api.cache.CacheApi
import play.api.test.PlaySpecification

class VariationAssignmentSpec(implicit ee: ExecutionEnv) extends PlaySpecification with Mockito {

  case class Variation(id: Option[Long] = None)

  lazy val v1 = Variation(Option(1L))
  lazy val v2 = Variation(Option(2L))

  "Cache#getOrElse" should {
    "return correct result" in {
      val mockCache = mock[CacheApi]
      mockCache.getOrElse[Future[Seq[Variation]]](anyString, any[Duration])(any)(any) returns
        Future(Seq(v1, v2))

      val resultFuture: Future[Seq[Variation]] =
        mockCache.getOrElse("cache.key", 10.seconds)(Future(Seq(v1, v2)))

      resultFuture must equalTo(Seq(v1, v2)).await
    }
  }
}
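Note the two trailing (any)(any) matchers: the first stands in for the by-name orElse argument list, and the second for the implicit ClassTag evidence that the A: ClassTag context bound desugars to.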
Hello all :) I'm about 16 hours new to Scala and the Play Framework (version 2.1). I'm following this Play 2.0 tutorial with Anorm, which uses Jerkson. From what I understand, in 2.1 you can serialize to JSON out of the box, as long as you have the right JSON formatters.
So here is the JSON service:
def listBars() = Action {
  val bars = Bar.findAll()
  Ok(Json.toJson(bars)).as("application/json")
}
And here is Bar.scala:
import anorm.Pk
import play.api.libs.json._

case class Bar(id: Pk[Long], name: String)

object Bar {
  implicit val anormLongPkFormat = new Format[Pk[Long]] {
    def writes(key: Pk[Long]): JsValue = Json.toJson(key.toString)
    def reads(jv: JsValue): JsResult[Pk[Long]] = JsSuccess( -?- )
  }

  implicit val barFormat = Json.format[Bar]

  def findAll(): Seq[Bar] = {...}
}
I'm using Json.format[Bar], but it tells me it needs another formatter for anorm.Pk[Long]. I don't need the reads method for the moment; I only want to serve the values, but the compiler requires one. I'm totally at a loss as to how to make it compile, let alone how to write a good reads.
Best regards
If you don't need reads now, then the easiest way is not to implement its logic and just return an error:
def reads(jv: JsValue): JsResult[Pk[Long]] = JsError()
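If you later need a working reads, a minimal sketch, assuming anorm's Id and NotAssigned constructors are in scope, could look like this (since writes above serializes the key via toString, the string case mirrors it):

def reads(jv: JsValue): JsResult[Pk[Long]] = jv match {
  case JsNumber(n) => JsSuccess(Id(n.toLong))
  case JsString(s) =>
    try JsSuccess(Id(s.toLong))
    catch { case _: NumberFormatException => JsError("Cannot parse " + s + " as a Long id") }
  case JsNull => JsSuccess(NotAssigned) // no primary key assigned yet
  case _ => JsError("Expected a number or string for Pk[Long]")
}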
I'm new to the Play Framework and Scala, and I'm trying to inject a dependency into a companion object.
I have a simple case class, like:
case class Bar(foo: Int)
With a companion object like:
object Bar {
  val myDependency =
    if (isTest) {
      // Mock
    } else {
      // Actual implementation
    }

  val form = Form(mapping(
    "foo" -> number(0, 100).verifying(foo => myDependency.validate(foo))
  )(Bar.apply)(Bar.unapply))
}
This works fine, but it's not really a clean way to do it. I'd like to be able to inject the dependency at build time so that I can inject different mock objects when testing and different real implementations in development and production.
What's the best way to achieve this?
Any help is really appreciated. Thanks!
Along the lines of the Cake pattern, we can change your example to:
trait Validator {
  def validate(foo: Int): Boolean
}

trait TestValidation {
  val validator = new Validator {
    def validate(foo: Int): Boolean = ...
  }
}

trait ImplValidation {
  val validator = new Validator {
    def validate(foo: Int): Boolean = ...
  }
}

trait BarBehavior {
  def validator: Validator

  val form = Form(mapping(...))(Bar.apply)(Bar.unapply)
}

// use this in your tests
object TestBar extends BarBehavior with TestValidation

// use this in production
object ImplBar extends BarBehavior with ImplValidation
You should additionally test whether this example fits well within the Play Framework, too.
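For completeness, here's a sketch of how BarBehavior might wire the injected validator into the form from the question (the mapping simply mirrors the original post):

import play.api.data.Form
import play.api.data.Forms._

trait BarBehavior {
  // Implemented by TestValidation or ImplValidation when mixed in
  def validator: Validator

  val form = Form(mapping(
    "foo" -> number(0, 100).verifying(foo => validator.validate(foo))
  )(Bar.apply)(Bar.unapply))
}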