Scalatra JSON guide code not compiling

I copied the code from the JSON formatting guide found here: http://www.scalatra.org/2.2/guides/formats/json.html
Below is my MyScalatraServlet.scala file with all of the code I added to try out JSON formatting:
package net.example.testapp

import org.scalatra._
import scalate.ScalateSupport

// JSON-related libraries
import org.json4s.{DefaultFormats, Formats}

// JSON handling support from Scalatra
import org.scalatra.json._

class MyScalatraServlet extends TestAppStack with JacksonJsonSupport {
  get("/") {
    FlowerData.all
  }
}

// Sets up automatic case class to JSON output serialization, required by
// the JValueResult trait.
protected implicit val jsonFormats: Formats = DefaultFormats

case class Flower(slug: String, name: String)

object FlowerData {
  /**
   * Some fake flowers data so we can simulate retrievals.
   */
  var all = List(
    Flower("yellow-tulip", "Yellow Tulip"),
    Flower("red-rose", "Red Rose"),
    Flower("black-rose", "Black Rose"))
}
It seems the compiler does not like the following line:
protected implicit val jsonFormats: Formats = DefaultFormats
Here's the error message:
[error] /Users/test/test-app/src/main/scala/net/example/testapp/MyScalatraServlet.scala:22: expected start of definition
[error] protected implicit val jsonFormats: Formats = DefaultFormats
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
[error] Total time: 2 s, completed Jun 17, 2013 4:04:34 PM

Your val jsonFormats is floating at the top level of the file, not tied to anything. A val must be defined inside another construct such as a trait, class, or object. Try moving it inside the servlet class, just before the call to get("/").
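For reference, a minimal sketch of the servlet with the val moved inside the class body, following the layout in the Scalatra guide (the before() block setting the content type is also taken from the guide):

import org.scalatra._
import org.scalatra.json._
import org.json4s.{DefaultFormats, Formats}

class MyScalatraServlet extends TestAppStack with JacksonJsonSupport {
  // Sets up automatic case class to JSON output serialization, required by
  // the JValueResult trait. Now a member of the servlet class, so it compiles.
  protected implicit val jsonFormats: Formats = DefaultFormats

  // Serialize responses as JSON before each action.
  before() {
    contentType = formats("json")
  }

  get("/") {
    FlowerData.all
  }
}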

Related

Cannot find an implicit value for ContextShift

I am trying to create a webapp with http4s that is based on Http4sServlet.
The following code does not compile:
import cats.effect._
import org.http4s.servlet.BlockingServletIo
import org.http4s.servlet.Http4sServlet
import scala.concurrent.ExecutionContext.global
import org.http4s.implicits._

class UserSvcServlet
  extends Http4sServlet[IO](
    service = UserSvcServer.start,
    servletIo = BlockingServletIo(4096, Blocker.liftExecutionContext(global)))(IOApp)
the error message:
[error] /home/developer/scala/user-svc/src/main/scala/io/databaker/UserSvcServlet.scala:12:54: Cannot find implicit value for ConcurrentEffect[[+A]cats.effect.IO[A]].
[error] Building this implicit value might depend on having an implicit
[error] s.c.ExecutionContext in scope, a Scheduler, a ContextShift[[+A]cats.effect.IO[A]]
[error] or some equivalent type.
[error] extends Http4sServlet[IO]( service = UserSvcServer.stream
[error] ^
[error] /home/developer/scala/user-svc/src/main/scala/io/databaker/UserSvcServlet.scala:13:36: Cannot find an implicit value for ContextShift[[+A]cats.effect.IO[A]]:
[error] * import ContextShift[[+A]cats.effect.IO[A]] from your effects library
[error] * if using IO, use cats.effect.IOApp or build one with cats.effect.IO.contextShift
[error] , servletIo = BlockingServletIo(4096, Blocker.liftExecutionContext(global)))
[error] ^
[error] two errors found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed May 29, 2020, 8:45:00 PM
The UserSvcServer is implemented as follows:
import org.http4s.HttpApp
import cats.effect.{ConcurrentEffect, ContextShift, Timer}
import org.http4s.implicits._
import org.http4s.server.middleware.Logger

object UserSvcServer {
  def start[F[_]: ConcurrentEffect](implicit T: Timer[F], C: ContextShift[F]): HttpApp[F] = {
    val helloWorldAlg = HelloWorld.impl[F]
    val httpApp = UserSvcRoutes.helloWorldRoutes[F](helloWorldAlg).orNotFound
    Logger.httpApp(true, true)(httpApp)
  }
}
How can I import ContextShift implicitly?
ContextShift is just cats-effect's wrapper over an ExecutionContext. You can create one explicitly, as stated in the docs:
implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)
You usually create one ContextShift at the entry point of your app (probably in the main method).
If your app uses IOApp from cats-effect, an implicit ContextShift is already in scope. It uses an execution context whose thread count equals the number of available processors on your machine.
If you want to use the created ContextShift "deeper" inside your application, you can pass it as an implicit parameter:
def doSomething(implicit cs: ContextShift[IO]): IO[Unit] = ???
So in order to make your code work, you need to make sure that the method or class that calls the UserSvcServlet constructor has an implicit ContextShift in scope:
(implicit cs: ContextShift[IO])
You could also put it in a separate object:
object AppContextShift {
  implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)
  implicit val t: Timer[IO] = IO.timer(ExecutionContext.global) // you will probably also need a Timer eventually
}
Then you can import it when contextShift is needed:
import AppContextShift._
By the way, using the global execution context for a Blocker is not a good idea.
Blocker is meant for blocking operations, and backing it with ExecutionContext.global might lead to thread starvation in your app.
The most common approach is to use a Blocker created from a cached thread pool:
Blocker.liftExecutorService(Executors.newCachedThreadPool())
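Putting these pieces together, a minimal sketch of what the servlet wiring might look like (the AppContextShift object and the dedicated blocker pool are illustrative, and the exact Http4sServlet constructor depends on your http4s version):

import java.util.concurrent.Executors
import scala.concurrent.ExecutionContext
import cats.effect.{Blocker, ContextShift, IO, Timer}
import org.http4s.servlet.{BlockingServletIo, Http4sServlet}

object AppContextShift {
  implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)
  implicit val t: Timer[IO] = IO.timer(ExecutionContext.global)
  // Dedicated pool for blocking servlet I/O, so the global pool is not starved.
  val blocker: Blocker = Blocker.liftExecutorService(Executors.newCachedThreadPool())
}

import AppContextShift._

// With cs and t in scope, ConcurrentEffect[IO], Timer[IO], and ContextShift[IO]
// can all be resolved, so both the servlet constructor and UserSvcServer.start compile.
class UserSvcServlet
  extends Http4sServlet[IO](
    service = UserSvcServer.start[IO],
    servletIo = BlockingServletIo(4096, AppContextShift.blocker))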

using datetime/timestamp in scala slick

Is there an easy way to use datetime/timestamp in Scala? What's the best practice? I currently use "date" to persist data, but I'd also like to persist the current time.
I'm struggling to set the date. This is my code:
val now = new java.sql.Timestamp(new java.util.Date().getTime)
I also tried to do this:
val now = new java.sql.Date(new java.util.Date().getTime)
When changing the datatype in my evolutions to "timestamp", I got an error:
case class MyObjectModel(
  id: Option[Int],
  title: String,
  createdat: Timestamp,
  updatedat: Timestamp,
  ...)

object MyObjectModel {
  implicit val myObjectFormat = Json.format[MyObjectModel]
}
Console:
app\models\MyObjectModel.scala:31: No implicit format for
java.sql.Timestamp available.
[error] implicit val myObjectFormat = Json.format[MyObjectModel]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Update:
object ProcessStepTemplatesModel {
  implicit lazy val timestampFormat: Format[Timestamp] = new Format[Timestamp] {
    override def reads(json: JsValue): JsResult[Timestamp] = json.validate[Long].map(l => Timestamp.from(Instant.ofEpochMilli(l)))
    override def writes(o: Timestamp): JsValue = JsNumber(o.getTime)
  }
  implicit val processStepFormat = Json.format[ProcessStepTemplatesModel]
}
Try using this in your code:
implicit object timestampFormat extends Format[Timestamp] {
  val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")
  def reads(json: JsValue) = {
    val str = json.as[String]
    JsSuccess(new Timestamp(format.parse(str).getTime))
  }
  def writes(ts: Timestamp) = JsString(format.format(ts))
}
It is (de)serialized in a JS-compatible format like the following: "2018-01-06T18:31:29.436Z".
Please note: the implicit object must be declared in the code before it is used.
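For instance, a self-contained usage sketch, assuming Play JSON and a trimmed version of the MyObjectModel case class from the question (without the elided fields):

import java.sql.Timestamp
import java.text.SimpleDateFormat
import play.api.libs.json._

object TimestampJsonExample {
  // Declared before Json.format below, so macro derivation can find it.
  implicit object timestampFormat extends Format[Timestamp] {
    val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")
    def reads(json: JsValue) = JsSuccess(new Timestamp(format.parse(json.as[String]).getTime))
    def writes(ts: Timestamp) = JsString(format.format(ts))
  }

  case class MyObjectModel(id: Option[Int], title: String, createdat: Timestamp, updatedat: Timestamp)
  implicit val myObjectFormat: OFormat[MyObjectModel] = Json.format[MyObjectModel]

  def main(args: Array[String]): Unit = {
    val now = new Timestamp(System.currentTimeMillis())
    // Round-trips the model through JSON using the Timestamp format above.
    println(Json.toJson(MyObjectModel(Some(1), "demo", now, now)))
  }
}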
I guess your question is handled in What's the standard way to work with dates and times in Scala? Should I use Java types or there are native Scala alternatives?.
Go with Java 8 "java.time".
In the subject you mention Slick (a Scala database library), but the error you got comes from a JSON library: it says that you don't have a converter from java.sql.Timestamp to JSON. Without knowing which JSON library you are using, it's hard to give a working example.

found java.util.Date but required java.sql.Date?

I'm trying to create a function to check whether a string is a date. However, the following function produces the error below.
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.sql._
import scala.util.{Success, Try}

def validateDate(date: String): Boolean = {
  val df = new java.text.SimpleDateFormat("yyyyMMdd")
  val test = Try[Date](df.parse(date))
  test match {
    case Success(_) => true
    case _ => false
  }
}
Error:
[error] C:\Users\user1\IdeaProjects\sqlServer\src\main\scala\main.scala:14: type mismatch;
[error] found : java.util.Date
[error] required: java.sql.Date
[error] val test = Try[Date](df.parse(date))
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 17, 2017 1:19:33 PM
Is there a simpler way to validate whether a string is a date without creating a function?
The function is used to validate the command line argument.
if (args.length != 2 || validateDate(args(0))) { .... }
In Try[Date](df.parse(date)) you are not interested in the type, because you ignore it, so simply omit the type parameter: Try(df.parse(date)).
Your function could also be shorter: use Try(df.parse(date)).isSuccess instead of pattern matching.
If your environment has Java 8, always prefer the java.time package.
import scala.util.Try
import java.time.LocalDate
import java.time.format.DateTimeFormatter

// Move creation of the formatter out of the function to reduce short-lived object allocation.
val df = DateTimeFormatter.ofPattern("yyyy MM dd")
def datebleStr(s: String): Boolean = Try(LocalDate.parse(s, df)).isSuccess
Use this: import java.util.Date. Your wildcard import java.sql._ makes the bare name Date refer to java.sql.Date, while SimpleDateFormat.parse returns a java.util.Date, hence the mismatch.
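Putting the suggestions together, a minimal self-contained sketch (the yyyyMMdd pattern comes from the question; the negated guard is an assumption about the intended command-line check):

import scala.util.Try
import java.time.LocalDate
import java.time.format.DateTimeFormatter

object ValidateDateExample {
  // Created once, outside the function, as suggested above.
  private val df = DateTimeFormatter.ofPattern("yyyyMMdd")

  def validateDate(s: String): Boolean = Try(LocalDate.parse(s, df)).isSuccess

  def main(args: Array[String]): Unit = {
    // Guard clause: bail out unless exactly two args are given and the first parses as a date.
    if (args.length != 2 || !validateDate(args(0))) {
      sys.error("usage: <yyyyMMdd date> <other-arg>")
    }
  }
}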

ReactiveMongo with Play: method gridFSBodyParser in trait MongoController is deprecated: Use `gridFSBodyParser` with `Future[GridFS]`

I have the following code in my controller in PlayFramework:
import reactivemongo.api.gridfs.GridFS

private val gridFS = reactiveMongoApi.gridFS

def uploadLogo(companyId: String) = Action.async(gridFSBodyParser(gridFS)) { request =>
  ...
  ...
}
It gives me the following error:
method gridFSBodyParser in trait MongoController is deprecated: Use gridFSBodyParser with Future[GridFS]
I tried changing the definition private val gridFS = reactiveMongoApi.gridFS to private val gridFS = reactiveMongoApi.asyncGridFS, but then I get the error below:
[error] found : scala.concurrent.Future[reactivemongo.api.gridfs.GridFS[reactivemongo.play.json.JSONSerializationPack.type]]
[error] required: reactivemongo.api.gridfs.GridFS[reactivemongo.play.json.JSONSerializationPack.type]
What am I missing?
Thanks in advance!
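A hedged sketch of the direction the deprecation message points to, assuming a play2-reactivemongo version whose non-deprecated gridFSBodyParser overload accepts Future[GridFS] (names follow the question; the exact signatures depend on your version): keep the async variant, pass the Future to the body parser, and resolve it inside the action wherever a plain GridFS is needed.

import scala.concurrent.Future
import reactivemongo.api.gridfs.GridFS
import reactivemongo.play.json.JSONSerializationPack

// A Future[GridFS], as the non-deprecated body parser overload expects.
private def gridFS: Future[GridFS[JSONSerializationPack.type]] =
  reactiveMongoApi.asyncGridFS

def uploadLogo(companyId: String) = Action.async(gridFSBodyParser(gridFS)) { request =>
  gridFS.map { gfs =>
    // gfs here is a resolved GridFS, so code requiring the plain type goes inside this block.
    Ok(s"received ${request.body.files.size} file(s) for $companyId")
  }
}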

Op-Rabbit with Spray-Json in Akka Http

I am trying to use the Op-Rabbit library to consume a RabbitMQ queue in an Akka HTTP project.
I want to use Spray-Json for the marshalling/unmarshalling.
import com.spingo.op_rabbit.SprayJsonSupport._
import com.spingo.op_rabbit.stream.RabbitSource
import com.spingo.op_rabbit.{Directives, RabbitControl}

object Boot extends App with Config with BootedCore with ApiService {
  this: ApiService with Core =>

  implicit val materializer = ActorMaterializer()
  Http().bindAndHandle(routes, httpInterface, httpPort)
  log.info("Http Server started")

  implicit val rabbitControl = system.actorOf(Props[RabbitControl])

  import Directives._

  RabbitSource(
    rabbitControl,
    channel(qos = 3),
    consume(queue(
      "such-queue",
      durable = true,
      exclusive = false,
      autoDelete = false)),
    body(as[User])).
    runForeach { user =>
      log.info(user)
    } // after each successful iteration the message is acknowledged
}
In a separate file:
case class User(id: Long, name: String)

object JsonFormat extends DefaultJsonProtocol {
  implicit val format = jsonFormat2(User)
}
The error I am getting is:
could not find implicit value for parameter um: akka.http.scaladsl.unmarshalling.FromRequestUnmarshaller[*.*.models.User]
[error] body(as[User])). // marshalling is automatically hooked up using implicits
[error] ^
[error]could not find implicit value for parameter um: com.spingo.op_rabbit.RabbitUnmarshaller[*.*.models.User]
[error] body(as[User])
[error] ^
[error] two errors found
I'm not sure how to get the op-rabbit spray-json support working properly.
Thanks for any help.
Try to provide an implicit marshaller for your User class, like they do for Int (in RabbitTestHelpers.scala):
implicit val simpleIntMarshaller = new RabbitMarshaller[Int] with RabbitUnmarshaller[Int] {
  val contentType = "text/plain"
  val contentEncoding = Some("UTF-8")

  def marshall(value: Int) =
    value.toString.getBytes

  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) = {
    new String(value).toInt
  }
}
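By analogy, a sketch of the same pattern for the User class from the question, mirroring the member shape of the Int example above (the application/json content type and the round-trip through the spray-json format are assumptions):

import com.spingo.op_rabbit.{RabbitMarshaller, RabbitUnmarshaller}
import spray.json._

implicit val userMarshaller = new RabbitMarshaller[User] with RabbitUnmarshaller[User] {
  import JsonFormat._ // the spray-json format for User from the question

  val contentType = "application/json"
  val contentEncoding = Some("UTF-8")

  def marshall(value: User) =
    value.toJson.compactPrint.getBytes("UTF-8")

  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) = {
    new String(value, charset.getOrElse("UTF-8")).parseJson.convertTo[User]
  }
}

Alternatively, since com.spingo.op_rabbit.SprayJsonSupport._ is already imported, it may be enough to also import JsonFormat._ at the call site, so the implicit JsonFormat[User] is in scope and the support trait can derive the (un)marshallers from it.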