How to debug Akka HTTP's route DSL - Scala

I am attempting to create an unmarshaller for Akka HTTP, going from Avro to a custom case class. But it gives me a very vague error: "could not find implicit value". How can I debug this, or make Scala give me a hint about where the problem is?
I set up the route as such:
class MetricsRoute(implicit val system: ActorSystem, implicit val materializer: ActorMaterializer) {
  import system.dispatcher

  def getRoute() = {
    path("metrics") {
      put {
        decodeRequest {
          entity(as[Metrics]) { metrics: Metrics =>
            println(metrics.time)
            complete(HttpEntity(ContentTypes.`text/html(UTF-8)`, "<h1>hi!</h1>"))
          }
        }
      }
    }
  }
In the same class I also created the unmarshaller like this:
implicit def avroUnmarshaller(): FromRequestUnmarshaller[Metrics] =
  Unmarshaller.withMaterializer {
    implicit ex: ExecutionContext =>
      implicit mat: Materializer =>
        request: HttpRequest => {
          val inputStream = request.entity.dataBytes.runWith(
            StreamConverters.asInputStream(FiniteDuration(3, TimeUnit.SECONDS))
          )
          val reader = new SpecificDatumReader[AvroMetrics](classOf[AvroMetrics])
          val decoder: BinaryDecoder = DecoderFactory.get().binaryDecoder(inputStream, null)
          // AvroMetrics is a case class generated from the Avro schema
          val avroMetrics: AvroMetrics = AvroMetrics(0, 0, List())
          reader.read(avroMetrics, decoder)
          Future {
            // converts the Avro case class to the case class specific to my application
            convertMetrics(avroMetrics)
          }
        }
  }
But this gives me the very vague 'could not find implicit value' error:
[error] /mypath/MetricsRoute.scala:34: could not find implicit value for parameter um: akka.http.scaladsl.unmarshalling.FromRequestUnmarshaller[my.package.types.Metrics]
[error] entity(as[Metrics]) { metrics: Metrics =>
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
How do I go about debugging what is missing or what I did wrong?
Edit:
Worth noting: when I specify the unmarshaller myself, it does work, so changing
entity(as[Metrics]) { metrics: Metrics =>
to
entity(avroUnmarshaller) { metrics: Metrics =>
This seems to indicate the unmarshaller code itself isn't wrong, but that I've done something wrong with the types?

The compiler is looking for a FromRequestUnmarshaller[Metrics]; what you defined is of type () => FromRequestUnmarshaller[Metrics], and a def with an empty parameter list is not eligible as an implicit value of its result type.
Try defining your implicit with no empty parentheses, e.g.
implicit def avroUnmarshaller: FromRequestUnmarshaller[Metrics] = ???
instead of
implicit def avroUnmarshaller(): FromRequestUnmarshaller[Metrics] = ???
(also, it could be made a val, but that's not relevant to this issue)
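Applied to the question's code, the fixed definition looks like this (same body as in the question; only the parameter list is gone):
// With no parameter list, the def itself is eligible as an implicit value of
// type FromRequestUnmarshaller[Metrics], so entity(as[Metrics]) can resolve it.
implicit def avroUnmarshaller: FromRequestUnmarshaller[Metrics] =
  Unmarshaller.withMaterializer {
    implicit ex: ExecutionContext =>
      implicit mat: Materializer =>
        request: HttpRequest => {
          val inputStream = request.entity.dataBytes.runWith(
            StreamConverters.asInputStream(FiniteDuration(3, TimeUnit.SECONDS))
          )
          val reader = new SpecificDatumReader[AvroMetrics](classOf[AvroMetrics])
          val decoder: BinaryDecoder = DecoderFactory.get().binaryDecoder(inputStream, null)
          val avroMetrics: AvroMetrics = AvroMetrics(0, 0, List())
          reader.read(avroMetrics, decoder)
          Future {
            convertMetrics(avroMetrics)
          }
        }
  }
As for the debugging part of the question: compiling with the scalac option -Xlog-implicits makes the compiler report why candidate implicits were rejected, which usually points straight at mistakes like this one.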

Related

How to use TypeInformation in a generic method using Scala

I'm trying to create a generic method in Apache Flink to parse a DataSet[String] (JSON strings) using case classes. I tried to use TypeInformation as mentioned here: https://ci.apache.org/projects/flink/flink-docs-stable/dev/types_serialization.html#generic-methods
I'm using liftweb to parse the JSON string; this is my code:
import net.liftweb.json._
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._

class Loader(settings: Map[String, String])(implicit environment: ExecutionEnvironment) {
  val env: ExecutionEnvironment = environment

  def load[T: TypeInformation](): DataSet[T] = {
    val data: DataSet[String] = env.fromElements(
      """{"name": "name1"}""",
      """{"name": "name2"}"""
    )
    implicit val formats = DefaultFormats
    data.map(item => parse(item).extract[T])
  }
}
But I got the error:
No Manifest available for T
data.map(item => parse(item).extract[T])
Then I tried to add a Manifest and delete the TypeInformation like this:
def load[T: Manifest](): DataSet[T] = { ...
And I got this error:
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[T]
I'm very confused about this; I'd really appreciate your help.
Thanks.
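Since each attempt fails for lack of the other evidence parameter, one way out (a sketch, not from the original thread) is to require both context bounds, so the compiler gets a Manifest for liftweb's extract and a TypeInformation for Flink's map:
def load[T: TypeInformation: Manifest](): DataSet[T] = {
  val data: DataSet[String] = env.fromElements(
    """{"name": "name1"}""",
    """{"name": "name2"}"""
  )
  implicit val formats = DefaultFormats
  // extract[T] uses the Manifest; map uses the TypeInformation
  data.map(item => parse(item).extract[T])
}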

using datetime/timestamp in scala slick

Is there an easy way to use datetime/timestamp in Scala? What's best practice? I currently use "date" to persist data, but I'd also like to persist the current time.
I'm struggling to set the date. This is my code:
val now = new java.sql.Timestamp(new java.util.Date().getTime)
I also tried to do this:
val now = new java.sql.Date(new java.util.Date().getTime)
When changing the datatype in my evolutions to "timestamp", I got an error:
case class MyObjectModel(
  id: Option[Int],
  title: String,
  createdat: Timestamp,
  updatedat: Timestamp,
  ...)

object MyObjectModel {
  implicit val myObjectFormat = Json.format[MyObjectModel]
}
Console:
app\models\MyObjectModel.scala:31: No implicit format for
java.sql.Timestamp available.
[error] implicit val myObjectFormat = Json.format[MyObjectModel]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Update:
object ProcessStepTemplatesModel {
  implicit lazy val timestampFormat: Format[Timestamp] = new Format[Timestamp] {
    override def reads(json: JsValue): JsResult[Timestamp] = json.validate[Long].map(l => Timestamp.from(Instant.ofEpochMilli(l)))
    override def writes(o: Timestamp): JsValue = JsNumber(o.getTime)
  }

  implicit val processStepFormat = Json.format[ProcessStepTemplatesModel]
}
Try using this in your code:
implicit object timestampFormat extends Format[Timestamp] {
  val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")

  def reads(json: JsValue) = {
    val str = json.as[String]
    JsSuccess(new Timestamp(format.parse(str).getTime))
  }

  def writes(ts: Timestamp) = JsString(format.format(ts))
}
It is (de)serialized in a JS-compatible format like the following: "2018-01-06T18:31:29.436Z".
Please note: the implicit object must be declared in the code before it is used.
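Put together with the model from the question, that looks roughly like this (a sketch; imports added, format copied from above):
import java.sql.Timestamp
import java.text.SimpleDateFormat
import play.api.libs.json._

object MyObjectModel {
  // the Timestamp format is declared before Json.format so the macro can see it
  implicit object timestampFormat extends Format[Timestamp] {
    val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")
    def reads(json: JsValue) = JsSuccess(new Timestamp(format.parse(json.as[String]).getTime))
    def writes(ts: Timestamp) = JsString(format.format(ts))
  }

  implicit val myObjectFormat = Json.format[MyObjectModel]
}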
I guess your question is handled in What's the standard way to work with dates and times in Scala? Should I use Java types or there are native Scala alternatives?.
Go with Java 8 "java.time".
In the subject you mention Slick (a Scala database library), but the error you get comes from a JSON library: it says that you don't have a converter from java.sql.Timestamp to JSON. Without knowing which JSON library you are using, it's hard to help with a working example.
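If you do go with java.time, a sketch of a Format[Timestamp] that round-trips through Instant (this assumes Play JSON, which ships implicit Reads/Writes for Instant in recent versions):
import java.sql.Timestamp
import java.time.Instant
import play.api.libs.json._

implicit val timestampFormat: Format[Timestamp] = Format(
  implicitly[Reads[Instant]].map(Timestamp.from),    // JSON -> Instant -> Timestamp
  implicitly[Writes[Instant]].contramap(_.toInstant) // Timestamp -> Instant -> JSON
)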

500 Internal Server Error in Akka Scala server

This is my code for the server, written using the Akka framework:
case class Sentence(data: String)
case class RawTriples(triples: List[String])

trait Protocols extends DefaultJsonProtocol {
  implicit val sentenceRequestFormat = jsonFormat1(Sentence)
  implicit val rawTriplesFormat = jsonFormat1(RawTriples)
}

trait Service extends Protocols {
  implicit val system: ActorSystem
  implicit def executor: ExecutionContextExecutor
  implicit val materializer: Materializer

  val openie = new OpenIE

  def config: Config
  val logger: LoggingAdapter

  lazy val ipApiConnectionFlow: Flow[HttpRequest, HttpResponse, Any] =
    Http().outgoingConnection(config.getString("services.ip-api.host"), config.getInt("services.ip-api.port"))

  def ipApiRequest(request: HttpRequest): Future[HttpResponse] =
    Source.single(request).via(ipApiConnectionFlow).runWith(Sink.head)

  val routes = {
    logRequestResult("akka-http-microservice") {
      pathPrefix("openie") {
        post {
          decodeRequest {
            entity(as[Sentence]) { sentence =>
              complete {
                val rawTriples = openie.extract(sentence.data)
                val resp: MutableList[String] = MutableList()
                for (rtrip <- rawTriples) {
                  resp += rtrip.toString()
                }
                val response: List[String] = resp.toList
                println(response)
                response
              }
            }
          }
        }
      }
    }
  }
}

object AkkaHttpMicroservice extends App with Service {
  override implicit val system = ActorSystem()
  override implicit val executor = system.dispatcher
  override implicit val materializer = ActorMaterializer()
  override val config = ConfigFactory.load()
  override val logger = Logging(system, getClass)

  Http().bindAndHandle(routes, config.getString("http.interface"), config.getInt("http.port"))
}
The server accepts a POST request containing a sentence and returns a JSON array in response. It works fine, but when I make requests to it too frequently from parallelized code, it returns 500 Internal Server Error. I wanted to know whether there is a parameter I can set on the server to avoid that (number of threads ready to accept requests, etc.).
In log files, the error is logged as:
[ERROR] [05/31/2017 11:48:38.110]
[default-akka.actor.default-dispatcher-6]
[akka.actor.ActorSystemImpl(default)] Error during processing of
request: 'null'. Completing with 500 Internal Server Error response.
The doc on the bindAndHandle method shows what you want:
/**
* Convenience method which starts a new HTTP server at the given endpoint and uses the given `handler`
* [[akka.stream.scaladsl.Flow]] for processing all incoming connections.
*
* The number of concurrently accepted connections can be configured by overriding
* the `akka.http.server.max-connections` setting. Please see the documentation in the reference.conf for more
* information about what kind of guarantees to expect.
*
* To configure additional settings for a server started using this method,
* use the `akka.http.server` config section or pass in a [[akka.http.scaladsl.settings.ServerSettings]] explicitly.
*/
akka.http.server.max-connections is probably what you want. As the doc suggests, you can also dig deeper into the akka.http.server config section.
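For example, in application.conf (a sketch; 1024 is an arbitrary illustrative value, not a recommendation):
akka.http.server {
  max-connections = 1024
}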
Add the following to your application.conf file:
akka.http {
  server {
    server-header = akka-http/${akka.http.version}
    idle-timeout = infinite
    request-timeout = infinite
  }
}

Op-Rabbit with Spray-Json in Akka Http

I am trying to use the Op-Rabbit library to consume a RabbitMQ queue in an Akka HTTP project.
I want to use Spray-Json for the marshalling/unmarshalling.
import com.spingo.op_rabbit.SprayJsonSupport._
import com.spingo.op_rabbit.stream.RabbitSource
import com.spingo.op_rabbit.{Directives, RabbitControl}

object Boot extends App with Config with BootedCore with ApiService {
  this: ApiService with Core =>

  implicit val materializer = ActorMaterializer()
  Http().bindAndHandle(routes, httpInterface, httpPort)
  log.info("Http Server started")

  implicit val rabbitControl = system.actorOf(Props[RabbitControl])

  import Directives._

  RabbitSource(
    rabbitControl,
    channel(qos = 3),
    consume(queue(
      "such-queue",
      durable = true,
      exclusive = false,
      autoDelete = false)),
    body(as[User])).
    runForeach { user =>
      log.info(user)
    } // after each successful iteration the message is acknowledged
}
In a separate file:
case class User(id: Long,name: String)
object JsonFormat extends DefaultJsonProtocol {
implicit val format = jsonFormat2(User)
}
The error I am getting is:
could not find implicit value for parameter um: akka.http.scaladsl.unmarshalling.FromRequestUnmarshaller[*.*.models.User]
[error] body(as[User])). // marshalling is automatically hooked up using implicits
[error] ^
[error] could not find implicit value for parameter um: com.spingo.op_rabbit.RabbitUnmarshaller[*.*.models.User]
[error] body(as[User])
[error] ^
[error] two errors found
I'm not sure how to get the op-rabbit spray-json support working properly.
Thanks for any help.
Try to provide an implicit marshaller for your User class like they do for Int (in RabbitTestHelpers.scala):
implicit val simpleIntMarshaller = new RabbitMarshaller[Int] with RabbitUnmarshaller[Int] {
  val contentType = "text/plain"
  val contentEncoding = Some("UTF-8")

  def marshall(value: Int) =
    value.toString.getBytes

  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) = {
    new String(value).toInt
  }
}
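Adapting that Int example to the User class with spray-json could look like this (a sketch; it assumes the JsonFormat object from the question is in scope):
import spray.json._
import JsonFormat.format

implicit val userMarshaller = new RabbitMarshaller[User] with RabbitUnmarshaller[User] {
  val contentType = "application/json"
  val contentEncoding = Some("UTF-8")

  // User -> JSON bytes
  def marshall(value: User) =
    value.toJson.compactPrint.getBytes("UTF-8")

  // JSON bytes -> User
  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) =
    new String(value, "UTF-8").parseJson.convertTo[User]
}
Alternatively, since SprayJsonSupport is already imported, it may be enough to bring the implicit JsonFormat[User] into scope at the call site (import JsonFormat._), which SprayJsonSupport can then lift into a RabbitUnmarshaller.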

Scala specs2: mocking a trait method always throws a NullPointerException

I have a trait that I want to mock and use in another service during testing. The problem is that I receive a NullPointerException when I try to mock the return value of the indexDocument function.
Test method:
"createDemand must return None if writing to es fails" in new WithApplication {
val demandDraft = DemandDraft(UserId("1"), "socken bekleidung wolle", Location(Longitude(52.468562), Latitude(13.534212)), Distance(30), Price(25.0), Price(77.0))
val es = mock[ElasticsearchClient]
val sphere = mock[SphereClient]
val productTypes = mock[ProductTypes]
sphere.execute(any[ProductCreateCommand]) returns Future.successful(product)
productTypes.demand returns ProductTypeBuilder.of("demand", ProductTypeDrafts.demand).build()
// this line throws the nullpointer exception
es.indexDocument(any[IndexName], any[TypeName], any[JsValue]) returns Future.failed(new RuntimeException("test exception"))
val demandService = new DemandService(es, sphere, productTypes)
demandService.createDemand(demandDraft) must be (Option.empty[Demand]).await
}
Trait:
sealed trait ElasticsearchClient {
  implicit def convertListenableActionFutureToScalaFuture[T](x: ListenableActionFuture[T]): Future[T] = {
    val p = Promise[T]()
    x.addListener(new ActionListener[T] {
      def onFailure(e: Throwable) = p.failure(e)
      def onResponse(response: T) = p.success(response)
    })
    p.future
  }

  lazy val client = createElasticsearchClient()

  def close(): Unit

  def createElasticsearchClient(): Client

  def indexDocument(esIndex: IndexName, esType: TypeName, doc: JsValue): Future[IndexResponse] =
    client.prepareIndex(esIndex.value, esType.value).setSource(doc.toString()).execute()

  def search(esIndex: IndexName, esType: TypeName, query: QueryBuilder): Future[SearchResponse] =
    client.prepareSearch(esIndex.value).setTypes(esType.value).setQuery(query).execute()
}
Exception:
[error] NullPointerException: (DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$$anonfun$8.apply(DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$$anonfun$8.apply(DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2.delayedEndpoint$services$DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$1(DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$delayedInit$body.apply(DemandServiceSpec.scala:81)
[error] play.api.test.WithApplication$$anonfun$around$1.apply(Specs.scala:23)
[error] play.api.test.WithApplication$$anonfun$around$1.apply(Specs.scala:23)
[error] play.api.test.PlayRunners$class.running(Helpers.scala:49)
[error] play.api.test.Helpers$.running(Helpers.scala:403)
[error] play.api.test.WithApplication.around(Specs.scala:23)
[error] play.api.test.WithApplication.delayedInit(Specs.scala:20)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2.<init>(DemandServiceSpec.scala:81)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8.apply(DemandServiceSpec.scala:81)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8.apply(DemandServiceSpec.scala:81)
Please let me know if you need additional information.
I found out that the any[] matchers in the indexDocument call were the problem. When I replace them with actual values, it works:
"createDemand must return None if writing to es fails and deleteDemand should be called once with correct parameters" in new WithApplication {
val demandDraft = DemandDraft(UserId("1"), "socken bekleidung wolle", Location(Longitude(52.468562), Latitude(13.534212)), Distance(30), Price(25.0), Price(77.0))
val es = mock[ElasticsearchClient]
val sphere = mock[SphereClient]
val productTypes = mock[ProductTypes]
sphere.execute(any[ProductCreateCommand]) returns Future.successful(product)
sphere.execute(any[ProductDeleteByIdCommand]) returns Future.successful(product)
productTypes.demand returns ProductTypeBuilder.of("demand", ProductTypeDrafts.demand).build()
es.indexDocument(IndexName("demands"), TypeName("demands"), Json.toJson(demand)) returns Future.failed(new RuntimeException("test exception"))
val demandService = new DemandService(es, sphere, productTypes)
demandService.createDemand(demandDraft) must be (Option.empty[Demand]).await
}
I've had this happen a whole bunch and work around it by creating a class (rather than a trait) to feed to mock:
trait SomeTraitYouWantToMock {
  …
}

class MockableSomeTraitYouWantToMock extends SomeTraitYouWantToMock

val whatever = mock[MockableSomeTraitYouWantToMock]
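Applied to the question's trait, that could look like this (a sketch; note that ElasticsearchClient is sealed, so the stub class would have to live in the same source file as the trait):
// Stub class that makes the trait mockable; the abstract members get
// placeholder bodies because the mock overrides them anyway.
class MockableElasticsearchClient extends ElasticsearchClient {
  def close(): Unit = ()
  def createElasticsearchClient(): Client = null
}

// In the test:
val es = mock[MockableElasticsearchClient]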