I need to make requests to several different URLs, get data from their responses, and combine it all into one list, but I'm confused about how to do this.
For a single request I do:
def doRequest: Future[WSResponse] = {
  client
    .url("MY_URL")
    .withRequestTimeout(5000)
    .get()
}
Then I parse the JSON in the response into a list of my objects:
def list: Future[List[FoobarEntry]] = {
  doRequest.map { response =>
    val json = response.json \ "foobar"
    json.validate[List[FoobarEntry]] match {
      case js: JsSuccess[List[FoobarEntry]] =>
        js.get
      case e: JsError =>
        Logger.error(JsError.toFlatJson(e).toString())
        List()
    }
  }
}
I think that for several URLs I should write something like:
def doRequests: List[Future[WSResponse]] = {
  List(
    client
      .url("URL_1")
      .withRequestTimeout(5000)
      .get(),
    client
      .url("URL_2")
      .withRequestTimeout(5000)
      .get()
  )
}
But how do I parse this List[Future[WSResponse]] into a Future[List[FoobarEntry]], like my def list does for a single request?
Since you put the future responses inside a list, you will have to map each future response with the parsing logic that turns it into a List[FoobarEntry]. Like so:
val responseFutures: List[Future[WSResponse]] = ???

val foobarFutures: List[Future[List[FoobarEntry]]] =
  responseFutures.map(future => future.map(response => parse(response)))
Now you have a list of future parsed responses, but to do something when all of them have arrived you will need to sequence that list:
val futureFoobars: Future[List[List[FoobarEntry]]] = Future.sequence(foobarFutures)
So, sequence helps you get from C[Future[A]] to Future[C[A]].
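To end up with the Future[List[FoobarEntry]] shape of your list method, flatten the nested lists at the end. A minimal sketch, where parse is the validation logic you already have for a single response:
val allEntries: Future[List[FoobarEntry]] =
  Future.sequence(foobarFutures).map(_.flatten)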
Use for-comprehensions.
val request1 = client.url("URL_1").withRequestTimeout(5000).get()
val request2 = client.url("URL_2").withRequestTimeout(5000).get()

val result: Future[List[FoobarEntry]] = for {
  res1: WSResponse <- request1
  res2: WSResponse <- request2
} yield List(res1, res2).map(parse).flatten
def parse(response: WSResponse): List[FoobarEntry] = {
  val json = response.json \ "foobar"
  json.validate[List[FoobarEntry]] match {
    case js: JsSuccess[List[FoobarEntry]] =>
      js.get
    case e: JsError =>
      Logger.error(JsError.toFlatJson(e).toString())
      List()
  }
}
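If the URLs come in a list of arbitrary length rather than a fixed pair, the same parse function also combines with Future.traverse. A sketch only, where urls is a hypothetical List[String] and client is the WS client from the question:
def listAll(urls: List[String]): Future[List[FoobarEntry]] =
  Future.traverse(urls) { url =>
    // one request per URL, parsed with the parse function above
    client.url(url).withRequestTimeout(5000).get().map(parse)
  }.map(_.flatten)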
How can we implement unit test cases for an AWS Lambda serverless function?
My code is:
object Test1 extends RequestHandler[APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent] with ResponseObjProcess {
override def handleRequest(input: APIGatewayProxyRequestEvent, context: Context): APIGatewayProxyResponseEvent = {
var response = new APIGatewayProxyResponseEvent()
val gson = new Gson
val requestHttpMethod = input.getHttpMethod
val requestBody = input.getBody
val requestHeaders = input.getHeaders
val requestPath = input.getPath
val requestPathParameters = input.getPathParameters
val requestQueryStringParameters = input.getQueryStringParameters
val parsedBody = JSON.parseFull(requestBody).getOrElse(0).asInstanceOf[Map[String, String]]
println(" parsedBody is:: " + parsedBody)
val active = parsedBody.get("active").getOrElse("false")
val created = parsedBody.get("created").getOrElse("0").toLong
val updated = parsedBody.get("updated").getOrElse("0").toLong
requestHttpMethod match {
case "PUT" =>
println(" PUT Request method ")
// insertRecords("alert_summary_report", requestBody)
response.setStatusCode(200)
response.setBody(gson.toJson("PUT"))
case _ =>
println("")
response.setStatusCode(400)
response.setBody(gson.toJson("None"))
}
response
}
}
And I tried to implement unit test cases for the above code. Here is my test:
test("testing record success case") {
var request = new APIGatewayProxyRequestEvent();
request.setHttpMethod(Constants.PUTREQUESTMETHOD)
DELETEREQUESTBODY.put("id", "")
request.setBody(EMPTYREQUESTBODY)
request.setPathParameters(DELETEREQUESTBODY)
println(s"body = ${request.getBody}")
println(s"headers = ${request.getHeaders}")
val response = ProxyRequestMain.handleRequest(subject, testContext)
val assertEqual = response.getStatusCode.equals(200)
assertEqual
}
Actually, I'm getting response.getStatusCode = 400 (Bad Request), but the test case still passes. How can I handle this?
I am looking at your test code and it's not clear to me what you are trying to achieve with your assertions. I think you might have mixed up quite a few things. In the code as it currently stands, you have a val, not an assertion. I'd encourage you to have a look at the relevant docs and research the options available to you:
http://www.scalatest.org/user_guide/using_assertions
http://www.scalatest.org/user_guide/using_matchers
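For illustration, a minimal sketch of an explicit assertion in a ScalaTest suite; ProxyRequestMain and testContext are the names from your test, and the request body is made up:
test("PUT request returns 200") {
  val request = new APIGatewayProxyRequestEvent()
  request.setHttpMethod("PUT")
  request.setBody("""{"active":"true","created":"1","updated":"1"}""")

  val response = ProxyRequestMain.handleRequest(request, testContext)

  // assert fails the test when the condition is false; a plain val never does
  assert(response.getStatusCode == 200)
}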
I have the following pseudo code.
Invoke the fetch, fetchRecordDetail, upload and notifyUploaded functions in sequence. Each function returns a Future, but the first one returns an Option[T]; for the subsequent calls (fetchRecordDetail, upload and notifyUploaded) I need to carry only the Some[T] values forward and ignore the Nones.
Unfortunately I was only able to achieve the following output by using far too many Await.ready calls.
Expected output
notified List(UploadResult(a_detail_uploaded), UploadResult(c_detail_uploaded))
Code
def fetch(id: String): Future[Option[Record]] = Future {
Thread sleep 100
if (id != "b" && id != "d") {
Some(Record(id))
} else None
}
def fetchRecordDetail(record: Record): Future[RecordDetail] = Future {
Thread sleep 100
RecordDetail(record.id + "_detail")
}
def upload(recordDetail: RecordDetail): Future[UploadResult] = Future {
Thread sleep 100
UploadResult(recordDetail.id + "_uploaded")
}
def notifyUploaded(results: Seq[UploadResult]): Future[Unit] = Future{ println("notified " + results)}
val result: Future[Unit] = //Final call to 'notifyUploaded' goes here
Await.ready(result, Duration.Inf)
Can someone help me improve this code by avoiding the Await.ready calls?
val ids: Seq[String] = Seq("a", "b", "c", "d")
def filterSome(s:String) = fetch(s) map ((s, _)) collect { case (s, Some(v)) => v }
val validData = ids map filterSome
Await.ready(Future.sequence(validData), Duration.Inf)
val records = validData.map(_.value.get.toOption)
val recordDetails = records.flatten map fetchRecordDetail
Await.ready(Future.sequence(recordDetails), Duration.Inf)
val uploadResult = recordDetails.map(_.value.get.toOption).flatten map upload
Await.ready(Future.sequence(uploadResult), Duration.Inf)
val seqUploadResult = uploadResult.map(_.value.get.toOption)
val result: Future[Unit] = notifyUploaded(seqUploadResult.flatten)
Await.ready(result, Duration.Inf)
This appears to work.
Future.sequence(ids.map(fetch))                               //fetch Recs
  .map(_.flatten)                                             //remove None
  .flatMap(rs => Future.sequence(rs.map(fetchRecordDetail)))  //fetch Details
  .flatMap(ds => Future.sequence(ds.map(upload)))             //upload
  .flatMap(notifyUploaded)                                    //notify
It returns a Future[Unit] which you could Await on if you really have to block, though there is usually no reason to.
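If you do need to block at the very edge of the program, as the original code does in its final Await.ready, a minimal sketch:
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration

val result: Future[Unit] =
  Future.sequence(ids.map(fetch))
    .map(_.flatten)
    .flatMap(rs => Future.sequence(rs.map(fetchRecordDetail)))
    .flatMap(ds => Future.sequence(ds.map(upload)))
    .flatMap(notifyUploaded)

// block exactly once, at the outermost boundary (e.g. in a test or a main method)
Await.ready(result, Duration.Inf)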
Is something like this what you want?
for {
  records <- Future.sequence(ids.map(fetch)).map(_.flatten)   // drop the Nones
  details <- Future.sequence(records.map(fetchRecordDetail))
  uploads <- Future.sequence(details.map(upload))
  _       <- notifyUploaded(uploads)
} yield ()
I didn't find a way to get the body within a Filter in Play 2.5.x.
I want to create a "BadRequestLogFilter", which should log the request AND the result if my application returns a status code in the 400-500 range.
In Play 2.4.x, I used Iteratees and it worked.
I was not able to migrate this piece of code to Play 2.5.x. Can somebody give me a hint here? Or is the whole approach of getting the body in a Filter a bad idea?
Here is my old filter, which works properly in Play 2.4.x:
class BadRequestLogFilter #Inject() (implicit val mat: Materializer, ec: ExecutionContext) extends Filter {
val logger = Logger("bad_status").underlyingLogger
override def apply(next: (RequestHeader) => Future[Result])(request: RequestHeader): Future[Result] = {
val resultFuture = next(request)
resultFuture.foreach(result => {
val status = result.header.status
if (status < 200 || status >= 400) {
val c = Try(request.tags(Router.Tags.RouteController))
val a = Try(request.tags(Router.Tags.RouteActionMethod))
val body = result.body.run(Iteratee.fold(Array.empty[Byte]) { (memo, nextChunk) => memo ++ nextChunk })
val futResponse = body.map(bytes => new String(bytes))
futResponse.map { response =>
val m = Map("method" -> request.method,
"uri" -> request.uri,
"status" -> status,
"response" -> response,
"request" -> request,
"controller" -> c.getOrElse("empty"),
"actionMethod" -> a.getOrElse("empty"))
val msg = m.map { case (k, v) => s"$k=$v" }.mkString(", ")
logger.info(appendEntries(m), msg)
}
}
})
resultFuture
}
}
I guess I just need a valid replacement for this line here:
val body = result.body.run(Iteratee.fold(Array.empty[Byte]) { (memo, nextChunk) => memo ++ nextChunk })
The body of a result in Play 2.5.x is of type HttpEntity
So once you have the result you can get the body and then materialize it:
val body = result.body.consumeData(mat)
Here mat is the implicit Materializer you have. This is going to return a Future[ByteString], which you can then decode to get a String representation (I have omitted the future handling here for simplicity):
val bodyAsString = body.decodeString("UTF-8")
logger.info(bodyAsString)
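Putting that into your 2.4.x filter, the foreach block might look roughly like this (a sketch only; the logging details and routing tags stay as in your original, and error handling of the futures is still omitted):
resultFuture.foreach { result =>
  val status = result.header.status
  if (status < 200 || status >= 400) {
    // consumeData materializes the whole body; fine for typical error responses
    val futResponse: Future[String] =
      result.body.consumeData(mat).map(_.decodeString("UTF-8"))
    futResponse.foreach { response =>
      logger.info(s"method=${request.method}, uri=${request.uri}, status=$status, response=$response")
    }
  }
}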
I have the following problem.
I am querying a server for some data and getting it back as HttpEntity.Chunked.
The response String looks like this, with up to 10,000,000 lines:
[{"name":"param1","value":122343,"time":45435345},
{"name":"param2","value":243,"time":4325435},
......]
Now I want to get the incoming data into an Array[String], where each String is one line from the response, because later on it should be imported into an Apache Spark DataFrame.
Currently I am doing it like this:
//For the http request
trait StartHttpRequest {
implicit val system: ActorSystem
implicit val materializer: ActorMaterializer
def httpRequest(data: String, path: String, targetPort: Int, host: String): Future[HttpResponse] = {
val connectionFlow: Flow[HttpRequest, HttpResponse, Future[OutgoingConnection]] = {
Http().outgoingConnection(host, port = targetPort)
}
val responseFuture: Future[HttpResponse] =
Source.single(RequestBuilding.Post(uri = path, entity = HttpEntity(ContentTypes.`application/json`, data)))
.via(connectionFlow)
.runWith(Sink.head)
responseFuture
}
}
//result of the request
val responseFuture: Future[HttpResponse] = httpRequest(.....)
//convert to string
responseFuture.flatMap { response =>
response.status match {
case StatusCodes.OK =>
Unmarshal(response.entity).to[String]
}
}
//and then something like this, but with even more stupid stuff
responseFuture.onSuccess { str:String =>
masterActor! str.split("""\},\{""")
}
My question is: what would be a better way to get the result into an array? How can I unmarshal the response entity directly? .to[Array[String]], for example, did not work. And because there are so many lines coming in, could I do it with a stream to be more efficient?
Answering your questions out of order:
How can I unmarshall the response entity directly?
There is an existing question & answer related to unmarshalling an Array of case classes.
what would be a better way to get the result into an array?
I would take advantage of the Chunked nature and use streams. This allows you to do string processing and json parsing concurrently.
First you need a container class and parser:
import spray.json._

case class Data(name: String, value: Int, time: Long)

object MyJsonProtocol extends DefaultJsonProtocol {
  implicit val dataFormat = jsonFormat3(Data)
}
import MyJsonProtocol._
Then you have to do some manipulations to get the json objects to look right:
// Drops the '[' and ']' characters
val dropArrayMarkers =
  Flow[ByteString].map(_.filterNot(b => b == '['.toByte || b == ']'.toByte))

val prependBrace =
  Flow[String].map(s => if (!s.startsWith("{")) "{" + s else s)

val appendBrace =
  Flow[String].map(s => if (!s.endsWith("}")) s + "}" else s)

val parseJson =
  Flow[String].map(_.parseJson.convertTo[Data])
Finally, combine these Flows to convert a Source of ByteString into a Source of Data objects:
def strSourceToDataSource(source: Source[ByteString, _]): Source[Data, _] =
  source
    .via(dropArrayMarkers)
    .via(Framing.delimiter(ByteString("},{"), maximumFrameLength = 256, allowTruncation = true))
    .map(_.utf8String)
    .via(prependBrace)
    .via(appendBrace)
    .via(parseJson)
This source can then be drained into a Seq of Data objects:
val dataSeq : Future[Seq[Data]] =
responseFuture flatMap { response =>
response.status match {
case StatusCodes.OK =>
strSourceToDataSource(response.entity.dataBytes).runWith(Sink.seq)
}
}
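As an aside, if all you actually need is the raw line strings to hand to Spark (your original Array[String] goal), you can stop the pipeline before the JSON parsing step. A sketch reusing the flows above:
def strSourceToLines(source: Source[ByteString, _]): Source[String, _] =
  source
    .via(dropArrayMarkers)
    .via(Framing.delimiter(ByteString("},{"), maximumFrameLength = 256, allowTruncation = true))
    .map(_.utf8String)
    .via(prependBrace)
    .via(appendBrace)

// collect the lines and convert them to the Array[String] you wanted
val lineArray: Future[Array[String]] =
  responseFuture.flatMap { response =>
    strSourceToLines(response.entity.dataBytes).runWith(Sink.seq).map(_.toArray)
  }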
Over the past few days I have been trying to figure out the best way to download an HTTP resource to a file using Akka Streams and HTTP.
Initially I started with the Future-Based Variant and that looked something like this:
def downloadViaFutures(uri: Uri, file: File): Future[Long] = {
val request = Get(uri)
val responseFuture = Http().singleRequest(request)
responseFuture.flatMap { response =>
val source = response.entity.dataBytes
source.runWith(FileIO.toFile(file))
}
}
That was kind of okay but once I learnt more about pure Akka Streams I wanted to try and use the Flow-Based Variant to create a stream starting from a Source[HttpRequest]. At first this completely stumped me until I stumbled upon the flatMapConcat flow transformation. This ended up a little more verbose:
def responseOrFail[T](in: (Try[HttpResponse], T)): (HttpResponse, T) = in match {
case (responseTry, context) => (responseTry.get, context)
}
def responseToByteSource[T](in: (HttpResponse, T)): Source[ByteString, Any] = in match {
case (response, _) => response.entity.dataBytes
}
def downloadViaFlow(uri: Uri, file: File): Future[Long] = {
val request = Get(uri)
val source = Source.single((request, ()))
val requestResponseFlow = Http().superPool[Unit]()
source.
via(requestResponseFlow).
map(responseOrFail).
flatMapConcat(responseToByteSource).
runWith(FileIO.toFile(file))
}
Then I wanted to get a little tricky and use the Content-Disposition header.
Going back to the Future-Based Variant:
def destinationFile(downloadDir: File, response: HttpResponse): File = {
val fileName = response.header[ContentDisposition].get.value
val file = new File(downloadDir, fileName)
file.createNewFile()
file
}
def downloadViaFutures2(uri: Uri, downloadDir: File): Future[Long] = {
val request = Get(uri)
val responseFuture = Http().singleRequest(request)
responseFuture.flatMap { response =>
val file = destinationFile(downloadDir, response)
val source = response.entity.dataBytes
source.runWith(FileIO.toFile(file))
}
}
But now I have no idea how to do this with the Flow-Based Variant. This is as far as I got:
def responseToByteSourceWithDest[T](in: (HttpResponse, T), downloadDir: File): Source[(ByteString, File), Any] = in match {
case (response, _) =>
val source = responseToByteSource(in)
val file = destinationFile(downloadDir, response)
source.map((_, file))
}
def downloadViaFlow2(uri: Uri, downloadDir: File): Future[Long] = {
val request = Get(uri)
val source = Source.single((request, ()))
val requestResponseFlow = Http().superPool[Unit]()
val sourceWithDest: Source[(ByteString, File), Unit] = source.
via(requestResponseFlow).
map(responseOrFail).
flatMapConcat(responseToByteSourceWithDest(_, downloadDir))
sourceWithDest.runWith(???)
}
So now I have a Source that will emit one or more (ByteString, File) elements for each File (I say each File since there is no reason the original Source has to be a single HttpRequest).
Is there any way to take these and route them to a dynamic Sink?
I'm thinking something like flatMapConcat, such as:
def runWithMap[T, Mat2](f: T => Graph[SinkShape[Out], Mat2])(implicit materializer: Materializer): Mat2 = ???
So that I could complete downloadViaFlow2 with:
def destToSink(destination: File): Sink[(ByteString, File), Future[Long]] = {
val sink = FileIO.toFile(destination, true)
Flow[(ByteString, File)].map(_._1).toMat(sink)(Keep.right)
}
sourceWithDest.runWithMap {
case (_, file) => destToSink(file)
}
The solution does not require a flatMapConcat. If you don't need any return values from the file writing then you can use Sink.foreach:
def writeFile(downloadDir : File)(httpResponse : HttpResponse) : Future[Long] = {
val file = destinationFile(downloadDir, httpResponse)
httpResponse.entity.dataBytes.runWith(FileIO.toFile(file))
}
def downloadViaFlow2(uri: Uri, downloadDir: File) : Future[Unit] = {
val request = HttpRequest(uri=uri)
val source = Source.single((request, ()))
val requestResponseFlow = Http().superPool[Unit]()
source.via(requestResponseFlow)
.map(responseOrFail)
.map(_._1)
.runWith(Sink.foreach(writeFile(downloadDir)))
}
Note that Sink.foreach creates Futures from the writeFile function, so there is not much back-pressure involved. writeFile could be slowed down by the hard drive, but the stream would keep generating Futures. To control this you can use Flow.mapAsyncUnordered (or Flow.mapAsync):
val parallelism = 10
source.via(requestResponseFlow)
.map(responseOrFail)
.map(_._1)
.mapAsyncUnordered(parallelism)(writeFile(downloadDir))
.runWith(Sink.ignore)
If you want to accumulate the Long values for a total count you need to combine with a Sink.fold:
source.via(requestResponseFlow)
.map(responseOrFail)
.map(_._1)
.mapAsyncUnordered(parallelism)(writeFile(downloadDir))
.runWith(Sink.fold(0L)(_ + _))
The fold will keep a running sum and emit the final value when the source of requests has dried up.
Using the Play WS client injected as ws, and remembering to import scala.concurrent.duration._:
def downloadFromUrl(url: String)(ws: WSClient): Future[Try[File]] = {
  // createTempFile needs (prefix, suffix, directory); a null suffix defaults to ".tmp"
  val file = File.createTempFile("my-prefix", null, new File("/tmp"))
  file.deleteOnExit()
  val futureResponse: Future[WSResponse] =
    ws.url(url).withMethod("GET").withRequestTimeout(5 minutes).stream()
  futureResponse.flatMap { res =>
    res.status match {
      case 200 =>
        val outputStream = java.nio.file.Files.newOutputStream(file.toPath)
        val sink = Sink.foreach[ByteString] { bytes => outputStream.write(bytes.toArray) }
        res.bodyAsSource.runWith(sink).andThen {
          case result =>
            outputStream.close()
            result.get
        } map (_ => Success(file))
      case other =>
        Future(Failure[File](new Exception("HTTP Failure, response code " + other + " : " + res.statusText)))
    }
  }
}
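As an aside, since res.bodyAsSource is already an Akka Streams Source, the manual OutputStream can be replaced with a file sink, assuming an implicit Materializer and ExecutionContext are in scope. A drop-in sketch for the body of the case 200 branch above:
import akka.stream.scaladsl.FileIO

// stream the body straight into the file; the materialized Future[IOResult]
// completes when the write is finished
res.bodyAsSource
  .runWith(FileIO.toPath(file.toPath))
  .map(_ => Success(file))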