How do I get the response body from an endpoint? I am sending a request to the endpoint and want to know how to get the response as a string.
val complexRequest = ws.url(serviceEndpoint).withHeaders("Content-Type" -> "application/xml")

val result = complexRequest.post(leadXml).map { response =>
  logger.info(s"response $response")
  if (response.status == 200) {
    val res = response
    logger.info(s"status passed.. $res")
  } else {
    val res = response
    logger.info(s"status failed.. $res")
  }
}
Use response.body. You can also use Play JSON to validate it and turn it into a workable object!
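For example, a minimal sketch of reading the body inside the map above, assuming the endpoint actually replies with JSON and a hypothetical case class LeadResponse with an implicit Reads[LeadResponse]:

import play.api.libs.json.{JsError, JsSuccess}

complexRequest.post(leadXml).map { response =>
  val raw: String = response.body            // the whole body as a String
  // LeadResponse is a hypothetical case class with an implicit Reads[LeadResponse];
  // response.json is only meaningful if the endpoint returns JSON
  response.json.validate[LeadResponse] match {
    case JsSuccess(parsed, _) => logger.info(s"parsed body: $parsed")
    case JsError(errors)      => logger.warn(s"body did not validate: $errors")
  }
  raw
}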
You can use the Helpers class:
import play.api.test.Helpers._
val result: Future[SimpleResult] = …
val bodyAsBytes: Array[Byte] = contentAsBytes(result)
Or JavaResultExtractor:
akka.util.ByteString body = play.core.j.JavaResultExtractor.getBody(result, 10000L, mat);
Or JavaBodyParsers:
https://www.playframework.com/documentation/2.5.x/JavaBodyParsers
I am using a WebSocketClient and would like to test against the received message. I've chosen the ScalaTest framework, and I know that the test has to be carried out asynchronously.
The WebSocket client looks like the following:
import akka.Done
import akka.http.scaladsl.Http
import akka.stream.scaladsl._
import akka.http.scaladsl.model.ws._
import io.circe.syntax._
import scala.concurrent.Future

object WsClient {
  import Trigger._

  private val convertJson: PreMsg => String = msg =>
    msg.asJson.noSpaces

  val send: PreMsg => (String => Unit) => RunnableGraph[Future[Done]] = msg => fn =>
    Source.single(convertJson(msg))
      .map(TextMessage(_))
      .via(Http().webSocketClientFlow(WebSocketRequest(s"ws://${Config.host}:${Config.port}/saprs")))
      .map(_.asTextMessage.getStrictText)
      .toMat(Sink.foreach(fn))(Keep.right)
}
and the test:
feature("Process incoming messages") {
info("As a user, I want that incoming messages is going to process appropriately.")
info("A message should contain the following properties: `sap_id`, `sap_event`, `payload`")
scenario("Message is not intended for the server") {
Given("A message with `sap_id:unknown`")
val msg = PreMsg("unknown", "unvalid", "{}")
When("the message gets validated")
val ws = WsClient.send(msg)
Then("it should has the `status: REJECT` in the response content")
ws { msg =>
//Would like test against the msg here
}.run()
.map(_ => assert(1 == 1))
}
I would like to test against the content of msg, but I do not know how to do it.
I followed the play-scala-websocket-example
They use a WebSocketClient as a helper, see WebSocketClient.java
Then a test looks like:
Helpers.running(TestServer(port, app)) {
  val myPublicAddress = s"localhost:$port"
  val serverURL = s"ws://$myPublicAddress/ws"

  val asyncHttpClient: AsyncHttpClient = client.underlying[AsyncHttpClient]
  val webSocketClient = new WebSocketClient(asyncHttpClient)
  val queue = new ArrayBlockingQueue[String](10)
  val origin = serverURL
  val consumer: Consumer[String] = new Consumer[String] {
    override def accept(message: String): Unit = queue.put(message)
  }
  val listener = new WebSocketClient.LoggingListener(consumer)
  val completionStage = webSocketClient.call(serverURL, origin, listener)
  val f = FutureConverters.toScala(completionStage)

  // Test we can get good output from the websocket
  whenReady(f, timeout = Timeout(1.second)) { webSocket =>
    val condition: Callable[java.lang.Boolean] = new Callable[java.lang.Boolean] {
      override def call(): java.lang.Boolean = webSocket.isOpen && queue.peek() != null
    }
    await().until(condition)

    val input: String = queue.take()
    val json: JsValue = Json.parse(input)
    val symbol = (json \ "symbol").as[String]
    List(symbol) must contain oneOf("AAPL", "GOOG", "ORCL")
  }
}
}
See here: FunctionalSpec.scala
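Alternatively, staying with the asker's WsClient, a hedged sketch is to capture the first reply in a Promise and assert on the resulting Future. It assumes an implicit ActorSystem and Materializer are in scope and that the spec uses ScalaTest's async style (so the scenario can return a Future of an assertion); the REJECT check is an assumption about the server's response format.

import scala.concurrent.Promise

scenario("Message is not intended for the server") {
  val msg = PreMsg("unknown", "unvalid", "{}")
  val firstReply = Promise[String]()

  // hand every received text frame to the Promise; only the first one wins
  WsClient.send(msg)(reply => firstReply.trySuccess(reply)).run()

  firstReply.future.map { reply =>
    assert(reply.contains("REJECT"))   // assumed shape of the reject response
  }
}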
I am trying to write a unit test for a specific method which calls a REST endpoint with the given inputs (URL, HTTP method, body, headers). Below is the code.
def genericAPICall(uri: String, method: String, headers: Map[String, String], body: HttpEntity): APIResponse = {
  import java.nio.charset.StandardCharsets

  import org.apache.http.client.methods._
  import org.apache.http.impl.client.HttpClientBuilder
  import org.apache.http.util.EntityUtils

  val client = HttpClientBuilder.create.build
  val request = method match {
    case "GET" => Some(new HttpGet(uri))
    case "POST" =>
      val post = new HttpPost(uri)
      post.setEntity(body)
      Some(post)
    case "PUT" =>
      val put = new HttpPut(uri)
      put.setEntity(body)
      Some(put)
    case "DELETE" => Some(new HttpDelete(uri))
    case _ => None
  }

  if (request.isDefined) {
    val actualRequest = request.get
    if (headers.nonEmpty) {
      for ((headerName, headerVal) <- headers) {
        actualRequest.addHeader(headerName, headerVal)
      }
    } else {
      actualRequest.addHeader("Accept", "application/json")
      actualRequest.addHeader("Content-Type", "application/json")
    }
    // use a var so the response opened in `try` is the one closed in `finally`
    var response: CloseableHttpResponse = null
    try {
      response = client.execute(actualRequest)
      val entity = response.getEntity
      // use org.apache.http.util.EntityUtils to read the JSON body as a string
      val str = EntityUtils.toString(entity, StandardCharsets.UTF_8)
      APIResponse(response.getStatusLine.getStatusCode(), str, null)
    } catch {
      case e: Exception => APIResponse(500, null, e)
    } finally {
      if (response != null)
        response.close()
    }
  } else {
    APIResponse(500, null, new Exception("not a valid http method"))
  }
}
Is there any way I can mock the client.execute call below, so I can avoid an actual call to my test URL?
val response = client.execute(actualRequest)
I am already using specs2 in my project, so it would be great if anybody knows how to achieve this using specs2.
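A hedged sketch using specs2's Mockito integration. It assumes genericAPICall is refactored so the HttpClient is passed in (for example as an extra parameter) instead of being built inside the method; the refactored signature, the example URL, and the APIResponse constructor usage are assumptions based on the snippet above.

import org.apache.http.client.methods.{CloseableHttpResponse, HttpUriRequest}
import org.apache.http.entity.StringEntity
import org.apache.http.impl.client.CloseableHttpClient
import org.apache.http.{HttpStatus, StatusLine}
import org.specs2.mock.Mockito
import org.specs2.mutable.Specification

class GenericApiCallSpec extends Specification with Mockito {

  "genericAPICall" should {
    "use the mocked client instead of hitting the real endpoint" in {
      val client   = mock[CloseableHttpClient]
      val response = mock[CloseableHttpResponse]
      val status   = mock[StatusLine]

      status.getStatusCode returns HttpStatus.SC_OK
      response.getStatusLine returns status
      response.getEntity returns new StringEntity("""{"ok":true}""")
      client.execute(any[HttpUriRequest]) returns response

      // assumes a refactored signature such as
      // genericAPICall(uri, method, headers, body, client)
      val result = genericAPICall("http://example.test", "GET", Map.empty, null, client)

      there was one(client).execute(any[HttpUriRequest])
      result must_== APIResponse(200, """{"ok":true}""", null)
    }
  }
}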
I made a Source for an Akka Stream based on a ReactiveStreams Publisher like this:
object FlickrSource {
  val apiKey = Play.current.configuration.getString("flickr.apikey")
  val flickrUserId = Play.current.configuration.getString("flickr.userId")
  val flickrPhotoSearchUrl = s"https://api.flickr.com/services/rest/?method=flickr.photos.search&api_key=$apiKey&user_id=$flickrUserId&min_taken_date=%s&max_taken_date=%s&format=json&nojsoncallback=1&page=%s&per_page=500"

  def byDate(date: LocalDate): Source[JsValue, Unit] = {
    Source(new FlickrPhotoSearchPublisher(date))
  }
}

class FlickrPhotoSearchPublisher(date: LocalDate) extends Publisher[JsValue] {

  override def subscribe(subscriber: Subscriber[_ >: JsValue]) {
    try {
      val from = new LocalDate()
      val fromSeconds = from.toDateTimeAtStartOfDay.getMillis
      val toSeconds = from.plusDays(1).toDateTimeAtStartOfDay.getMillis

      def pageGet(page: Int): Unit = {
        val url = flickrPhotoSearchUrl format (fromSeconds, toSeconds, page)
        Logger.debug("Flickr search request: " + url)
        val photosFound = WS.url(url).get().map { response =>
          val json = response.json
          val photosThisPage = (json \ "photos" \ "photo").as[JsArray]
          val numPages = (json \ "photos" \ "pages").as[JsNumber].value.toInt
          Logger.debug(s"pages: $numPages")
          Logger.debug(s"photos this page: ${photosThisPage.value.size}")
          photosThisPage.value.foreach { photo =>
            Logger.debug(s"onNext")
            subscriber.onNext(photo)
          }
          if (numPages > page) {
            Logger.debug("nextPage")
            pageGet(page + 1)
          } else {
            Logger.debug("onComplete")
            subscriber.onComplete()
          }
        }
      }
      pageGet(1)
    } catch {
      case ex: Exception =>
        subscriber.onError(ex)
    }
  }
}
It will make a search request to Flickr and source the results as JsValues. I tried to wire it to lots of different Flows and Sinks, but this would be the most basic setup:
val source: Source[JsValue, Unit] = FlickrSource.byDate(date)
val sink: Sink[JsValue, Future[Unit]] = Sink.foreach(println)
val stream = source.toMat(sink)(Keep.right)
stream.run()
I see that onNext gets called a couple of times, and then onComplete. However, the Sink does not receive anything. What am I missing? Is this not a valid way to create a Source?
I mistakenly understood Publisher to be a simple interface like Observable that you can implement yourself. The Akka team pointed out that this is not the correct way to implement a Publisher. In fact, Publisher is a complicated contract that is supposed to be implemented by libraries rather than by end users. The Source.apply(Publisher) method used in the question is there for interoperability with other Reactive Streams implementations.
The reason for wanting my own Source is that I want a backpressured source that fetches the search results from Flickr (which are capped at 500 per request), and I don't want to make more (or faster) requests than are needed downstream. This can be achieved by implementing an ActorPublisher.
Update
This is the ActorPublisher that does what I want: create a Source that produces search results, but only makes as many REST calls as are needed downstream. I think there is still room for improvement, so feel free to edit it.
import akka.actor.Props
import akka.stream.actor.ActorPublisher
import akka.stream.actor.ActorPublisherMessage.{Cancel, Request}
import org.joda.time.LocalDate
import play.api.Play.current
import play.api.libs.json.{JsArray, JsNumber, JsValue}
import play.api.libs.ws.WS
import play.api.{Logger, Play}
import scala.concurrent.ExecutionContext.Implicits.global
object FlickrSearchActorPublisher {
  val apiKey = Play.current.configuration.getString("flickr.apikey")
  val flickrUserId = Play.current.configuration.getString("flickr.userId")
  val flickrPhotoSearchUrl = s"https://api.flickr.com/services/rest/?method=flickr.photos.search&api_key=$apiKey&user_id=$flickrUserId&min_taken_date=%s&max_taken_date=%s&format=json&nojsoncallback=1&per_page=500&page="

  def byDate(from: LocalDate): Props = {
    val fromSeconds = from.toDateTimeAtStartOfDay.getMillis / 1000
    val toSeconds = from.plusDays(1).toDateTimeAtStartOfDay.getMillis / 1000
    val url = flickrPhotoSearchUrl format (fromSeconds, toSeconds)
    Props(new FlickrSearchActorPublisher(url))
  }
}

class FlickrSearchActorPublisher(url: String) extends ActorPublisher[JsValue] {
  var currentPage = 1
  var numPages = 1
  var photos = Seq[JsValue]()

  def searching: Receive = {
    case Request(count) =>
      Logger.debug(s"Received Request for $count results from Subscriber, ignoring as we are still searching")
    case Cancel =>
      Logger.info("Cancel Message Received, stopping")
      context.stop(self)
    case _ =>
  }

  def accepting: Receive = {
    case Request(count) =>
      Logger.debug(s"Received Request for $count results from Subscriber")
      sendSearchResults()
    case Cancel =>
      Logger.info("Cancel Message Received, stopping")
      context.stop(self)
    case _ =>
  }

  def getNextPageOrStop() {
    if (currentPage > numPages) {
      Logger.debug("No more pages, stopping")
      onCompleteThenStop()
    } else {
      val pageUrl = url + currentPage
      Logger.debug("Flickr search request: " + pageUrl)
      context.become(searching)
      WS.url(pageUrl).get().map { response =>
        val json = response.json
        val photosThisPage = (json \ "photos" \ "photo").as[JsArray]
        numPages = (json \ "photos" \ "pages").as[JsNumber].value.toInt
        Logger.debug(s"page $currentPage of $numPages")
        Logger.debug(s"photos this page: ${photosThisPage.value.size}")
        photos = photosThisPage.value.seq
        if (photos.isEmpty) {
          Logger.debug("No photos found, stopping")
          onCompleteThenStop()
        } else {
          currentPage = currentPage + 1
          sendSearchResults()
          context.become(accepting)
        }
      }
    }
  }

  def sendSearchResults() {
    if (photos.isEmpty) {
      getNextPageOrStop()
    } else {
      while (isActive && totalDemand > 0) {
        onNext(photos.head)
        photos = photos.tail
        if (photos.isEmpty) {
          getNextPageOrStop()
        }
      }
    }
  }

  getNextPageOrStop()
  val receive = searching
}
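For completeness, a hedged sketch of wiring this publisher into a Source via the Source.actorPublisher factory available in the Akka Streams versions that ship ActorPublisher; running it still needs an implicit Materializer, as in the original snippet.

import akka.actor.ActorRef
import akka.stream.scaladsl.{Sink, Source}

val source: Source[JsValue, ActorRef] =
  Source.actorPublisher[JsValue](FlickrSearchActorPublisher.byDate(date))

// photos are fetched page by page, only as fast as the sink demands them
source.runWith(Sink.foreach(println))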
I need to make requests to several different URLs, get data from their responses, and put this information into one list, but I have some gaps in my understanding of this topic.
1) For one request I do:
def doRequest: Future[WSResponse] = {
  client
    .url("MY_URL")
    .withRequestTimeout(5000)
    .get()
}
Then I parse the JSON in the response into a List of my objects:
def list: Future[List[FoobarEntry]] = {
  doRequest.map { response =>
    val json = response.json \ "foobar"
    json.validate[List[FoobarEntry]] match {
      case js: JsSuccess[List[FoobarEntry]] =>
        js.get
      case e: JsError =>
        Logger.error(JsError.toFlatJson(e).toString())
        List()
    }
  }
}
I think that for several URLs I should write something like:
def doRequests: List[Future[WSResponse]] = {
  List(
    client
      .url("URL_1")
      .withRequestTimeout(5000)
      .get(),
    client
      .url("URL_2")
      .withRequestTimeout(5000)
      .get())
}
But how do I parse this List[Future[WSResponse]] the way my def list: Future[List[FoobarEntry]] does?
Since you put the future responses inside a list, you will have to map each future response with the parsing logic that turns it into a FoobarEntry, like so:
val responseFutures: List[Future[WSResponse]] = ???
val foobarFutures: List[Future[FoobarEntry]] =
responseFutures.map(future => future.map(response => parse(response)))
Now you have a list of future parsed responses, but to do something when all of them have arrived you will need to sequence that list:
val futureFoobars = Future.sequence(foobarFutures)
So, sequence helps you get from C[Future[A]] to Future[C[A]].
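Applied to the asker's code, a short sketch combining both steps. It assumes a parse helper that returns List[FoobarEntry], like the body of the asker's list method (the next answer defines one), and an implicit ExecutionContext in scope.

import scala.concurrent.Future

def lists: Future[List[FoobarEntry]] =
  Future.sequence(doRequests.map(_.map(parse)))  // Future[List[List[FoobarEntry]]]
    .map(_.flatten)                              // Future[List[FoobarEntry]]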
Use for-comprehensions.
val request1 = client.url("URL_1").withRequestTimeout(5000).get()
val request2 = client.url("URL_2").withRequestTimeout(5000).get()

val result: Future[List[FoobarEntry]] = for {
  res1: WSResponse <- request1
  res2: WSResponse <- request2
} yield List(res1, res2).map(parse).flatten
def parse(response: WSResponse): List[FoobarEntry] = {
  val json = response.json \ "foobar"
  json.validate[List[FoobarEntry]] match {
    case js: JsSuccess[List[FoobarEntry]] =>
      js.get
    case e: JsError =>
      Logger.error(JsError.toFlatJson(e).toString())
      List()
  }
}
I'm receiving a gzipped response from an API, but Dispatch 0.9.5 doesn't appear to have any methods to decode the response. Any ideas?
Here's my current implementation; the println only prints out string representations of bytes.
Http(
  host("stream.gnip.com")
    .secure
    .addHeader("Accept-Encoding", "gzip")
    / gnipUrl
    > as.stream.Lines(println))()
I tried to look at implementing my own handler, but I'm not sure where to begin. Here's the relevant file for Lines: https://github.com/dispatch/reboot/blob/master/core/src/main/scala/as/stream/lines.scala
Thanks!
I simply abandoned Dispatch and used the Java APIs directly. Disappointing, but it got the job done.
val GNIP_URL = isDev match {
  case true => "https://url/apath/track/dev.json"
  case false => "https://url/path/track/prod.json"
}
val GNIP_CHARSET = "UTF-8"

override def preStart() = {
  log.info("[tracker] Starting new Twitter PowerTrack connection to %s" format GNIP_URL)
  val connection = getConnection(GNIP_URL, GNIP_USER, GNIP_PASSWORD)
  val inputStream = connection.getInputStream()
  val reader = new BufferedReader(new InputStreamReader(new StreamingGZIPInputStream(inputStream), GNIP_CHARSET))
  var line = reader.readLine()
  while (line != null) {
    println(line)
    line = reader.readLine()
  }
}

private def getConnection(urlString: String, user: String, password: String): HttpURLConnection = {
  val url = new URL(urlString)
  val connection = url.openConnection().asInstanceOf[HttpURLConnection]
  connection.setReadTimeout(1000 * 60 * 60)
  connection.setConnectTimeout(1000 * 10)
  connection.setRequestProperty("Authorization", createAuthHeader(user, password))
  connection.setRequestProperty("Accept-Encoding", "gzip")
  connection
}

private def createAuthHeader(username: String, password: String) = {
  val encoder = new BASE64Encoder()
  val authToken = username + ":" + password
  "Basic " + encoder.encode(authToken.getBytes())
}
I used GNIP's example: https://github.com/gnip/support/blob/master/Premium%20Stream%20Connection/Java/StreamingConnection.java
This isn't so much a solution as a workaround, but I ended up resorting to bypassing the Future-based stuff and doing:
val stream = Http(req OK as.Response(_.getResponseBodyAsStream)).apply
val result =
  JsonParser.parse(
    new java.io.InputStreamReader(
      new java.util.zip.GZIPInputStream(stream)))
I'm using JsonParser here because in my case the data I'm receiving happens to be JSON; substitute with something else in your use case, if needed.
My solution just defines a response parser and also adopts the json4s parser:
object GzipJson extends (Response => JValue) {
  def apply(r: Response) = {
    if (r.getHeader("content-encoding") != null && r.getHeader("content-encoding").equals("gzip")) {
      parse(new GZIPInputStream(r.getResponseBodyAsStream), true)
    } else {
      (dispatch.as.String andThen (s => parse(StringInput(s), true)))(r)
    }
  }
}
So I can use it to extract a gzipped JSON response, as in the following code:
import GzipJson._
Http(req OK GzipJson).apply
Try >> instead of >
See https://github.com/dispatch/dispatch/blob/master/core/src/main/scala/handlers.scala#L58