Lagom - Add header to response in service call composition - scala

I want to write code that refreshes a cookie (via the Set-Cookie HTTP header) in Lagom.
For clarification, the cookie is an encoded string (for example AES).
Let's take the Lagom service call composition for authentication from Implementing services and edit it:
def authenticated[Request, Response](
  serviceCall: User => ServerServiceCall[Request, Response]
) = ServerServiceCall.composeAsync { requestHeader =>
  // Get cookie from header and decode it
  val cookie = decodeCookie(requestHeader)
  // Get user based on cookie decode function
  val userLookup = getUser(cookie)
  userLookup.map {
    case Some(user) => serviceCall(user)
    case None       => throw Forbidden("User must be authenticated")
  }
}
Is it possible to manipulate the serviceCall(user) response headers?
I tried something like this:
serviceCall(employee).handleResponseHeader { case (responseHeader, response) =>
  responseHeader.withHeader("Set-Cookie", encodeCookie("NewCookieStringExample")) // Add header
  response
}
But the handleResponseHeader handler only expects the response[T] as its result, and the header is not changed because it is immutable (withHeader returns a new header, which is discarded here).
I know that I could pass the cookie to serviceCall(user) and return a tuple of ResponseHeader and Response in every service call implementation, but this would impact all endpoints and add much more code.
Using a HeaderFilter is possible too, but that would decode and encode the cookie twice in one request (once in the header filter and once in the authenticated service call composition).
Any tips?

You can return the modified header and the response as a tuple:
serviceCall(employee).handleResponseHeader { (responseHeader, response) =>
  val modifiedHeader = responseHeader.withHeader("Set-Cookie", encodeCookie("NewCookieStringExample")) // Add header
  (modifiedHeader, response)
}
By the way, the provided example does not compile for me: composeAsync wants a Future[ServerServiceCall[...]].
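For reference, a sketch that simply places that tuple-returning handler inside the question's composition (decodeCookie, getUser and encodeCookie are the question's own helpers, and an implicit ExecutionContext for map is assumed):
def authenticated[Request, Response](
  serviceCall: User => ServerServiceCall[Request, Response]
) = ServerServiceCall.composeAsync { requestHeader =>
  val cookie = decodeCookie(requestHeader)
  getUser(cookie).map {
    case Some(user) =>
      // Run the wrapped call, then refresh the cookie on the outgoing response header
      serviceCall(user).handleResponseHeader { (responseHeader, response) =>
        (responseHeader.withHeader("Set-Cookie", encodeCookie("NewCookieStringExample")), response)
      }
    case None =>
      throw Forbidden("User must be authenticated")
  }
}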

Related

Handling and manipulating a list of http responses

I'm currently trying to implement API logic to fetch multiple images from a server.
This server accepts an image id and returns an HTTP response that contains the image in PNG format as an entity.
Now we want to add a new endpoint that accepts a list of image IDs and returns all the images.
I have done the following:
def getImagesFromIds(idsList: List[String]): Future[List[HttpResponse]] = {
  Future.sequence {
    idsList.map(id => getImageById(id))
  }
}
This function receives a list of ids and calls getImageById to fetch all the images; it returns a list of HttpResponse.
And for the route definition, I have done the following:
def getImagesByIdsListRoute: Route = get {
  path("by-ids-list") {
    entity(as[List[String]]) { upcs =>
      complete(getImagesFromIds(upcs))
    }
  }
}
But I'm getting the following error message:
No implicits found for parameter m: marshalling.ToResponseMarshallable[List[HttpResponse]]
Does anyone know how to marshal a list of HTTP responses, or whether there is a way to improve this logic for fetching multiple HTTP responses?
If I understand correctly, you want to download multiple images and return them as an HTTP response.
The problems with your current attempt:
The call to the API made via getImageById returns an HttpResponse. You can't be sure what the result of this API call is; if it fails, the response won't contain any image at all.
You are trying to return List[HttpResponse] as your response. How should this response be serialized? Akka doesn't know what you mean by that and tries to find a marshaller that will serialize your object (for example to JSON), but can't find one.
Returning a list of images requires zipping them; you can't return multiple entities in a single HTTP response.
Possible approach
You have to change getImageById so that it checks what is in the HttpResponse and returns the entity bytes.
Example:
response match {
  case HttpResponse(StatusCodes.OK, _, entity, _) =>
    entity.dataBytes
  case resp @ HttpResponse(code, _, _, _) =>
    // Response failed and we don't care about the response entity
    // Details: https://doc.akka.io/docs/akka-http/current/implications-of-streaming-http-entity.html
    resp.discardEntityBytes()
    // Decide yourself how you want to handle failures
    throw new RuntimeException("Request failed, response code: " + code)
}
dataBytes returns a Source, so you'll end up with a List of Sources. You have to concatenate them, for example via concat.
The resulting stream then has to be zipped, for example via Compression.gzip.
Finally, the stream can be passed to the complete method of getImagesByIdsListRoute.
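Putting those pieces together, a rough sketch under the same assumptions as the question (getImageById returns a Future[HttpResponse], an implicit ExecutionContext is in scope, and the JSON unmarshalling for List[String] is already set up; the content type used in complete is just a placeholder):
import akka.http.scaladsl.model._
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
import akka.stream.scaladsl.{Compression, Source}
import akka.util.ByteString
import scala.concurrent.{ExecutionContext, Future}

def getImagesFromIds(ids: List[String])(implicit ec: ExecutionContext): Future[Source[ByteString, Any]] =
  Future.sequence(ids.map(getImageById)).map { responses =>
    responses
      .map {
        case HttpResponse(StatusCodes.OK, _, entity, _) =>
          entity.dataBytes // stream the image bytes
        case resp @ HttpResponse(code, _, _, _) =>
          resp.discardEntityBytes()
          throw new RuntimeException("Request failed, response code: " + code)
      }
      .reduceOption[Source[ByteString, Any]](_ ++ _) // concatenate the per-image streams
      .getOrElse(Source.empty[ByteString])
      .via(Compression.gzip) // compress the combined stream
  }

def getImagesByIdsListRoute(implicit ec: ExecutionContext): Route = get {
  path("by-ids-list") {
    entity(as[List[String]]) { ids =>
      onSuccess(getImagesFromIds(ids)) { bytes =>
        complete(HttpResponse(entity = HttpEntity(ContentTypes.`application/octet-stream`, bytes)))
      }
    }
  }
}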

Retrieve cookie value with Gatling

I'm kinda new to Gatling and I'd like to get the value of a cookie. I've tried many ways to do so, but I might be misunderstanding something.
First I do a POST request to my auth API, which creates the cookie I want.
Then I've tried:
.exec { session =>
  println(session)
  println(session.attributes)
  // return a Some object whose value is of type CookieJar (with apparently private access)
  println(session.attributes.get("gatling.http.cookies"))
  /*
  // Doesn't compile due to CookieJar being private
  val value: CookieJar = session.attributes.get("gatling.http.cookies") match {
    case None => None
    case Some(cj: CookieJar) => cj
  }
  print(value)
  */
  // return a GetCookieBuilder which doesn't seem really useful
  println(getCookieValue(CookieKey("COOKIE_NAME")))
  session
}
Do you have any idea about this?
getCookieValue is a DSL component, not a method you can call in your own functions.
It's used as a scenario step to extract a cookie value from the internal CookieJar and copy it into the Session as a dedicated attribute.
exec(getCookieValue(CookieKey("COOKIE_NAME")))
  .exec { session =>
    println(session("COOKIE_NAME").as[String])
    session
  }
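For completeness, a sketch of how the extracted value could then be reused later in the scenario (the request names, paths and header are illustrative only, and recent Gatling versions use the #{...} EL syntax instead of ${...}):
import io.gatling.core.Predef._
import io.gatling.http.Predef._

val scn = scenario("auth then reuse cookie")
  .exec(http("login").post("/auth"))              // the request that sets the cookie
  .exec(getCookieValue(CookieKey("COOKIE_NAME"))) // copy it into the session
  .exec { session =>
    println(session("COOKIE_NAME").as[String])    // inspect it
    session
  }
  .exec(
    http("follow-up")
      .get("/protected")
      .header("X-Original-Cookie", "${COOKIE_NAME}") // reuse it anywhere EL is accepted
  )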

Get cookies in middleware in http4s?

I'm trying to write middleware that extracts a specific cookie and stores information in the ContextRequest.
Here is my test code:
def cookie[F[_]: Sync](
  logger: Logger[F]
): Kleisli[F, Request[F], ContextRequest[F, Option[Cookie]]] =
  Kleisli { request: Request[F] =>
    for {
      _ <- logger.debug(s"finding cookie")
      _ <- logger.debug(request.cookies.map(_.name).mkString(","))
    } yield ContextRequest(none[Cookie], request)
  }
Then I use it like this:
def httpApp: HttpApp[F] = cookie(logger).mapK(OptionT.liftK).andThen(routesWithCookieContext).orNotFound
The problem is that the request doesn't have any cookies, even though I can see them in the Chrome dev tools and in the request's details in the logs. What am I doing wrong, and how can I make it work?
It turned out to be a problem with the cookie content. I was using Circe's .asJson.noSpaces to convert a case class into a string and write it into the cookie's value, but cookies with raw JSON in their value don't work (a cookie value cannot contain characters such as quotes or commas unless it is encoded).
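One common fix (not spelled out in the answer above) is to encode the JSON, for example with Base64, before writing it into the cookie value. A minimal sketch assuming circe and a hypothetical SessionData case class:
import io.circe.generic.auto._
import io.circe.parser
import io.circe.syntax._
import java.nio.charset.StandardCharsets
import java.util.Base64

final case class SessionData(userId: String, role: String)

// case class -> JSON -> Base64, so the cookie value only contains legal characters
def toCookieValue(data: SessionData): String =
  Base64.getUrlEncoder.withoutPadding
    .encodeToString(data.asJson.noSpaces.getBytes(StandardCharsets.UTF_8))

// Base64 -> JSON -> case class when reading the cookie back
def fromCookieValue(value: String): Either[io.circe.Error, SessionData] =
  parser.decode[SessionData](
    new String(Base64.getUrlDecoder.decode(value), StandardCharsets.UTF_8)
  )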

How to do authentication using Akka HTTP

Looking for a good explanation of how to do authentication using Akka HTTP. Given a route that looks like:
val route =
  path("account") {
    authenticateBasic(realm = "some realm", myAuthenticator) { user =>
      get {
        encodeResponseWith(Deflate) {
          complete {
            // do something here
          }
        }
      }
    }
  }
The documentation outlines a way, but then the pertinent part performing the actual authentication is omitted...
// backend entry points
def myAuthenticator: Authenticator[User] = ???
Where can I find an example implementation of such an authenticator? I already have the logic for authenticating a user given a username and password, but what I can't figure out is how to get a username/password (or a token containing both) from the HTTP request (or RequestContext).
Authenticator is just a function UserCredentials => Option[T]. If the UserCredentials are Provided (check with pattern matching), they have a verifySecret(secret) method which you need to call safely, returning a Some (for example Some(user)) on success, like:
def myAuthenticator: Authenticator[User] = {
  case p @ Provided(username) =>
    if (p.verifySecret(myGetSecret(username))) Some(username) else None
  case Missing => None // you can throw an exception here to get a customized response, otherwise it will be the regular `CredentialsMissing` message
}
myGetSecret is your custom function which takes the username and returns your secret (e.g. a password), possibly fetching it from a database. verifySecret securely compares (to avoid timing attacks) the provided password with the password from myGetSecret. Generally, the "secret" is any hidden information (like a hash of the credentials or a token), but in the case of basic authentication it is just the plain password extracted from the HTTP headers.
If you need a more customized approach, use authenticateOrRejectWithChallenge, which gets HttpCredentials as input, so you can extract the provided password from there.
More info about authentication is in the scaladocs.
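For illustration, a sketch of how myGetSecret and the authenticator could be wired into the route from the question (the in-memory Map stands in for a real user store, imports are omitted, and the names follow the same older Authenticator/UserCredentials API used in the answer):
// Hypothetical credential store; replace with a database lookup.
val userSecrets: Map[String, String] = Map("alice" -> "wonderland")

// Returns the stored secret for a username (empty string for unknown users,
// so verifySecret simply fails for them).
def myGetSecret(username: String): String =
  userSecrets.getOrElse(username, "")

def myAuthenticator: Authenticator[String] = {
  case p @ Provided(username) if p.verifySecret(myGetSecret(username)) => Some(username)
  case _ => None
}

val route =
  path("account") {
    authenticateBasic(realm = "some realm", myAuthenticator) { user =>
      get {
        complete(s"Hello, $user")
      }
    }
  }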

Parse request body with custom content-type to json in Play Framework

I'm using Play to proxy my API calls from the UI. For example, POST:
def post(url: String) = Action { implicit request =>
  Async {
    WS.url(proxyUrl + request.uri)
      .withQueryString(request.queryString.mapValues(_.head).toSeq: _*)
      .withHeaders(request.headers.toMap.mapValues(_.head).toSeq: _*)
      .post(request.body.asJson)
      .map(response => Ok(response.body))
  }
}
but this can only handle the "application/json" and "text/json" content types. Now I want to make requests with a custom content type, "application/vnd.MyCustomType-v1+json;charset=utf-8", and of course that doesn't work with the current implementation. I have tried different solutions, but nothing seems to work. Any ideas?
I'm using Play 2.1.
The source for the json body parser looks like this:
def json(maxLength: Int): BodyParser[JsValue] = when(
  _.contentType.exists(m => m.equalsIgnoreCase("text/json") || m.equalsIgnoreCase("application/json")),
  tolerantJson(maxLength),
  createBadResult("Expecting text/json or application/json body")
)
tolerantJson is itself a body parser that does the JSON parsing without checking the Content-Type header, so you should be able to use it to parse your request instead of parse.json.
If you want to go further and have a parser that checks for your specific content type, you could use
when(
  _.contentType.exists(m => m.equalsIgnoreCase(expectedContentType)),
  tolerantJson(maxLength),
  createBadResult("Wrong content type")
)
to create your own parser.
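To make that concrete, a sketch of the proxy action using tolerantJson directly, based on the question's Play 2.1 code (proxyUrl is a placeholder; note that createBadResult in the snippet above is private to Play, so a content-type-checking variant would need to supply its own bad-request result):
import play.api.libs.concurrent.Execution.Implicits._
import play.api.libs.ws.WS
import play.api.mvc._

object Proxy extends Controller {
  val proxyUrl = "http://backend.example.com" // placeholder

  // tolerantJson parses the body as JSON regardless of the Content-Type header,
  // so "application/vnd.MyCustomType-v1+json;charset=utf-8" is accepted too.
  def post(url: String) = Action(parse.tolerantJson) { implicit request =>
    Async {
      WS.url(proxyUrl + request.uri)
        .withQueryString(request.queryString.mapValues(_.head).toSeq: _*)
        .withHeaders(request.headers.toMap.mapValues(_.head).toSeq: _*)
        .post(request.body) // request.body is already a JsValue here
        .map(response => Ok(response.body))
    }
  }
}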