How do I turn a cats IO into an Effect using http4s - scala

I've got some code that returns an IO, but I need an Effect in http4s.
import cats.effect.{Effect, IO}

class Service[F[_]: Effect] extends Http4sDsl[F] {
  val service: HttpService[F] = {
    HttpService[F] {
      case GET -> Root =>
        val data: IO[String] = getData()
        data.map(d => Ok(d))
    }
  }
}
gives
[error] found : cats.effect.IO[F[org.http4s.Response[F]]]
[error] required: F[org.http4s.Response[F]]
[error] data.map(d => Ok(d))
[error] ^

One way we can get around using a concrete IO[A] is using LiftIO[F]:
class Service[F[_]: Effect] extends Http4sDsl[F] {
  val service: HttpService[F] = {
    HttpService[F] {
      case GET -> Root =>
        getData().liftIO[F].flatMap(Ok(_))
    }
  }
}
LiftIO will lift IO[A] => F[A]. If we instead lift data.map(d => Ok(d)), which is an IO[F[Response[F]]], this yields us an F[F[Response[F]]]; in order to get things to compile in that case, we flatten on F, since it has a Monad (or FlatMap) instance in cats due to our Effect context bound.
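For completeness, a sketch of that flatten-based variant (same assumptions as above: getData() returns an IO[String] and the cats syntax imports are in scope):

class Service[F[_]: Effect] extends Http4sDsl[F] {
  val service: HttpService[F] = {
    HttpService[F] {
      case GET -> Root =>
        val data: IO[String] = getData()
        // IO[F[Response[F]]] is lifted to F[F[Response[F]]], then flattened
        data.map(d => Ok(d)).liftIO[F].flatten
    }
  }
}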
If we want more detail, this is the -Xprint:typer result:
cats.implicits.catsSyntaxFlatten[F, org.http4s.Response[F]](
cats.effect.LiftIO.apply[F](Service.this.evidence$1)
.liftIO[F[org.http4s.Response[F]]](
data.map[F[org.http4s.Response[F]]](
((d: String) => Service.this.http4sOkSyntax(Service.this.Ok)
.apply[String](d)(Service.this.evidence$1,
Service.this.stringEncoder[F](
Service.this.evidence$1, Service.this.stringEncoder$default$2[F]))))))(Service.this.evidence$1).flatten(Service.this.evidence$1)
And at the end of the world when you want to give a concrete effect, for example Service[IO], we get:
val serv: Service[cats.effect.IO] =
  new Service[cats.effect.IO]()(effect.this.IO.ioConcurrentEffect)
Where ioConcurrentEffect is the Effect[IO] instance.
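For completeness, a sketch of that "end of the world" wiring, assuming http4s 0.18.x with fs2's StreamApp (the versions where HttpService and Effect exist); Main, the port, and the host are illustrative:

import cats.effect.IO
import fs2.{Stream, StreamApp}
import fs2.StreamApp.ExitCode
import org.http4s.server.blaze.BlazeBuilder

object Main extends StreamApp[IO] {
  // Effect[IO] (ioConcurrentEffect) is resolved implicitly here, exactly as shown above
  override def stream(args: List[String], requestShutdown: IO[Unit]): Stream[IO, ExitCode] =
    BlazeBuilder[IO]
      .bindHttp(8080, "localhost")
      .mountService(new Service[IO]().service, "/")
      .serve
}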
It seems that all the good stuff is defined in the org.http4s.syntax.AllSyntax trait.

Related

Returning a Stream of Entities using a Transactor inside a Resource in Doobie

I'm trying to implement a query that returns the extracted information inside an fs2.Stream. I defined the Jobs algebra:
trait Jobs[F[_]] {
  def all(): fs2.Stream[F, Job]
}
Then, I implemented an interpreter for the algebra:
final class LiveJobs[F[_]: MonadCancelThrow](postgres: Resource[F, Transactor[F]]) extends Jobs[F] {
  override def all(): fs2.Stream[F, Job] = for {
    jobs <- postgres.use { xa =>
      sql"SELECT * FROM jobs".query[Job].stream.transact(xa)
    }
  } yield jobs
}
However, the compiler yells because the types are not aligned:
type mismatch;
[error] found : fs2.Stream[[_]F[_],Job]
[error] required: F[?]
[error] sql"SELECT * FROM jobs".query[Job].stream.transact(xa)
[error] ^
[error] one error found
The Resource.use method needs a function that produces an F[*], not an fs2.Stream[F, Job]. I cannot find anything that lets me convert between the two types or a different way to use the postgres resource.
The following is probably the design you want to follow:
trait Jobs[F[_]] {
  def all: fs2.Stream[F, Job]
}
object Jobs {
  // I am not exactly sure which typeclass you require here, so I will use Async.
  def live[F[_]](implicit ev: Async[F]): Resource[F, Jobs[F]] = {
    val transactor: Resource[F, Transactor[F]] = ... // Whatever you already have here.
    transactor.map(xa => new LiveJobs(xa))
  }
}
private[pckg] final class LiveJobs[F[_]](xa: Transactor[F])(implicit ev: MonadCancelThrow[F]) extends Jobs[F] {
  override final val all: fs2.Stream[F, Job] =
    sql"SELECT * FROM jobs".query[Job].stream.transact(xa)
}
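For illustration, a hypothetical way to use that design with F fixed to IO (assuming cats-effect 3 and an existing doobie Read[Job] instance):

import cats.effect.{IO, IOApp}

object JobsApp extends IOApp.Simple {
  def run: IO[Unit] =
    Jobs.live[IO].use { jobs =>
      // The stream is compiled inside `use`, while the transactor is still open
      jobs.all.compile.toList.flatMap(list => IO.println(s"Fetched ${list.size} jobs"))
    }
}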
Also, my personal advice: stick to concrete IO while learning, and maybe even after.
The whole F[_] thing will just cause more trouble than it's worth at first.

Akka Http and Circe, how to serialize entities wrapped in the tagless final approach (Cats Effect)

I'm building a toy project to learn Scala 3 and I'm stuck on one problem. First of all, I'm following the tagless-final approach using cats-effect. The approach is working as expected except for entity serialization: when I try to create a route using akka-http I have the following problem:
def routes: Route = pathPrefix("security") {
  (path("auth") & post) {
    entity(as[LoginUserByCredentialsCommand]) { (command: LoginUserByCredentialsCommand) =>
      complete {
        login(command)
      }
    }
  }
}
Found:    F[
  Either[com.moralyzr.magickr.security.core.errors.AuthError,
    com.moralyzr.magickr.security.core.types.TokenType.Token
  ]
]
Required: akka.http.scaladsl.marshalling.ToResponseMarshallable

where:    F is a type in class SecurityApi with bounds <: [_] =>> Any
From what I understood, akka-http does not know how to serialize the F higher-kinded type. Searching a little bit, I found the following solution: it consists of creating an implicit called Marshallable to show akka-http how to serialize the type. However, when I implement it I get a StackOverflowError :(
import akka.http.scaladsl.marshalling.ToResponseMarshaller
import cats.effect.IO

trait Marshallable[F[_]]:
  def marshaller[A: ToResponseMarshaller]: ToResponseMarshaller[F[A]]

object Marshallable:
  implicit def marshaller[F[_], A: ToResponseMarshaller](implicit M: Marshallable[F]): ToResponseMarshaller[F[A]] =
    M.marshaller

  given ioMarshaller: Marshallable[IO] with
    def marshaller[A: ToResponseMarshaller] = implicitly
I'm really stuck right now; does anyone have an idea how I can fix this problem? The complete code can be found here.
Edit: This is the login code
For clarity, here are the class that instantiates the security API and the security API itself:
object Magickr extends IOApp:

  override def run(args: List[String]): IO[ExitCode] =
    val server = for {
      // Actors
      actorsSystem <- ActorsSystemResource[IO]()
      streamMaterializer <- AkkaMaterializerResource[IO](actorsSystem)
      // Configs
      configs <- Resource.eval(MagickrConfigs.makeConfigs[IO]())
      httpConfigs = AkkaHttpConfig[IO](configs)
      databaseConfigs = DatabaseConfig[IO](configs)
      flywayConfigs = FlywayConfig[IO](configs)
      jwtConfig = JwtConfig[IO](configs)
      // Interpreters
      jwtManager = JwtBuilder[IO](jwtConfig)
      authentication = InternalAuthentication[IO](
        passwordValidationAlgebra = new SecurityValidationsInterpreter(),
        jwtManager = jwtManager
      )
      // Database
      _ <- Resource.eval(
        DbMigrations.migrate[IO](flywayConfigs, databaseConfigs)
      )
      transactor <- DatabaseConnection.makeTransactor[IO](databaseConfigs)
      userRepository = UserRepository[IO](transactor)
      // Services
      securityManagement = SecurityManagement[IO](
        findUser = userRepository,
        authentication = authentication
      )
      // Api
      secApi = new SecurityApi[IO](securityManagement)
      routes = pathPrefix("api") {
        secApi.routes()
      }
      akkaHttp <- AkkaHttpResource.makeHttpServer[IO](
        akkaHttpConfig = httpConfigs,
        routes = routes,
        actorSystem = actorsSystem,
        materializer = streamMaterializer
      )
    } yield (actorsSystem)
    return server.useForever
And
class SecurityApi[F[_]: Async](
    private val securityManagement: SecurityManagement[F]
) extends LoginUserByCredentials[F]
    with SecurityProtocols:

  def routes()(using marshaller: Marshallable[F]): Route = pathPrefix("security") {
    (path("auth") & post) {
      entity(as[LoginUserByCredentialsCommand]) { (command: LoginUserByCredentialsCommand) =>
        complete {
          login(command)
        }
      }
    }
  }

  override def login(
      command: LoginUserByCredentialsCommand
  ): F[Either[AuthError, Token]] =
    securityManagement.loginWithCredentials(command = command).value
================= EDIT 2 =========================================
With the insight provided by Luis Miguel, it's now clearer that I need to unwrap the IO into a Future at the Marshaller level, something like this:
def ioToResponseMarshaller[A: ToResponseMarshaller](
    M: Marshallable[IO]
): ToResponseMarshaller[IO[A]] =
  Marshaller.futureMarshaller.compose(M.entity.unsafeToFuture())
However, I have this problem:
Found: cats.effect.unsafe.IORuntime => scala.concurrent.Future[A]
Required: cats.effect.IO[A] => scala.concurrent.Future[A]
I think I'm close! Is there a way to unwrap the IO while keeping the IO type?
I managed to make it work! Thanks to #luismiguel's insight, the problem was that the Akka HTTP Marshaller was not able to deal with the cats-effect IO monad. The solution was an implementation that unwraps the IO monad using unsafeToFuture inside the marshaller; that way I was able to keep the tagless-final style from end to end. Here's the solution:
This implicit fetches the internal marshaller for the type
import akka.http.scaladsl.marshalling.ToResponseMarshaller
import cats.effect.IO

trait Marshallable[F[_]]:
  def marshaller[A: ToResponseMarshaller]: ToResponseMarshaller[F[A]]

object Marshallable:
  implicit def marshaller[F[_], A: ToResponseMarshaller](implicit
      M: Marshallable[F]
  ): ToResponseMarshaller[F[A]] = M.marshaller

  given ioMarshallable: Marshallable[IO] with
    def marshaller[A: ToResponseMarshaller] = CatsEffectsMarshallers.ioMarshaller
This one unwraps the IO monad and flatMaps the marshaller using a future, which akka-http knows how to deal with.
import akka.http.scaladsl.marshalling.{
  LowPriorityToResponseMarshallerImplicits,
  Marshaller,
  ToResponseMarshaller
}
import cats.effect.IO
import cats.effect.unsafe.implicits.global

trait CatsEffectsMarshallers extends LowPriorityToResponseMarshallerImplicits:
  implicit def ioMarshaller[A](implicit
      m: ToResponseMarshaller[A]
  ): ToResponseMarshaller[IO[A]] =
    Marshaller(implicit ec => _.unsafeToFuture().flatMap(m(_)))

object CatsEffectsMarshallers extends CatsEffectsMarshallers
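For illustration, a hypothetical route showing the implicit chain in action: with ioMarshallable in scope, Marshallable.marshaller derives a ToResponseMarshaller[IO[A]] from the ToResponseMarshaller[A] that akka-http already provides, so complete can take an IO directly:

import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
import cats.effect.IO

val pingRoute: Route =
  path("ping") {
    get {
      // Resolved via Marshallable.marshaller -> ioMarshallable -> CatsEffectsMarshallers.ioMarshaller
      complete(IO.pure("pong"))
    }
  }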

scalamock create basic setup and easily change it

I started writing my own MocksSetter to set up some basic mocks before each test, and then in each individual test be able to change one of them or add a new one.
The trait for the spec looks like this:
trait MocksSetter extends MockFactory with BeforeAndAfterEach { self: Suite =>

  type PositionWithMockTuple = (Int, () => CallHandler[Any])

  private var standardMocks: Seq[PositionWithMockTuple] = Seq.empty

  def setupStandardMocks(mocks: Seq[PositionWithMockTuple]) = {
    standardMocks = mocks
  }

  // Replaces the current standard mock at a given position
  def replaceMock(position: Int, newMockedFunc: () => CallHandler[Any]) = {
    val updatedMocks = standardMocks.map {
      case (pos, _) if pos == position => (pos, newMockedFunc)
      case (pos, mf)                   => (pos, mf)
    }
    standardMocks = updatedMocks
  }

  // Puts a new mock at the end
  def insertMock(newMockedFunc: () => CallHandler[Any]) = {
    standardMocks = standardMocks :+ ((standardMocks.last._1 + 1, newMockedFunc))
  }

  override def beforeEach(): Unit = {
    standardMocks.foreach { case (_, mockFunc) =>
      mockFunc().once()
    }
    super.beforeEach()
  }
}
but when I try to pass mocks:
[error] found : org.scalamock.handlers.CallHandler[Boolean]
[error] required: org.scalamock.handlers.CallHandler[Any]
[error] Note: Boolean <: Any, but class CallHandler is invariant in type R.
[error] You may wish to define R as +R instead. (SLS 4.5)
[error] setupStandardMocks(Seq((1, () => someMethodMock(false))))
Is it possible to achieve this goal?
My idea is simple: before all tests, register the basic mocks, which are executed before each test (by using .once()). Of course different specs may differ from each other, and when writing tests a lot of code is repeated in every test. If there are, for example, 10 mocked methods, then for each test I need to write mocks for every method and change only what is needed.
It would be nice to have a basic setup and change it according to the specific test.
OK, this can be solved in an easier way.
In the spec we can create an abstract class with parameters.
We run a test by doing:
"x y z" in new StandardMocks(onlyOneParamChanged = "xyz") {
...
}
Of course the class parameters will have default values, which we can override easily.
Inside the class, do the mocking with those parameters, and that's it.
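A minimal sketch of that pattern (the service trait and method names here are hypothetical): the abstract class lives inside a spec that already mixes in MockFactory, registers the default expectations, and exposes parameters that a single test can override:

import org.scalamock.scalatest.MockFactory
import org.scalatest.wordspec.AnyWordSpec

trait SomeService {
  def someMethod(flag: Boolean): String
}

class MySpec extends AnyWordSpec with MockFactory {

  // Default mock setup; each test can tweak it through the constructor parameters
  abstract class StandardMocks(onlyOneParamChanged: String = "default") {
    val service: SomeService = mock[SomeService]
    (service.someMethod _).expects(*).returning(onlyOneParamChanged).once()
  }

  "x y z" in new StandardMocks(onlyOneParamChanged = "xyz") {
    assert(service.someMethod(true) == "xyz")
  }
}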

required: play.api.mvc.Request[?] => play.api.mvc.Result

I am migrating to Play 2.6 and have the following API wrapper functions that used to work:
trait API {
  self: Controller =>

  def api(businessLogic: Request[AnyContent] => Any): Action[AnyContent] =
    apiWithBody(parse.anyContent)(businessLogic)

  def apiWithBody[A](bodyParser: BodyParser[A])(businessLogic: Request[A] => Any): Action[A] =
    Action(bodyParser) { implicit request =>
      val apiResult = businessLogic(request)
      val start = new java.util.Date().getTime
      val actionDuration = (new java.util.Date().getTime - start)
      val response = resultFrom(apiResult, request, actionDuration) // Returns a Result
      response
    }
}
Called by Controller functions like:
object Accounts extends Controller with API {
  def all = superUser { implicit principal =>
    api { request =>
      models.Account.all
    }
  }
}
Where superUser requires the principal (user) to be of type "admin".
And get the following compiler error:
[error] type mismatch;
[error] found : play.api.mvc.Action[play.api.mvc.AnyContent]
[error] required: play.api.mvc.Request[?] => play.api.mvc.Result
[error] api {
[error] ^
I'm building with sbt 1.1.5 and Scala 2.11.8.
I am guessing the [?] means the compiler doesn't know what type is required, but I don't understand what is wrong. I have searched for this issue but have not found a specific answer to this problem.
In addition I'm getting an error:
[error] could not find implicit value for parameter parser: play.api.mvc.BodyParser[Any]
[error] def all = superUser {
[error] ^
that I posted as a separate issue (see could not find implicit value for parameter parser: play.api.mvc.BodyParser[Any]) but might be relevant here?
def superUser[A](f: => Principal => Request[A] => Result)(implicit parser: BodyParser[A]): SecureAction[A] = {
  _superUser { user =>
    implicit val principal = data.Principal(user)
    Action(parser)(request => f(principal)(request))
  }
}

private def _superUser[A](action: String => Action[A]) = {
  play.api.mvc.Security.Authenticated(getSuperUser, onUnauthorized)(action)
}
Any help would be appreciated.
Sorry, I'm a little bit confused about the architecture here:
Where is the API call? Is it within the model? Are you calling outside the current Play app? Why don't you use the Future API for it? Then you could recover it, which helps with logging and error handling.
The all method gets the request, and then it does not return an HTTP response. Why don't you pass on just what you need from the request (e.g., the Cookie)?
I think the answer to your question is to use your action composition with action. Something like:
def myMethod(queryParam: String) = myDefinedAction compose Action { ??? }

// In case you want to use the Future API, then you need to be async
def myMethod(queryParam: String) = (myDefinedAction compose Action).async { implicit request =>
  apiCall.map {
    case _ => Ok("good request")
  }.recover { case _ => BadRequest }
}

Scala Future strange compile error

The code below is a simplified version of the real code. We "inherited" the domain model (case object FutTest and case class FutTest), which we can't modify. The actual domain models are served from a database, so I believe the Future approach is valid, but it causes problems which I don't understand.
import org.scalatest.FunSpec
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

case object FutTest {
  def create(sz: Int) = { FutTest(sz) }
}

case class FutTest(size: Int)

class FutureTest extends FunSpec {
  def one(v: Int): Future[FutTest] = {
    Future { FutTest.create(v) }
  }

  def two(t: FutTest) = {
    Future { FutTest.create(t.size) }
  }

  def compileError1: Future[FutTest] = {
    one(10).map(f => two(f))
  }

  def compileError2: Future[FutTest] = {
    for { o <- one(10) } yield (two(o))
  }
}
The error messages:
[INFO] Using incremental compilation
[INFO] Compiling 7 Scala sources and 5 .. target/test-classes...
[ERROR] domain.FutureTest.scala:25: type mismatch;
found : scala.concurrent.Future[domain.FutTest]
required: domain.FutTest
[ERROR] one(10).map(f => two(f))
[ERROR] ^
[ERROR] domain/FutureTest.scala:29: type mismatch;
found : scala.concurrent.Future[domain.FutTest]
required: domain.FutTest
[ERROR] for { o <- one(10) } yield (two(o))
I tried the above code with a plain Int instead of FutTest and all is fine. Why is the compiler complaining, and how can we solve this without touching the existing domain?
flatMap is what you want.
one(10).flatMap(f => two(f))
or
one(10).flatMap(two)
Using for comprehension,
for { o <- one(10); t <- two(o) } yield t
one() returns a Future and two() also returns a Future, so you need to flatMap instead of map. When you map to two(), your result is Future[Future[FutTest]] and it needs to be flattened.
Doing
one(10).flatMap(f => two(f))
should do the trick.
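If you prefer to keep map, the nested future can also be flattened explicitly; a small sketch, assuming Scala 2.12+ where Future.flatten is available:

def alsoCompiles: Future[FutTest] =
  one(10).map(two).flatten // Future[Future[FutTest]] flattened to Future[FutTest]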