Use config value from MainActor class in other class - scala

I'm using Akka in my project and pull config values in my MainActor class. I want to use the commit, author, tag, and buildId config values inside another file in order to build an Avro response, but I can't simply make MainActor the parent class of my Avro response interface. Is there a workaround?
My MainActor class
class MainActor extends Actor with ActorLogging with ConfigComponent with ExecutionContextComponent
  with DatabaseComponent with DefaultCustomerProfiles {

  override lazy val config: Config = context.system.settings.config
  override implicit lazy val executionContext: ExecutionContext = context.dispatcher
  override val db: Database = Database.fromConfig(config.getConfig("com.ojolabs.customer-profile.database"))

  private val avroServer = context.watch {
    val binding = ReflectiveBinding[CustomerService.Async](customerProfileManager)
    val host = config.getString("com.ojolabs.customer-profile.avro.bindAddress")
    val port = config.getInt("com.ojolabs.customer-profile.avro.port")
    context.actorOf(AvroServer.socketServer(binding, host, port))
  }

  val commit = config.getString("com.ojolabs.customer-profile.version.commit")
  val author = config.getString("com.ojolabs.customer-profile.version.author")
  val tag = config.getString("com.ojolabs.customer-profile.version.tag")
  val buildId = config.getString("com.ojolabs.customer-profile.version.buildId")

  override def postStop(): Unit = {
    db.close()
    super.postStop()
  }

  // This toplevel actor does nothing by default
  override def receive: Receive = Actor.emptyBehavior
}
The class I want to pull values into
trait DefaultCustomerProfiles extends CustomerProfilesComponent {
  self: DatabaseComponent with ExecutionContextComponent =>

  lazy val customerProfileManager = new CustomerService.Async {
    import db.api._

    override def customerById(id: String): Future[AvroCustomer] = {
      db.run(Customers.byId(UUID.fromString(id)).result.headOption)
        .map(_.map(AvroConverters.toAvroCustomer).orNull)
    }

    override def customerByPhone(phoneNumber: String): Future[AvroCustomer] = {
      db.run(Customers.byPhoneNumber(phoneNumber).result.headOption)
        .map(_.map(AvroConverters.toAvroCustomer).orNull)
    }

    override def findOrCreate(phoneNumber: String, creationReason: String): Future[AvroCustomer] = {
      db.run(Customers.findOrCreate(phoneNumber, creationReason)).map(AvroConverters.toAvroCustomer)
    }

    override def createEvent(customerId: String, eventType: String, version: Double, data: String, metadata: String): Future[AvroCustomerEvent] = {
      val action = CustomerEvents.create(
        UUID.fromString(customerId),
        eventType,
        Json.parse(data),
        version,
        Json.parse(metadata)
      )
      db.run(action).map(AvroConverters.toAvroEvent)
    }

    override def getVersion(): Version = {
      // This is where I need the commit, author, tag, and buildId config values
      ???
    }
  }
}

Create another trait that defines the values, and mix it in with your MainActor and DefaultCustomerProfiles traits.
trait AvroConfig {
  self: ConfigComponent =>

  val commit = config.getString("com.ojolabs.customer-profile.version.commit")
  val author = config.getString("com.ojolabs.customer-profile.version.author")
  val tag = config.getString("com.ojolabs.customer-profile.version.tag")
  val buildId = config.getString("com.ojolabs.customer-profile.version.buildId")
}
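For illustration, a sketch of how the mixin could then look on both sides (names follow the question, adjust as needed):
class MainActor extends Actor with ActorLogging with ConfigComponent with ExecutionContextComponent
  with DatabaseComponent with AvroConfig with DefaultCustomerProfiles {
  // ... as before
}

trait DefaultCustomerProfiles extends CustomerProfilesComponent {
  self: DatabaseComponent with ExecutionContextComponent with AvroConfig =>
  // commit, author, tag and buildId are now in scope here, e.g. for getVersion()
}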

I think what you really need is an Akka Extension, which enables you to add features, like custom config, to your Akka system in an elegant way. This way, you would have access to those config values within all your actors from the actor system. As an example, check out this nice blog post.
As for the other class from your example, you should pass the values in as parameters - it shouldn't be concerned with retrieving and parsing the config itself.
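For illustration, a minimal sketch of such an extension, assuming the config paths from the question (the VersionConfig name is made up):
import akka.actor.{ExtendedActorSystem, Extension, ExtensionId, ExtensionIdProvider}

class VersionConfig(system: ExtendedActorSystem) extends Extension {
  private val config = system.settings.config
  val commit: String = config.getString("com.ojolabs.customer-profile.version.commit")
  val author: String = config.getString("com.ojolabs.customer-profile.version.author")
  val tag: String = config.getString("com.ojolabs.customer-profile.version.tag")
  val buildId: String = config.getString("com.ojolabs.customer-profile.version.buildId")
}

object VersionConfig extends ExtensionId[VersionConfig] with ExtensionIdProvider {
  override def lookup = VersionConfig
  override def createExtension(system: ExtendedActorSystem) = new VersionConfig(system)
}
Any actor can then read VersionConfig(context.system).commit, and other classes can take the values (or the extension) as constructor parameters.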

Related

Starting two Scala Finagle ListeningServers at once

I need to start two Finagle ListeningServers at once because I have to implement two different traits that extend ListeningServer.
/* A simplified example to give you an idea of what I'm trying to do */
trait FirstListeningServer extends ListeningServer {
  def buildFirstServer() = ???

  def main(): Unit = {
    val server = buildFirstServer()
    closeOnExit(server)
    Await.ready(server)
  }
}

trait SecondListeningServer extends ListeningServer {
  def buildSecondServer() = ???

  def main(): Unit = {
    val server = buildSecondServer()
    closeOnExit(server)
    Await.ready(server)
  }
}
Basically, each ListeningServer is a com.twitter.util.Awaitable and whenever I have to instantiate a new ListeningServer I use Await.ready(myListeningServer).
class MyServer extends FirstListeningServer with SecondListeningServer {
  override def main(): Unit = {
    val firstServer = buildFirstServer()
    closeOnExit(firstServer)
    val secondServer = buildSecondServer()
    closeOnExit(secondServer)
    Await.all(firstServer, secondServer)
  }
}
Now I'm not sure if using Await.all() is the right choice in order to start several ListeningServers concurrently. I would have used com.twitter.util.Future.collect() but I have two Awaitables.
The scaladoc of Await.all only says:
def all(awaitables: Awaitable[_]*): Unit
Returns after all actions have completed.
I'm using Scala 2.12 and Twitter 20.3.0.

In Play 2.6, how to write a WS Client filter that forwards headers from a parent request?

If I have a controller named HomeController that receives a request like GET /foo with a header X-Foo: Bar, I would like to create a WS client filter that will read the RequestHeader in the context and copy the header value to the outgoing WS request.
Example Controller:
import javax.inject.{Inject, Singleton}
import play.api.libs.ws.{StandaloneWSRequest, WSClient, WSRequest, WSRequestExecutor, WSRequestFilter}
import play.api.mvc._
import scala.concurrent.ExecutionContext

@Singleton
class HomeController @Inject()(cc: ControllerComponents,
                               myWsClient: MyWSClient)
                              (implicit executionContext: ExecutionContext)
  extends AbstractController(cc) {

  def index = Action.async {
    myWsClient.url("http://www.example.com")
      .get()
      .map(res => Ok(s"${res.status} ${res.statusText}"))(executionContext)
  }
}
The wrapper around WSClient that introduces the filter:
@Singleton
class MyWSClient @Inject()(delegate: WSClient, fooBarFilter: FooBarFilter) extends WSClient {
  override def underlying[T]: T = delegate.underlying.asInstanceOf[T]

  override def url(url: String): WSRequest = {
    delegate.url(url)
      .withRequestFilter(fooBarFilter)
  }

  override def close(): Unit = delegate.close()
}
And finally the WS filter itself:
@Singleton
class FooBarFilter extends WSRequestFilter {
  override def apply(executor: WSRequestExecutor): WSRequestExecutor = {
    (request: StandaloneWSRequest) => {
      // addHttpHeaders returns a new request, so pass the result on to the executor
      executor.apply(request.addHttpHeaders(("X-Foo", "<...>"))) // INSERT CORRECT VALUE HERE!
    }
  }
}
In the end, the expectation is that the request GET http://www.example.com contains the header X-Foo: Bar.
The special requirements that make this more interesting are:
You can modify the MyWSClient class.
You can modify the FooBarFilter class.
You can create HTTP controller filters (play.api.mvc.(Essential)Filter) if it helps.
You can create other classes/objects/etc.
You cannot modify the controller (because in our situation, we can't expect all existing controllers to be modified).
The solution should work even if there's a "service" layer between the controller and the WSClient invocation, and shouldn't involve passing down objects everywhere.
The solution can alter other Play/Akka mechanisms, like the default Dispatcher.
I haven't tried to put it into actual code and test if this works, but here is an idea: it looks like since Play 2.1 Http.Context is propagated even across async calls. And there is Http.Context._requestHeader. So what you can try to do is to change MyWSClient and FooBarFilter like this:
@Singleton
class MyWSClient @Inject()(delegate: WSClient) extends WSClient {
  override def underlying[T]: T = delegate.underlying.asInstanceOf[T]

  override def url(url: String): WSRequest = {
    val fooHeaderOption = Http.Context.current()._requestHeader().headers.get(FooHeaderFilter.fooHeaderName)
    val baseRequest = delegate.url(url)
    if (fooHeaderOption.isDefined)
      baseRequest.withRequestFilter(new FooHeaderFilter(fooHeaderOption.get))
    else
      baseRequest
  }

  override def close(): Unit = delegate.close()

  class FooHeaderFilter(headerValue: String) extends WSRequestFilter {
    import FooHeaderFilter._

    override def apply(executor: WSRequestExecutor): WSRequestExecutor = {
      (request: StandaloneWSRequest) => {
        // addHttpHeaders returns a new request, so pass the result on to the executor
        executor.apply(request.addHttpHeaders((fooHeaderName, headerValue)))
      }
    }
  }

  object FooHeaderFilter {
    val fooHeaderName = "X-Foo"
  }
}
The idea is simple: extract the header from the Http.Context.current() when WSRequest is created and attach it to the request using a WSRequestFilter
Update: make it work in Scala API
As was pointed out in the comments, this approach doesn't work with the Scala API because Http.Context is not initialized and is not passed between threads. To make it work, higher-level magic is required. Namely you need:
Easy: A Filter that will init Http.Context for Scala-handled requests
Hard: Override ExecutorServiceConfigurator for Akka's default dispatcher to create a custom ExecutorService that will pass Http.Context between thread switches.
The filter is trivial:
import javax.inject.{Inject, Singleton}
import play.api.mvc.{EssentialAction, EssentialFilter}
import play.mvc._
import scala.concurrent.ExecutionContext

@Singleton
class HttpContextFilter @Inject()(implicit ec: ExecutionContext) extends EssentialFilter {
  override def apply(next: EssentialAction) = EssentialAction { request =>
    Http.Context.current.set(new Http.Context(new Http.RequestImpl(request), null))
    next(request)
  }
}
And then add it to play.filters.enabled in application.conf.
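For example, assuming the filter ends up in the "so" package used below:
play.filters.enabled += "so.HttpContextFilter"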
The hard part is something like this:
import java.util.concurrent.{AbstractExecutorService, ExecutorService, ThreadFactory, TimeUnit}

import akka.dispatch.{DispatcherPrerequisites, ExecutorServiceConfigurator, ExecutorServiceFactory, ForkJoinExecutorConfigurator}
import com.typesafe.config.Config
import play.mvc.Http

class HttpContextWrapperExecutorService(val delegateEc: ExecutorService) extends AbstractExecutorService {
  override def isTerminated = delegateEc.isTerminated
  override def awaitTermination(timeout: Long, unit: TimeUnit) = delegateEc.awaitTermination(timeout, unit)
  override def shutdownNow() = delegateEc.shutdownNow()
  override def shutdown() = delegateEc.shutdown()
  override def isShutdown = delegateEc.isShutdown

  override def execute(command: Runnable) = {
    val newContext = Http.Context.current.get()
    delegateEc.execute(() => {
      val oldContext = Http.Context.current.get() // might be null!
      Http.Context.current.set(newContext)
      try {
        command.run()
      }
      finally {
        Http.Context.current.set(oldContext)
      }
    })
  }
}

class HttpContextExecutorServiceConfigurator(config: Config, prerequisites: DispatcherPrerequisites) extends ExecutorServiceConfigurator(config, prerequisites) {
  val delegateProvider = new ForkJoinExecutorConfigurator(config.getConfig("fork-join-executor"), prerequisites)

  override def createExecutorServiceFactory(id: String, threadFactory: ThreadFactory): ExecutorServiceFactory = new ExecutorServiceFactory {
    val delegateFactory = delegateProvider.createExecutorServiceFactory(id, threadFactory)
    override def createExecutorService: ExecutorService = new HttpContextWrapperExecutorService(delegateFactory.createExecutorService)
  }
}
and register it using:
akka.actor.default-dispatcher.executor = "so.HttpContextExecutorServiceConfigurator"
Don't forget to replace "so" with your real package. Also, if you use more custom executors or ExecutionContexts, you should patch (wrap) them as well to pass Http.Context along the asynchronous calls.
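A sketch of such a wrapper for a plain ExecutionContext, following the same pattern as the executor service above:
import play.mvc.Http
import scala.concurrent.ExecutionContext

class HttpContextWrapperExecutionContext(delegate: ExecutionContext) extends ExecutionContext {
  override def execute(runnable: Runnable): Unit = {
    val capturedContext = Http.Context.current.get()
    delegate.execute(() => {
      val oldContext = Http.Context.current.get()
      Http.Context.current.set(capturedContext)
      try runnable.run()
      finally Http.Context.current.set(oldContext)
    })
  }

  override def reportFailure(cause: Throwable): Unit = delegate.reportFailure(cause)
}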

How to Dependency Inject database in Play 2.5

I'm migrating from Play 2.3 to 2.5.
Originally I have a "DAOFactory" object:
object DAOFactory {
  def categoryDAO: CategoryDAO = AnormCategoryDAO
  def itemDAO: ItemDAO = AnormItemDAO
  def bidDAO: BidDAO = AnormBidDAO
  def userDAO: UserDAO = AnormUserDAO
  def feedStatsDAO: FeedStatsDAO = AnormFeedStatsDAO
}
Let's take "AnormCategoryDAO" as an example; I have to change the object into a class.
object AnormCategoryDAO extends CategoryDAO {
  val category = {
    int("id") ~ str("display_name") ~ str("url_name") map {
      case id~displayName~urlName => Category(id, displayName, urlName)
    }
  }

  def create(displayName: String, urlName: String) = DB.withConnection { implicit c =>
    SQL("INSERT INTO category(display_name, url_name) VALUES({displayName}, {urlName})").on(
      'displayName -> displayName, 'urlName -> urlName).executeUpdate()
  }

  def findById(id: Int): Option[Category] = DB.withConnection { implicit c =>
    SQL("SELECT * FROM category WHERE id = {id}").on('id -> id).as(category singleOpt)
  }

  def findByName(urlName: String): Option[Category] = DB.withConnection { implicit c =>
    SQL("SELECT * FROM category WHERE url_name = {urlName}").on('urlName -> urlName).as(category singleOpt)
  }

  def all(): List[Category] = DB.withConnection { implicit c =>
    SQL("SELECT * FROM category ORDER BY display_name").as(category *)
  }
}
So I changed the object to a class annotated with @Singleton as below, and I changed "DB.withConnection" to "db.withConnection":
@Singleton
class AnormCategoryDAO @Inject()(db: Database) extends CategoryDAO {
  val category = {
    int("id") ~ str("display_name") ~ str("url_name") map {
      case id~displayName~urlName => Category(id, displayName, urlName)
    }
  }
  ...
Now, "AnormCategoryDAO" is a Class. So I need to figure out a way to instantiate it with a default database.
But I don't know how to instantiate it.
object DAOFactory {
  //def categoryDAO: CategoryDAO = AnormCategoryDAO
  def userDAO: UserDAO = AnormUserDAO
  def itemDAO: ItemDAO = AnormItemDAO
}
The question is, how do I inject the database and instantiate it?
I don't like to use Guice or similar for DI. With compile-time DI I can achieve that by using something like:
import play.api.db.slick.{DbName, SlickComponents}

trait TablesComponents extends BaseComponent with SlickComponents {
  lazy val dbConf = api.dbConfig[JdbcProfile](DbName("default"))
  lazy val myTable = new MyTable(dbConf.db)
  lazy val otherTable = new OtherTable(dbConf.db)
}
You either have the dependency that is to be injected ready, in which case you may call new AnormCategoryDAO(myDb) directly, or you inject the AnormCategoryDAO wherever it is required (this could mean that dependency injection propagates all the way to the controllers, which are instantiated by Play).
For example:
class CategoryService @Inject()(categoryDao: CategoryDAO) {
  def findAll() = categoryDao.findAll()
}
Note that in this example, I used the abstract type CategoryDAO to refer to the categoryDao. For this, you'll have to tell the dependency injection framework (typically Guice) which concrete class it should inject (a binding). Alternatively, you could depend on AnormCategoryDAO directly.
How you can define custom bindings is documented here: https://www.playframework.com/documentation/2.5.x/ScalaDependencyInjection
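For illustration, a minimal binding module could look like this (a sketch; it still has to be registered, e.g. via play.modules.enabled, unless it is a class named Module in the root package):
import com.google.inject.AbstractModule

class DAOModule extends AbstractModule {
  override def configure(): Unit = {
    bind(classOf[CategoryDAO]).to(classOf[AnormCategoryDAO])
  }
}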
Note that there is an alternative approach to dependency injection named compile time: https://www.playframework.com/documentation/2.5.x/ScalaCompileTimeDependencyInjection
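And since compile-time DI was mentioned above, a rough sketch of wiring the DAO that way, assuming Play's DBComponents/HikariCPComponents are mixed into your ApplicationLoader components:
import play.api.db.{DBComponents, HikariCPComponents}

trait DAOComponents extends DBComponents with HikariCPComponents {
  lazy val categoryDAO: CategoryDAO = new AnormCategoryDAO(dbApi.database("default"))
}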

Asynchronous Iterable over remote data

There is some data that I have pulled from a remote API, for which I use a Future-style interface. The data is structured as a linked-list. A relevant example data container is shown below.
case class Data(information: Int) {
  def hasNext: Boolean = ??? // Implemented
  def next: Future[Data] = ??? // Implemented
}
Now I'm interested in adding some functionality to the data class, such as map, foreach, reduce, etc. To do so I want to implement some form of IterableLike such that it inherits these methods.
Given below is the trait that Data may extend, such that it gets this property.
trait AsyncIterable[+T]
  extends IterableLike[Future[T], AsyncIterable[T]]
{
  def hasNext: Boolean
  def next: Future[T]

  // How to implement?
  override def iterator: Iterator[Future[T]] = ???
  override protected[this] def newBuilder: mutable.Builder[Future[T], AsyncIterable[T]] = ???
  override def seq: TraversableOnce[Future[T]] = ???
}
It should be a non-blocking implementation, which when acted on, starts requesting the next data from the remote data source.
It is then possible to do cool stuff such as
case class Data(information: Int) extends AsyncIterable[Data]
val data = Data(1) // And more, of course
// Asynchronously print all the information.
data.foreach(data => println(data.information))
It is also acceptable for the interface to be different. But the result should in some way represent asynchronous iteration over the collection. Preferably in a way that is familiar to developers, as it will be part of an (open source) library.
In production I would use one of the following (a small Akka Streams sketch is shown after the list):
Akka Streams
Reactive Extensions
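For example, a minimal Akka Streams sketch, assuming the Data class from the question (Source.unfoldAsync walks the linked list without blocking):
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import scala.concurrent.Future

implicit val system: ActorSystem = ActorSystem("example")
implicit val mat: ActorMaterializer = ActorMaterializer()
import system.dispatcher

// Emit the current element, then asynchronously fetch the next one while hasNext is true.
val dataSource = Source.unfoldAsync(Option(Data(1))) {
  case Some(d) =>
    val nextState = if (d.hasNext) d.next.map(Some(_)) else Future.successful(None)
    nextState.map(ns => Some((ns, d)))
  case None => Future.successful(None)
}

dataSource.runForeach(d => println(d.information))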
For private tests I would implement something similar to the following.
(Explanations are below.)
I have modified your Data a little bit:
abstract class AsyncIterator[T] extends Iterator[Future[T]] {
  def hasNext: Boolean
  def next(): Future[T]
}
For it we can implement this Iterable:
class AsyncIterable[T](sourceIterator: AsyncIterator[T])
  extends IterableLike[Future[T], AsyncIterable[T]]
{
  private def stream(): Stream[Future[T]] =
    if (sourceIterator.hasNext) { sourceIterator.next #:: stream() } else { Stream.empty }

  val asStream = stream()

  override def iterator = asStream.iterator
  override def seq = asStream.seq
  override protected[this] def newBuilder = throw new UnsupportedOperationException()
}
And you can see it in action using the following code:
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

object Example extends App {
  val source = "Hello World!"

  val iterator1 = new DelayedIterator[Char](100L, source.toCharArray)
  new AsyncIterable(iterator1).foreach(_.foreach(print)) //prints 1 char per 100 ms

  pause(2000L)

  val iterator2 = new DelayedIterator[String](100L, source.toCharArray.map(_.toString))
  new AsyncIterable(iterator2).reduceLeft((fl: Future[String], fr) =>
    for (l <- fl; r <- fr) yield { println(s"$l+$r"); l + r }) //prints 1 line per 100 ms

  pause(2000L)

  def pause(duration: Long) = { println("->"); Thread.sleep(duration); println("\n<-") }
}
class DelayedIterator[T](delay: Long, data: Seq[T]) extends AsyncIterator[T] {
  private val dataIterator = data.iterator
  private var nextTime = System.currentTimeMillis() + delay

  override def hasNext = dataIterator.hasNext

  override def next = {
    val thisTime = math.max(System.currentTimeMillis(), nextTime)
    val thisValue = dataIterator.next()
    nextTime = thisTime + delay

    Future {
      val now = System.currentTimeMillis()
      if (thisTime > now) Thread.sleep(thisTime - now) //Your implementation will be better
      thisValue
    }
  }
}
Explanation
AsyncIterable uses Stream because it's calculated lazily and it's simple.
Pros:
simplicity
multiple calls to the iterator and seq methods return the same iterable with all items.
Cons:
could lead to memory overflow because the stream keeps all previously obtained values.
the first value is fetched eagerly during creation of the AsyncIterable.
DelayedIterator is a very simplistic implementation of AsyncIterator, don't blame me for quick and dirty code here.
It's still strange for me to see a synchronous hasNext and an asynchronous next().
Using Twitter Spool I've implemented a working example.
To implement the spool I modified the example in the documentation.
import com.twitter.concurrent.Spool
import com.twitter.util.{Await, Return, Promise}
import scala.concurrent.{ExecutionContext, Future}

trait AsyncIterable[+T <: AsyncIterable[T]] { self: T =>
  def hasNext: Boolean
  def next: Future[T]

  def spool(implicit ec: ExecutionContext): Spool[T] = {
    // Resolve the current page, prepend it to the spool, and keep filling from that page.
    def fill(currentPage: Future[T], rest: Promise[Spool[T]]): Unit = {
      currentPage foreach { cPage =>
        if (cPage.hasNext) {
          val nextSpool = new Promise[Spool[T]]
          rest() = Return(cPage *:: nextSpool)
          fill(cPage.next, nextSpool)
        } else {
          val emptySpool = new Promise[Spool[T]]
          emptySpool() = Return(Spool.empty[T])
          rest() = Return(cPage *:: emptySpool)
        }
      }
    }

    val rest = new Promise[Spool[T]]
    if (hasNext) {
      fill(next, rest)
    } else {
      rest() = Return(Spool.empty[T])
    }
    self *:: rest
  }
}
Data is the same as before, and now we can use it.
// Cool stuff
implicit val ec = scala.concurrent.ExecutionContext.global
val data = Data(1) // And others
// Print all the information asynchronously
val fut = data.spool.foreach(data => println(data.information))
Await.ready(fut)
It will throw an exception on the second element, because the implementation of next was not provided.

lazy loading some config params, trying to come up with a pattern in Scala

I want my client code to look somewhat like this:
val config:Config = new MyConfig("c:/etc/myConfig.txt")
println(config.param1)
println(config.param2)
println(config.param3)
Which means that:
The Config interface defines the config fields
MyConfig is a Config implementation -- all the wiring needed is the instantiation of the desired implementation
Data is loaded lazily -- it should happen on first field reference (config.param1 in this case)
So, I want the client code to be friendly, with support for interchangeable implementations, with statically typed fields, hiding lazy loading. I also want it to be as simple as possible for making alternative implementations, so Config should somewhat guide you.
I am not satisfied with what I came up with so far:
trait Config {
  lazy val param1: String = resolveParam1
  lazy val param2: String = resolveParam2
  lazy val param3: Int = resolveParam3

  protected def resolveParam1: String
  protected def resolveParam2: String
  protected def resolveParam3: Int
}
class MyConfig(fileName: String) extends Config {
  lazy val data: Map[String, Any] = readConfig

  // some dummy impl here, should read from a file
  protected def readConfig: Map[String, Any] = Map[String, Any]("p1" -> "abc", "p2" -> "defgh", "p3" -> 43)

  protected def resolveParam1: String = data.get("p1").get.asInstanceOf[String]
  protected def resolveParam2: String = data.get("p2").get.asInstanceOf[String]
  protected def resolveParam3: Int = data.get("p3").get.asInstanceOf[Int]
}
I'm sure there are better solutions, that's where you can help :)
One thing I especially don't like here is that MyConfig defines an intermediate container with some arbitrary keys, and since it is Map[String, Any], I need to cast the values.
There's nothing preventing you from just making the values abstract. You cannot enforce laziness in the super-trait, but that's ok since lazy-loading is really an implementation detail anyway:
trait Config {
  val param1: String
  val param2: String
  val param3: Int
}

class MyConfig extends Config {
  lazy val param1 = readConfig()("p1")
  ...
  def readConfig(): Map[String, String] = ...
}
On a stylistic note, readConfig() should be declared and called with parens (rather than without) as it is a side-effecting method. The no-parens syntax is designed to denote pure-functional methods.
If you just want to simplify it, the "param" fields could be methods...
trait Config {
  def param1: String
  def param2: String
  def param3: Int
}

class MyConfig(fileName: String) extends Config {
  lazy val data: Map[String, Any] = readConfig

  // some dummy impl here, should read from a file
  protected def readConfig: Map[String, Any] =
    Map[String, Any]("p1" -> "abc", "p2" -> "defgh", "p3" -> 43)

  def param1: String = data.get("p1").get.asInstanceOf[String]
  def param2: String = data.get("p2").get.asInstanceOf[String]
  def param3: Int = data.get("p3").get.asInstanceOf[Int]
}
To get rid of the casting, you could have MyConfig wrap a non-lazy inner class that's lazy loaded by MyConfig.
class MyConfig(fileName: String) extends Config {
  private class NonLazyConfig(p1: String, p2: String, p3: Int) extends Config {
    def param1 = p1
    def param2 = p2
    def param3 = p3
  }

  lazy val inner: Config = readConfig

  // some dummy impl here, should read from a file
  protected def readConfig: Config = {
    new NonLazyConfig("abc", "defgh", 43)
  }

  def param1: String = inner.param1
  def param2: String = inner.param2
  def param3: Int = inner.param3
}