Using Scala Reflection for a plugin architecture

I'm building an application in Scala that uses a plugin architecture, and I'm trying to load plugins at runtime. Currently, my plugin loader code is:
import java.io.File
import java.net.{URL, URLClassLoader}
import scala.collection.mutable

import org.clapper.classutil.ClassFinder

object PluginManager extends PluginManager {
  val plugins = new mutable.HashMap[String, Plugin]()
  val pluginFolder = new File("plugins")

  def init(): Unit = {
    val pluginJars = pluginFolder.listFiles.filter(_.getName.endsWith(".jar"))
    val classpath = List(new File(".")) ++ pluginJars
    val finder = ClassFinder(classpath)
    val classes = finder.getClasses()
    val classMap = ClassFinder.classInfoMap(classes.iterator)
    val pluginsToLoad = ClassFinder.concreteSubclasses("org.narrativeandplay.hypedyn.plugins.Plugin", classMap)
    val loader = new URLClassLoader(pluginJars.map(f => new URL(s"file:${f.getAbsolutePath}")), ClassLoader.getSystemClassLoader)

    pluginsToLoad.foreach { pluginString =>
      val plugin = loader.loadClass(pluginString.name).newInstance().asInstanceOf[Plugin]
      plugins += plugin.name -> plugin
    }
  }
}
(based on https://vikashazrati.wordpress.com/2011/09/15/building-a-plugin-based-architecture-in-scala/).
I had to use the URLClassLoader because my plugin JARs weren't on the classpath when the application started.
I was wondering if it's possible to use the Scala Reflection API to replace my use of the URLClassLoader, and if so, how should I do it?

For the runtime reflection part of the Scala reflection API to work, you will still need to instantiate a classloader (and then call scala.reflect.runtime.universe.runtimeMirror(classloader)), so I don't think it's going to simplify matters in this situation.
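For illustration, here is a minimal sketch of what loading a plugin through scala-reflect could look like (Scala 2.11+; the className parameter and the no-argument constructor are assumptions). Note that a URLClassLoader is still constructed underneath; runtimeMirror merely wraps it:
import java.io.File
import java.net.URLClassLoader
import scala.reflect.runtime.{universe => ru}

object ReflectiveLoader {
  // Instantiate a class from plugin JARs via the Scala reflection API.
  // The classloader is still needed; runtimeMirror only wraps it.
  def instantiate(pluginJars: Array[File], className: String): Any = {
    val loader = new URLClassLoader(pluginJars.map(_.toURI.toURL), ClassLoader.getSystemClassLoader)
    val mirror = ru.runtimeMirror(loader)
    val classSymbol = mirror.staticClass(className)
    val ctor = classSymbol.toType.decl(ru.termNames.CONSTRUCTOR).asMethod
    mirror.reflectClass(classSymbol).reflectConstructor(ctor).apply()
  }
}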

main class not found in spark scala program

//package com.jsonReader
import play.api.libs.json._
import play.api.libs.json.Reads._
import play.api.libs.json.Json.JsValueWrapper
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SQLContext
//import org.apache.spark.implicits._
//import sqlContext.implicits._

object json {
  def flatten(js: JsValue, prefix: String = ""): JsObject = js.as[JsObject].fields.foldLeft(Json.obj()) {
    case (acc, (k, v: JsObject)) =>
      val nk = if (prefix.isEmpty) k else s"$prefix.$k"
      acc.deepMerge(flatten(v, nk))
    case (acc, (k, v: JsArray)) =>
      val nk = if (prefix.isEmpty) k else s"$prefix.$k"
      val arr = flattenArray(v, nk).foldLeft(Json.obj())(_ ++ _)
      acc.deepMerge(arr)
    case (acc, (k, v)) =>
      val nk = if (prefix.isEmpty) k else s"$prefix.$k"
      acc + (nk -> v)
  }

  def flattenArray(a: JsArray, k: String = ""): Seq[JsObject] = {
    flattenSeq(a.value.zipWithIndex.map {
      case (o: JsObject, i: Int) =>
        flatten(o, s"$k[$i]")
      case (o: JsArray, i: Int) =>
        flattenArray(o, s"$k[$i]")
      case a =>
        Json.obj(s"$k[${a._2}]" -> a._1)
    })
  }

  def flattenSeq(s: Seq[Any], b: Seq[JsObject] = Seq()): Seq[JsObject] = {
    s.foldLeft[Seq[JsObject]](b) {
      case (acc, v: JsObject) =>
        acc :+ v
      case (acc, v: Seq[Any]) =>
        flattenSeq(v, acc)
    }
  }

  def main(args: Array[String]) {
    val appName = "Stream example 1"
    val conf = new SparkConf().setAppName(appName).setMaster("local[*]")
    //val spark = new SparkContext(conf)
    val sc = new SparkContext(conf)
    //val sqlContext = new SQLContext(sc)
    val sqlContext = new SQLContext(sc)
    //val spark=sqlContext.sparkSession
    val spark = SparkSession.builder().appName("json Reader")
    val df = sqlContext.read.json("C://Users//ashda//Desktop//test.json")
    val set = df.select($"user", $"status", $"reason", explode($"dates")).show()
    val read = flatten(df)
    read.printSchema()
    df.show()
  }
}
I'm trying to use this code to flatten a highly nested JSON. For this I created a project and converted it to a Maven project. I edited the pom.xml and included the libraries I needed, but when I run the program it says "Error: Could not find or load main class".
I tried converting the code to an sbt project and running it, but I get the same error. I also tried packaging the code and running it through spark-submit, which gives me the same error. Please let me know what I am missing here; I have tried everything I could.
Thanks
Hard to say, but maybe you have many classes that qualify as a main class, so the build tool does not know which one to choose. Maybe try to clean the project first with sbt clean.
Anyway, in Scala the preferred way to define a main class is to extend the App trait.
object SomeApp extends App
Then the whole object body will become your main method.
You can also define the main class in your build.sbt. This is necessary if you have many objects that extend the App trait.
mainClass in (Compile, run) := Some("io.example.SomeApp")
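For instance, a minimal build.sbt fragment (the fully qualified class name is an assumption based on the question's commented-out package declaration):
mainClass in (Compile, run) := Some("com.jsonReader.json")
mainClass in (Compile, packageBin) := Some("com.jsonReader.json")
The second line sets the Main-Class attribute in the manifest of the JAR produced by sbt package.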
I am answering this question for sbt configurations. I also had the same issue, which I resolved recently, having made some basic mistakes that I would like you to note:
1. Configure your sbt file
Go to your build.sbt file and check that the Scala version you are using is compatible with Spark. As of Spark 2.4.0 (https://spark.apache.org/docs/latest/), the required Scala version is 2.11.x, not 2.12.x. So even though your IDE (Eclipse/IntelliJ) shows the latest version of Scala, or the version you downloaded, change it to the compatible version. Also, include this line of code:
libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.6"
(where 2.11.x is your Scala version)
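Putting that together, a sketch of the relevant build.sbt lines for a Spark 2.4 project (the exact patch versions are assumptions; adjust them to your installation):
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided"
)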
2. File Hierarchy
Make sure your Scala file is under src/main/scala only.
3. Terminal
If your IDE allows you to launch a terminal within it, launch it there (IntelliJ allows this; not sure about Eclipse or others), or go to a terminal and change directory to your project directory,
then run:
sbt clean
This will clear any libraries loaded previously and folders created by earlier compilation.
sbt package
This will pack your files into a single JAR file under target/scala-<version>/.
Then submit to Spark:
spark-submit --class <ClassName> --master local[*] target/scala-<version>/<.jar file>
(In your case the class name is com.jsonReader.json.) Note that an option such as --master is not required here if it is already set in the program with setMaster.

How to test actors with components injected by Guice in Play! scala 2.5

I'm using Guice to inject components inside an actor, as explained in the Play! Scala 2.5 documentation.
In my application, I inject unshortLinksFactory: UnshortLinks.Factory into my classes and I create a new actor like this:
val unshortLinksActor = actorSystem.actorOf(Props(unshortLinksFactory(ws)))
The problem is that I cannot inject components in my test class (or can I?), otherwise the tests do not start. (Please note that I use ScalaTest.)
How can I create the actor in my tests? It would be fine if I could create it like:
val unshortLinksActor = system.actorOf(Props(unshortLinksFactory(ws)))
but the best would be to be able to create it with TestActorRef from akka.testkit in order to have access to the underlyingActor.
What I do in order to test it is:
I extend the test class with TestKit(ActorSystem("testSystem")).
Then I create the Props like this:
lazy val unshortLinksFactoryProps = Props(unshortLinksFactory(
  dbConfigProvider = dbConfProvider))
Here dbConfProvider is created like this, but it could also be mocked:
lazy val appBuilder = new GuiceApplicationBuilder()
lazy val injector = appBuilder.injector()
lazy val dbConfProvider = injector.instanceOf[DatabaseConfigProvider]
Finally, I can obtain an actorRef like this:
val actorRef = TestActorRef[UnshortLinksActor](unshortLinksFactoryProps)
And I can access the methods inside my actor with actorRef.underlyingActor.
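Pulling those pieces together, a minimal sketch of a complete test class (the UnshortLinksActor constructor signature is an assumption based on the question, and WordSpecLike is just one possible ScalaTest style):
import akka.actor.{ActorSystem, Props}
import akka.testkit.{TestActorRef, TestKit}
import org.scalatest.{Matchers, WordSpecLike}
import play.api.db.slick.DatabaseConfigProvider
import play.api.inject.guice.GuiceApplicationBuilder

class UnshortLinksActorSpec extends TestKit(ActorSystem("testSystem"))
    with WordSpecLike with Matchers {

  // A real provider obtained from a Guice injector; a mock would also do.
  lazy val appBuilder = new GuiceApplicationBuilder()
  lazy val injector = appBuilder.injector()
  lazy val dbConfProvider = injector.instanceOf[DatabaseConfigProvider]

  "UnshortLinksActor" should {
    "expose its internals through underlyingActor" in {
      val props = Props(new UnshortLinksActor(dbConfProvider)) // assumed constructor
      val actorRef = TestActorRef[UnshortLinksActor](props)
      actorRef.underlyingActor // call the actor's methods directly here
    }
  }
}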

How to override guice modules in Playframework unit tests using ScalaTest

I want to write a functional test for my controller in Play Framework. To do that I want to mock the implementation of some classes.
I found a nice example of how to do that using specs2 here: http://www.innovaedge.com/2015/07/01/how-to-use-mocks-in-injected-objects-with-guiceplayscala/
But I'm using ScalaTest with the OneAppPerSuite trait, which uses FakeApplication. Here is the documentation:
https://www.playframework.com/documentation/2.4.x/ScalaFunctionalTestingWithScalaTest
The problem is that I cannot find a way to intercept GuiceApplicationBuilder and override some bindings with mock implementations.
Here is the FakeApplication implementation from play.api.test:
case class FakeApplication(
    override val path: java.io.File = new java.io.File("."),
    override val classloader: ClassLoader = classOf[FakeApplication].getClassLoader,
    additionalPlugins: Seq[String] = Nil,
    withoutPlugins: Seq[String] = Nil,
    additionalConfiguration: Map[String, _ <: Any] = Map.empty,
    withGlobal: Option[play.api.GlobalSettings] = None,
    withRoutes: PartialFunction[(String, String), Handler] = PartialFunction.empty) extends Application {

  private val app: Application = new GuiceApplicationBuilder()
    .in(Environment(path, classloader, Mode.Test))
    .global(withGlobal.orNull)
    .configure(additionalConfiguration)
    .bindings(
      bind[FakePluginsConfig] to FakePluginsConfig(additionalPlugins, withoutPlugins),
      bind[FakeRouterConfig] to FakeRouterConfig(withRoutes))
    .overrides(
      bind[Plugins].toProvider[FakePluginsProvider],
      bind[Router].toProvider[FakeRouterProvider])
    .build
So there is no way for me to intercept GuiceApplicationBuilder and override bindings.
I'm new to Play Framework, so sorry if the question looks a bit silly.
Thanks!
Take a look at GuiceOneAppPerTest. Here is an example (Play 2.8, Scala 2.13):
abstract class MyBaseSpec extends PlaySpec with GuiceOneAppPerTest with Results with Matchers with MockFactory {
  def overrideModules: Seq[GuiceableModule] = Nil

  override def newAppForTest(testData: TestData): Application = {
    GuiceApplicationBuilder()
      .overrides(bind[ControllerComponents].toInstance(Helpers.stubControllerComponents()))
      .overrides(overrideModules: _*)
      .build()
  }
}

class OrderServiceSpec extends MyBaseSpec {
  val ordersService: OrdersService = mock[OrdersService]
  val usersService: UsersService = mock[UsersService]

  override def overrideModules = Seq(
    bind[OrdersService].toInstance(ordersService),
    bind[UsersService].toInstance(usersService),
  )

  // tests
}
You are probably using an older version of ScalaTest Plus, which didn't support overriding FakeApplication with Application. In the Play docs (Play 2.4) the library version is "1.4.0-M3", but it should be "1.4.0".
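For reference, a hedged sketch of the corresponding dependency line for a Play 2.4-era build (the artifact coordinates are an assumption; double-check them for your Play version):
libraryDependencies += "org.scalatestplus" %% "play" % "1.4.0" % "test"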

Create custom Arbitrary generator for testing java code from ScalaTest ScalaCheck

Is it possible to create a custom Arbitrary generator in a ScalaTest spec (which mixes in Checkers for ScalaCheck properties) that is testing Java code? For example, the following are the required steps for each test within forAll:
val fund = new Fund()
val fundAccount = new Account(Account.RETIREMENT)
val consumer = new Consumer("John")
  .createAccount(fundAccount)
fund.addConsumer(consumer)
fundAccount.deposit(amount)
The above is prep code that runs before asserting results etc.
You sure can. This should get you started.
import org.scalacheck._
import Arbitrary._
import Prop._

case class Consumer(name: String)

object ConsumerChecks extends Properties("Consumer") {
  lazy val genConsumer: Gen[Consumer] = for {
    name <- arbitrary[String]
  } yield Consumer(name)

  implicit lazy val arbConsumer: Arbitrary[Consumer] = Arbitrary(genConsumer)

  property("some prop") = forAll { c: Consumer =>
    // Check stuff
    true
  }
}
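With the implicit Arbitrary in scope, forAll can then drive the Java setup from the question. A rough sketch (Fund, Account and Consumer here are the question's Java classes, assumed to be imported in place of the case class above, and the amount range is an arbitrary choice):
property("deposit succeeds for generated input") = forAll(arbitrary[String], Gen.choose(1, 10000)) {
  (name, amount) =>
    val fund = new Fund()
    val fundAccount = new Account(Account.RETIREMENT)
    val consumer = new Consumer(name).createAccount(fundAccount)
    fund.addConsumer(consumer)
    fundAccount.deposit(amount)
    true // replace with a real assertion on the resulting state
}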

Scala Presentation Compiler

Hi, I've been trying to get the presentation compiler to work but I'm getting the following error. Any help with this would be appreciated. I've already seen other questions and a few projects where it has been implemented, but everyone uses Global.Run, which is not recognized in the REPL. The code and the error are below. I've installed Scala 2.10.3 on Windows 8.1.
import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.reporters._

object Test {
  def main(args: Array[String]) {
    val settings = new Settings;
    val global = new Global(settings, new ConsoleReporter(settings));
    val compiler = global.Run;
  }
}
The error is
Sample.scala:8: error: value Run is not a member of scala.tools.nsc.Global
Try this:
import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.reporters._

object Test {
  def main(args: Array[String]) {
    val settings = new Settings
    val global = new Global(settings, new ConsoleReporter(settings))
    val compiler = new global.Run
  }
}
Notice new global.Run instead of global.Run. There is no companion object for the class Run; maybe there was one in earlier Scala versions. Checked on Scala 2.10.3. Works in the REPL.
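To actually compile something with the Run instance, here is a minimal sketch (the file name Sample.scala is an assumption; usejavacp makes the host JVM's classpath visible to the compiler, which is usually needed when embedding it):
import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.reporters._

object CompileOne {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true // reuse the JVM classpath when embedding the compiler
    val global = new Global(settings, new ConsoleReporter(settings))
    val run = new global.Run // note: new global.Run, not global.Run
    run.compile(List("Sample.scala")) // compile a source file by path
  }
}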