I don't know if a solution exists, but it would be highly desirable.
I'm making a Scala Applet, and I want the main Applet class to be a singleton so it can be accessed elsewhere in the applet, sort of like:
object App extends Applet {
  def init {
    // do init here
  }
}
Instead, I have to make App a normal, instantiable class; otherwise it complains because the constructor is private. So the ugly hack I have is:
object A {
  var pp: App = null
}

class App extends Applet {
  A.pp = this

  def init {
    // do init here
  }
}
I really hate this, and it is one of the reasons I don't like making applets in Scala right now.
Any better solution? It would be nice...
Edit: I found a pretty decent hack of a solution using implicit conversion. Declare your Applet class as a normal class, and then add:
class Appable {}

object App extends Appable {
  var inst: App = null
  implicit def appable2App(a: Appable): App = inst
}
Then just set the instance variable when the Applet is created, and you can access everything as if it were a singleton.
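For illustration, a minimal sketch of how the pieces fit together; the init body and the example call are assumptions, not part of the original code:

import java.applet.Applet

class App extends Applet {
  override def init(): Unit = {
    App.inst = this   // register the freshly created instance
    // do the rest of the init here
  }
}

// Elsewhere in the applet, the object stands in for the instance; calling a
// method that only exists on the class triggers the appable2App conversion:
// App.getWidth()   // forwarded to App.inst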
What you're asking for is counter to the applet execution model. Specifically, there are no guarantees that you won't have multiple copies of the applet running in the same JVM. The default is to run every applet in the same JVM (including multiple copies of the same applet if necessary). See e.g. Sun's tutorial.
That's why applets are specified the way they are, instead of with a public static void main method. Can't the rest of your code take the applet as a parameter?
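For example, here is a sketch of handing the applet instance to its collaborators directly; Renderer is a hypothetical helper class, not something from the question:

import java.applet.Applet

class App extends Applet {
  private lazy val renderer = new Renderer(this)  // pass the instance explicitly

  override def init(): Unit = renderer.setup()
}

class Renderer(applet: Applet) {
  def setup(): Unit = {
    // use applet.getWidth, applet.getParameter, etc. here
  }
}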
I have a complex project which reads configurations from a DB through the object ConfigAccessor which implements two basic APIs: getConfig(name: String) and storeConfig(c: Config).
Due to how the project is currently designed, almost every component needs to use the ConfigAccessor to talk to the DB. Since this component is an object, it is easy to just import it and call its static-like methods.
Now I am trying to build some unit tests for the project in which the configurations are stored in an in-memory HashMap. So, first of all, I decoupled the config accessor logic from its storage (using the cake pattern). This way I can define my own ConfigDbComponent while testing:
class ConfigAccessor {
  this: ConfigDbComponent =>
  // ...
}
The "problem" is that now ConfigAccessor is a class, which means I have to instantiate it at the beginning of my application and pass it everywhere to whoever needs it. The first way I can think of for passing this instance around would be through other components constructors. This would become quite verbose (adding a parameter to every constructor in the project).
What do you suggest I do? Is there a design pattern to overcome this verbosity, or would an external mocking library be more suitable for this?
Yes, the "right" way is passing it in constructors. You can reduce verbosity by providing a default argument:
class Foo(config: ConfigAccessor = ConfigAccessor) { ... }
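Call sites then only mention the accessor when they need to override it, for example in tests (InMemoryConfigAccessor is a hypothetical test double):

val foo = new Foo()                                          // production: uses the default
val testFoo = new Foo(config = new InMemoryConfigAccessor)   // test: swap in a fake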
There are some "dependency injection" frameworks, like Guice or Spring, built around this, but I won't go there, because I am not a fan.
You could also continue utilizing the cake pattern:
trait Configuration {
def config: ConfigAccessor
}
trait Foo { self: Configuration => ... }
class FooProd extends Foo with ProConfig
class FooTest extends Foo with TestConfig
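ProConfig and TestConfig would then be small mix-ins supplying the concrete accessor; a sketch assuming a DB-backed accessor for production and an in-memory one for tests (both implementation names are assumptions):

trait ProConfig extends Configuration {
  val config: ConfigAccessor = new DbConfigAccessor        // assumed production implementation
}

trait TestConfig extends Configuration {
  val config: ConfigAccessor = new InMemoryConfigAccessor  // assumed test implementation
}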
Alternatively, use the "static setter". It minimizes changes to existing code, but requires mutable state, which is really frowned upon in Scala:
object Config extends ConfigAccessor {
  @volatile private var accessor: ConfigAccessor = _

  def configurate(cfg: ConfigAccessor): ConfigAccessor = synchronized {
    val old = accessor
    accessor = cfg
    old
  }

  def getConfig(c: String) = Option(accessor).fold(
    throw new IllegalStateException("Not configured!")
  )(_.getConfig(c))
}
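Wiring it up would then look roughly like this (the concrete accessor names are assumptions):

// at application startup
Config.configurate(new DbConfigAccessor)

// in a test suite's setup
Config.configurate(new InMemoryConfigAccessor)

// existing code keeps calling Config.getConfig("some.key") unchanged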
You can retain a global ConfigAccessor and allow selectable accessors like this:
object ConfigAccessor {
  private lazy val accessor = GetConfigAccessor()

  def getConfig(name: String) = accessor.getConfig(name)
  ...
}
For production builds you can put logic in GetConfigAccessor to select the appropriate accessor based on some global config, such as Typesafe Config.
For unit testing you can have a different version of GetConfigAccessor for different test builds which return the appropriate test implementation.
Making this value lazy allows you to control the order of initialisation and if necessary do some non-functional mutable stuff in the initialisation code before creating the components.
Update following comments
The production code would have an implementation of GetConfigAccessor something like this:
object GetConfigAccessor {
  private val useAws = System.getProperties.getProperty("accessor.aws") == "true"

  def apply(): ConfigAccessor =
    if (useAws) new AwsConfigAccessor
    else new PostgresConfigAccessor
}
Both AwsConfigAccessor and PostgresConfigAccessor would have their own unit tests to prove that they conform to the correct behaviour. The appropriate accessor can be selected at runtime by setting the appropriate system property.
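For example, the AWS accessor could be selected on the command line, or programmatically before the lazy accessor is first used:

// java -Daccessor.aws=true -jar myapp.jar
// or, early in main, before anything touches ConfigAccessor:
System.setProperty("accessor.aws", "true")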
For unit testing there would be a simpler implementation of GetConfigAccessor, something like this:
object GetConfigAccessor {
  def apply(): ConfigAccessor = new MockConfigAccessor
}
Unit testing is done within a unit testing framework which contains a number of libraries and mock objects that are not part of the production code. These are built separately and are not compiled into the final product. So this version of GetConfigAccessor would be part of that unit testing code and would not be part of the final product.
Having said all that, I would only use this model for reading static configuration data because that keeps the code functional. The ConfigAccessor is just a convenient way to access global constants without having them passed down in the constructor.
If you are also writing data then this is more like a real DB than a configuration. In that case I would create custom accessors for each component that give access to different parts of the DB. That way it is clear which parts of the data are updated by each component. These accessors would be passed down to the component and can then be unit tested with the appropriate mock implementation as normal.
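A sketch of such a component-scoped accessor (the trait, its methods and the component name are all hypothetical):

trait UserConfigAccessor {
  def getUserConfig(name: String): Config
  def storeUserConfig(c: Config): Unit
}

// The component only sees the narrow accessor and can be unit tested
// against a mock implementation of it.
class UserComponent(configs: UserConfigAccessor) {
  // reads and writes only the user-related part of the DB
}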
You may need to partition your data into static config and dynamic config and handle them separately.
I used to get application.conf values in Play 2.4.x with Play.current.configuration.getString("NAME_HERE"), and it was working well in classes, objects and companion objects too.
Now I'm using Play 2.5.4 with Scala in a new project, and I won't use Play.current because it's deprecated, but there is an alternative using DI, like this:
class HomeController @Inject() (configuration: play.api.Configuration) extends Controller {
  def config = Action {
    Ok(configuration.underlying.getString("db.driver"))
  }
}
This DI works like a charm in a class, but in this project I need to get the variable db.driver in an object, and as far as I know, with an object I can't use DI.
Maybe using Guice would help?
You can use a @Singleton annotated class instead of an object:
import javax.inject.{Inject, Singleton}

trait Foo {}

@Singleton
class FooImpl @Inject() (configuration: play.api.Configuration) extends Foo {
  // do whatever you want
}
@Singleton makes the class a singleton. It feels a bit awkward because Scala natively has the object syntax to create a singleton, but this is the easiest and probably the best way to DI into a singleton.
You may also create the singleton eagerly, as in the code below.
bind(classOf[Foo]).to(classOf[FooImpl]).asEagerSingleton()
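That binding lives in a Guice module, which Play then has to know about; a minimal sketch (the module name and the registration line are assumptions):

import com.google.inject.AbstractModule

class FooModule extends AbstractModule {
  override def configure(): Unit = {
    bind(classOf[Foo]).to(classOf[FooImpl]).asEagerSingleton()
  }
}

// enable it in application.conf:
// play.modules.enabled += "FooModule"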
For more detail, you can look at the Google Guice wiki and the Play Framework site.
EDIT
How you call it is exactly the same as how you do DI elsewhere in Play Framework 2.5:
class BarController @Inject() (foo: Foo) extends Controller {
  // Do whatever you want with Foo
}
Guice basically generates a new instance every time you inject; once you add @Singleton, Guice uses only one instance instead.
DI exists to reduce coupling. When you want to use a class you defined from another class, you should inject it; otherwise the classes become highly coupled, which ends up making it harder to write your unit tests.
FYI, you can use this outside of Play with the technique described here:
Create an Instance of class which does DI via Playframework Guice Independently in Scala
Have you tried
import com.typesafe.config.ConfigFactory
val myConfig = ConfigFactory.load().getString("myConfig.key")
The above approach doesn't require you to convert your object into a singleton class.
You can do
Play.current.configuration
however that will (probably) no longer be possible with Play 2.6.
Ideally, however, you would pass the configuration in as a parameter to that method of the object or, use a class instead of an object.
What I sometimes do to migrate "from object to class":
class MyComponent @Inject() (config: Configuration) {
  // here goes everything nice
  def doStuff = ???
}

object MyComponent {
  @deprecated("Inject MyComponent")
  def doStuff = {
    val instance = Play.current.injector.instanceOf[MyComponent]
    instance.doStuff
  }
}
This way, you're not breaking existing code while all users of your methods can slowly migrate to using classes.
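Call sites then look like this while the migration is in progress (MyService is a hypothetical consumer; @Inject is javax.inject.Inject, as in the snippets above):

// old code keeps compiling, now with a deprecation warning:
MyComponent.doStuff

// migrated code injects the class instead:
class MyService @Inject() (myComponent: MyComponent) {
  def run = myComponent.doStuff
}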
I have a Logger class that logs events in my application. While I only need one instance of the logger in this application, I want this class to be reusable, so I don't want to make it a singleton and couple it with my specific needs for this application.
I want to be able to access this Logger instance from anywhere in the application without having to create a new one every time or pass it around to every class that might need to log something. What I currently do is have an ApplicationUtils singleton that I use as the point of access for the application's Logger:
object ApplicationUtils {
  lazy val log: Logger = new Logger()
}
Then I have a Loggable trait that I add to classes that need the Logger:
trait Loggable {
  protected[this] lazy val log = ApplicationUtils.log
}
Is this a valid approach for what I am trying to accomplish? It feels a little hack-y. Is there a better approach I could be using? I'm pretty new to Scala.
Be careful when putting functionality in objects. That functionality is easily testable, but if you need to test clients of that code to make sure they interact with it correctly (via mocks and spies), you're stuck, because objects compile to final classes and thus cannot be mocked.
Instead, use this pattern:
trait T { /* code goes here */ }
object T extends T /* pass this to client code from main sources */
Now you can create Mockito mocks / spies for trait T in your test code, pass that in and confirm that the interactions of the code under test with the trait T code are what they should be.
If you have code that's a client of T and whose interactions with it don't require testing, you can directly reference object T.
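Applied to the Logger from the question, the pattern would look roughly like this (the log method and the Mockito usage are assumptions):

trait Logger {
  def log(msg: String): Unit = println(msg)  // real implementation goes here
}

object Logger extends Logger  // the instance handed to production code

// In tests, mock or spy the trait instead of the object:
// val logger = org.mockito.Mockito.spy(new Logger {})
// ... pass logger to the code under test, then:
// org.mockito.Mockito.verify(logger).log("expected message")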
To address what you're trying to do (rather than what you're asking), take a look at TypeSafe's scalalogging package. It provides a Logging trait that you can use like so:
class MyClass extends Logging {
  logger.debug("This is very convenient ;-)")
}
It's a macro-based wrapper for SLF4J, so something like logger.debug(...) gets compiled as if (logger.isDebugEnabled) logger.debug(...).
I have used Scala before, but I worked with objects and now I have to use classes. I am creating a Scala class in a package; however, when I try to run it, it asks me to choose between Scala Applet and Scala Application, and neither of them works. Does anybody have an idea on how to fix this problem? And do you declare a main method inside a class (like in objects)?
For running your application you need an object that extends App, or an object with a main method.
It is nice that you are using classes, but the entry point of your code must be an object with a main method (or one that extends App).
So use your classes to build the application, but start your application from an object.
case class Test(x: Int)

object Main {
  def main(args: Array[String]): Unit = {
    val x = Test(5)
    println("this is the main method")
    println(x)
  }
}
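The same entry point can also be written with the App trait, which is the alternative mentioned above:

object MainApp extends App {
  // the whole object body runs at startup
  println("this is the App-trait entry point")
  println(Test(5))
}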
In the JVM world, the main method is the entry point to an application.
In general, the main method is where you start your various components, the threads/executors that run them etc.
UPDATE:
I have somewhat resolved the issue. Just in case anyone runs into the same problem, here is the simplest solution: looking at the MTApplication source code, I discovered that the initialize() method can be overloaded, taking a String parameter with the name of the class to instantiate. So if I create a separate class that extends MTApplication and pass its name there, everything works correctly.
END OF UPDATE
I have a situation in Scala while trying to use a Java library (MT4j, which is based on Processing). The library wants to instantiate the main class of the app (the caller class):
Class<?> c = Thread.currentThread().getContextClassLoader().loadClass(name);
applet = (PApplet) c.newInstance();
so that it can refer to it later in its work.
However, it fails because, I guess, the main Scala class is not a class but an object, and due to the library structure it is necessary to call the static method initialize() of the main library class MTApplication. In Java static members live in classes, but in Scala they live in objects, so it is impossible to instantiate the object and the library fails. In contrast to MT4j, Processing itself makes no calls to static methods on startup and successfully passes that phase.
If I just create a companion class, everything works fine except that the companion class does not get its fields initialized, because the static initialize() method is called on the companion object; the class instance is essentially dead on arrival and the library becomes unusable.
At least that is how I understand this problem.
I get this error:
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalAccessException: Class processing.core.PApplet can not access a member of class main.Main$ with modifiers "private"
at processing.core.PApplet.runSketch(PApplet.java:9103)
at processing.core.PApplet.main(PApplet.java:9292)
at org.mt4j.MTApplication.initialize(MTApplication.java:311)
at org.mt4j.MTApplication.initialize(MTApplication.java:263)
at org.mt4j.MTApplication.initialize(MTApplication.java:254)
at main.Main$.main(Main.scala:26)
at main.Main.main(Main.scala)
It is hard for me to explain, also because I do not fully understand what is going on here. But anyone who has these libraries can reproduce the situation in a couple of minutes by trying to launch the main class.
The abstract startUp() method, which must be implemented to start the app, makes everything look even more sad. It initializes the object, but what the library tries to work with is an instance of the companion class, which does not get initialized because in Scala the method belongs to the object.
My code:
object Main extends MTApplication {
  def main(args: Array[String]) {
    MTApplication.initialize()
    new Main().startUp()
  }

  // this method is abstract so it MUST be implemented
  override def startUp() {
  }
}
class Main extends MTApplication {
  override def startUp() {
    // startup here
  }
}
I am sorry if my explanations are vague; I just do not get it all completely. It is probably easier to understand by repeating the experiment with the MT4j library, using the Processing source code instead of the pre-linked 'core.jar', to see what is happening inside. Does anyone have ideas on any workaround here?
Problem solved. Here is the solution:
object Main {
  var current: MainC = _

  def main(args: Array[String]) {
    MTApplication.initialize("org.mttablescreen.main.MainC")
  }
}

class MainC extends MTApplication {
  // constructor body: register this instance
  Main.current = this

  override def startUp() {
    prepare
  }

  def prepare() {...}
}