Loading external Scala scripts into a Scala file - scala

I originally made scripts with many functions on two individual Scala worksheets. I got them working and now want to tie these individual scripts together by importing and using them in a third file. From what I have read, you cannot simply import external scripts; you must first make them into a class and put them into a package. So I tried that, but I still couldn't import it.
I know this may be a bit basic for this site, but I'm struggling to find much Scala documentation.
I think my problem might stem from a misunderstanding of how packages work. The example below might help.
My program example:
adder.scala
package adder

class adder {
  def add_to_this(AA: Int): Int = {
    var BB = AA + 1
    return BB
  }
}
build.scala
package builder

class build {
  def make_numbers() {
    var a = 0
    var b = 0
  }
}
main.sc
import adder
import builder

object main {
  adder.adder.add_to_this(10)
}
The errors I get are:
object is not a member of package adder
object is not a member of package builder

Classes in Scala differ slightly from classes in Java. If you need something like a singleton, you'll want to use an object instead of a class, i.e.:
package com.example

object Main extends App {
  object Hide {
    object Adder {
      def addToThis(AA: Int): Int = AA + 1
    }
  }

  object Example {
    import com.example.Main.Hide.Adder
    def run(): Unit = println(Adder.addToThis(10))
  }

  Example.run()
}
Think of objects as packages/modules that are also regular values. You can import an object by its full path, i.e. com.example.Main.Hide.Adder; you can also import specific contents of an object by adding .{addToThis}, or import everything from an object by adding ._ after it.
Note that classes, traits and case classes cannot be used as objects; you can't do anything with them unless you have an instance, since there is no static modifier in Scala.
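Applied to the question's code, a minimal sketch of the fix (keeping the asker's package names; the Adder/Main names are illustrative) could look like this:
adder.scala
package adder

// a top-level singleton object: its method can be called without creating an instance
object Adder {
  def addToThis(AA: Int): Int = AA + 1
}

main.scala
import adder.Adder

object Main extends App {
  println(Adder.addToThis(10)) // prints: 11
}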

Related

Scala: package object vs. singleton object within a package

I want to group a set of similar functions in a library in Scala. Here are two approaches I have seen elsewhere. I want to understand the differences between the two.
Singleton object defined in a package
// src/main/scala/com/example/toplevel/functions.scala
package com.example.toplevel

object functions {
  def foo: Unit = { ... }
  def bar: Unit = { ... }
}
Package object
// src/main/scala/com/example/toplevel/package.scala
package com.example.toplevel

package object functions {
  def foo: Unit = { ... }
  def bar: Unit = { ... }
}
Comparison
As far as I can tell, the first approach requires explicitly importing the functions object whenever you want to use its functions, while the package-object approach allows anything in the package functions to access those methods without importing them.
I.e., com.example.toplevel.functions.MyClass would have access to com.example.toplevel.functions.foo implicitly.
Is my understanding correct?
If there are no classes defined within com.example.toplevel.functions, it seems the approaches would be equivalent. Is this correct?
Answered by terminally-chill in a comment:
Yes, your understanding is correct. Anything defined in your package object will be available without having to import it. If you use an object, you will have to import it, even within the same package.
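To make the difference concrete, here is a small sketch (MyClass is illustrative): with the package object, code inside the functions package sees foo without any import:
// src/main/scala/com/example/toplevel/package.scala
package com.example.toplevel

package object functions {
  def foo: Unit = println("foo")
}

// src/main/scala/com/example/toplevel/functions/MyClass.scala
package com.example.toplevel.functions

class MyClass {
  def run(): Unit = foo // no import needed: foo comes from the package object
}

With the plain object functions, MyClass would instead need import com.example.toplevel.functions._ (or a qualified call such as functions.foo) to reach foo.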

Is it possible to mock / override dependencies / imports in Scala?

I have some code looking like this:
package org.samidarko.actors

import akka.actor.Actor // assuming Akka, given Actor and Receive
import org.samidarko.helpers.Lib

class Monitoring extends Actor {
  override def receive: Receive = {
    case Tick => // Tick is defined elsewhere
      Lib.sendNotification()
  }
}
Is there a way to mock/stub Lib from ScalaTest, like with proxyquire for Node.js?
I read that I could use dependency injection, but I would rather not do that.
Is my only alternative to pass my lib as a class parameter?
class Monitoring(lib: Lib) extends Actor {
Any advice to make it more testable? Thanks.
EDIT:
Xavier Guihot's answer is an interesting approach to the problem, but I chose to change the code for testing purposes. I'm passing the Lib as a parameter and mocking it with Mockito; it makes the code easier to test and maintain than shadowing the scope.
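For reference, a minimal sketch of that chosen approach (assumptions: plain Mockito; Lib is turned into a class so an instance can be injected; the actor plumbing via Akka TestKit is elided):
// Lib becomes a class (or trait) instead of a singleton object
class Lib {
  def sendNotification(): Unit = { /* real implementation */ }
}

class Monitoring(lib: Lib) extends Actor {
  override def receive: Receive = {
    case Tick => lib.sendNotification()
  }
}

// in a ScalaTest suite:
import org.mockito.Mockito.{mock, verify}

val lib = mock(classOf[Lib])
// ... deliver a Tick to a Monitoring(lib) actor, e.g. with Akka TestKit ...
verify(lib).sendNotification()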
This answer only uses scalatest and doesn't impact the source code:
Basic solution:
Let's say you have this src class (the one you want to test and for which you want to mock the dependency):
package com.my.code

import com.lib.LibHelper

class MyClass() {
  def myFunction(): String = LibHelper.help()
}
and this library dependency (which you want to mock / override when testing MyClass):
package com.lib

object LibHelper {
  def help(): String = "hello world"
}
The idea is to create an object in your test folder which will override/shadow the library. The object has the same name and the same package as the one you want to mock. In src/test/scala/com/lib, you can create a LibHelper.scala which contains this code:
package com.lib

object LibHelper {
  def help(): String = "hello world - overridden"
}
And this way you can test your code the usual way:
package com.my.code

import org.scalatest.FunSuite

class MyClassTest extends FunSuite {
  test("my_test") {
    assert(new MyClass().myFunction() === "hello world - overridden")
  }
}
Improved solution, which allows setting the behavior of the mock for each test:
The previous code is clear and simple, but the mocked behavior of LibHelper is the same for all tests, and one might want a method of LibHelper to produce different outputs. We can thus keep a mutable variable in LibHelper and update it before each test in order to set the desired behavior. (This only works because LibHelper is an object.)
The shadowing LibHelper (the one in src/test/scala/com/lib) should be replaced with:
package com.lib

object LibHelper {
  var testName = "test_1"

  def help(): String =
    testName match {
      case "test_1" => "hello world - overridden - test 1"
      case "test_2" => "hello world - overridden - test 2"
    }
}
And the ScalaTest class should become:
package com.my.code

import com.lib.LibHelper
import org.scalatest.FunSuite

class MyClassTest extends FunSuite {
  test("test_1") {
    LibHelper.testName = "test_1"
    assert(new MyClass().myFunction() === "hello world - overridden - test 1")
  }

  test("test_2") {
    LibHelper.testName = "test_2"
    assert(new MyClass().myFunction() === "hello world - overridden - test 2")
  }
}
One very important caveat: since we're using a global variable, it is compulsory to force ScalaTest to run tests in sequence (not in parallel). The associated ScalaTest option (to be included in build.sbt) is:
parallelExecution in Test := false
Not a complete answer (as I don't know AOP very well), but to point you in the right direction: this is possible through a Java library called AspectJ:
https://blog.jayway.com/2007/02/16/static-mock-using-aspectj/
https://www.cakesolutions.net/teamblogs/2013/08/07/aspectj-with-akka-scala
Example in pseudocode (without going into details):
class Mock extends MockAspect {
  @Pointcut("execution (* org.samidarko.helpers.Lib.sendNotification(..))")
  def intercept() { ... }
}
The low-level basis of this approach is dynamic proxies: https://dzone.com/articles/java-dynamic-proxy. However, you can mock static methods too (you may have to add the word static to the pattern).

Importing object without class

I'm trying to import an object from another .scala file that doesn't exist inside a class. I've found that you can import a class, as in Scala, importing class. Is there a way to import an object without having a class around it?
Thanks
Importing a class and importing an object work the same way in Scala.
If you have a class:
package com.package1

class MyClass {}

and an object:
package com.package2

object MyObject {}

you import both in exactly the same way:
package com.package3

import com.package1.MyClass
import com.package2.MyObject
The import syntax is the same no matter what you are importing, whether it's an object, a class, a trait, a method, or a field.
Yes, Scala can do exactly what you ask, and this is used frequently. Here is an example:
object Blah {
  val x = 1
  val y = "hello"
}

object Main extends App {
  import Blah._
  println(s"x=$x; y=$y")
}
Output is:
x=1; y=hello
You can also import the members of a class instance, which blew my mind the first time I saw it.
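For example, a small illustrative sketch: importing from an instance (a stable identifier) brings its members into unqualified scope:
class Greeter {
  val greeting = "hello"
  def greet(name: String): String = s"$greeting, $name"
}

object Demo extends App {
  val g = new Greeter
  import g._ // members of the instance g are now in scope
  println(greet("world")) // prints: hello, world
}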
If you are talking about companion objects, they are not defined inside a class but after the class definition:
class AClass {
  def sayHello() = {
    println(AClass.Hello)
  }
}

object AClass {
  private val Hello = "hello"
}
You should have no problem importing it.
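For example (names illustrative), a companion object with a public member is imported like any other object:
package com.example

class Counter {
  def label: String = Counter.Prefix + "counter"
}

object Counter {
  val Prefix = "my-" // public, unlike Hello above, so visible to importers too
}

// in another file:
import com.example.Counter

object Usage extends App {
  println(Counter.Prefix)      // member of the companion object
  println(new Counter().label) // prints: my-counter
}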

Pass parameters to singleton object in Scala

I've inherited a Scala project that has to be extended, and instead of one monolithic monster it has to be split: some code needs to become a library that's used by all the other components, and there are a few applications that may be added at some later point.
object Shared {
  // vals, defs
}
=====
import Shared._

object Utils1 {
  // vals, defs
}
=====
import Shared._

object Utils2 {
  // vals, defs
}
=====
import Shared._
import Utils1._

class Class1(val ...) {
  // stuff
}
=====
import Shared._
import Utils2._

class Class2(val ...) {
  // more stuff
}
etc.
The problem is that Utils1 and Utils2 (and many other objects and classes) use the values from Shared (the singleton), and that Shared now has to be instantiated, because one of the key things that happens in Shared is that the application name is set (through SparkConf), along with the reading of configuration files with database connection information.
The idea was to have a multi-project build where I can just pick which component needs to be recompiled and do it. Since the code in Utils is shared by pretty much all applications that currently exist and will come, it's nice to keep it together in one large project, or so I thought. Publishing the common code to a local repository is hard because we have no local repository (yet), and getting permission to have one is difficult.
The Utils singletons are needed because they have to be static (because of Spark and serializability).
When I make Shared a proper class, all the import statements become useless and I have to pass an instance around, which means I cannot use singletons, even though I need them. It currently works because there is only a single application, so there really is only one instance of Shared. In the future there will still be only one instance of Shared per application, but there may be multiple applications/services defined in a single project.
I've looked into implicits, package objects, and dependency injection, but I'm not sure that's the right road to go down. Maybe I'm just not seeing what should be obvious.
Long story short: is there a neat way to give Shared a parameter, or otherwise achieve what I hope to achieve?
Maybe as an option you can create a SharedClass class and make the Shared object extend it in each app with different constructor params:
class SharedClass(val param: String) {
  // vals ...
}

object Shared extends SharedClass("value")
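Each application can then define its own Shared object with its own parameters while all the import Shared._ statements stay unchanged, e.g. (parameter values illustrative):
// in application 1
object Shared extends SharedClass("app-one.conf")

// in application 2
object Shared extends SharedClass("app-two.conf")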
One hacky way to do it is to use a DynamicVariable and make it have a value when Shared is initialized (i.e., the first time anything that refers to Shared's fields or methods is called).
Consider the following:
/* File: Config.scala */
import scala.util.DynamicVariable

object Config {
  val Cfg: DynamicVariable[String] = new DynamicVariable[String](null)
}

/**********************/
/* File: Shared.scala */
import Config._

object Shared {
  final val Str: String = Cfg.value
  def init(): Unit = { }
}

/*********************/
/* File: Utils.scala */
import Shared._

object Utils {
  def Length: Int = Str.length
}

/********************/
/* File: Main.scala */
object Main extends App {
  // Make sure `Shared` is initialized with the desired value
  Config.Cfg.withValue("foobar") {
    Shared.init()
  }

  // Now Shared.Str is fixed for the duration of the program
  println(Shared.Str)   // prints: foobar
  println(Utils.Length) // prints: 6
}
This setup is thread-safe, although not deterministic, of course. The following Main would randomly select one of the three strings and print the selected string and its length three times:
import scala.util.Random
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

object Main extends App {
  def random(): Int = Random.nextInt(100)

  def foo(param: String, delay: Int = random()): Future[Unit] =
    Future {
      Thread.sleep(delay)
      Config.Cfg.withValue(param) {
        Shared.init()
      }
      println(Shared.Str)
      println(Utils.Length)
    }

  foo("foo")
  foo("foobar")
  foo("foobarbaz")

  Thread.sleep(1000)
}
An object in Scala can define an apply method, which you can use for this purpose. Your Shared object would then look like this:
object Shared {
  def apply(param1: String, param2: String) = ???
}
Now each client of Shared can pass different values.
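A sketch of how this might look in practice (SharedConfig and its fields are illustrative, not from the question):
// apply builds and returns a configured value instead of relying on global state
case class SharedConfig(appName: String, dbUrl: String)

object Shared {
  def apply(appName: String, dbUrl: String): SharedConfig =
    SharedConfig(appName, dbUrl)
}

object App1 extends App {
  val shared = Shared("app1", "jdbc:postgresql://localhost/app1")
  println(shared.appName) // prints: app1
}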

How to handle different package names in different versions?

I have a third-party library with package foo.bar.
I normally use it as:
import foo.bar.{Baz => MyBaz}

object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}
The new version of the library has renamed the package from foo.bar to newfoo.newbar. I now have another version of my code with a slight change, as follows:
import newfoo.newbar.{Baz => MyBaz}

object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}
Notice that only the first import is different.
Is there any way I can keep a single version of my code and still switch between the different versions of the third-party library as and when needed?
I need something like conditional imports, or an alternative approach.
The other answer is on the right track but doesn't really get you all the way there. The most common way to do this kind of thing in Scala is to provide a base compatibility trait that has different implementations for each version. In my little abstracted library, for example, I have the following MacrosCompat for Scala 2.10:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.Context

  def resultType(c: Context)(tpe: c.Type)(implicit
    tag: ClassTag[c.universe.MethodType]
  ): c.Type = {
    import c.universe.MethodType

    tpe match {
      case MethodType(_, res) => resultType(c)(res)
      case other => other
    }
  }
}
And this one for 2.11:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.whitebox.Context

  def resultType(c: Context)(tpe: c.Type): c.Type = tpe.finalResultType
}
And then my classes, traits, and objects that use the macro reflection API can just extend MacrosCompat and they'll get the appropriate Context and an implementation of resultType for the version we're currently building (this is necessary because of changes to the macros API between 2.10 and 2.11).
(This isn't originally my idea or pattern, but I'm not sure who to attribute it to. Probably Eugene Burmako?)
If you're using SBT, there's special support for version-specific source trees: you can have a src/main/scala for your shared code and e.g. src/main/scala-2.10 and src/main/scala-2.11 directories for version-specific code, and SBT will take care of the rest.
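A minimal build.sbt sketch of that setup (version numbers illustrative; SBT adds the src/main/scala-<binary version> directories automatically when cross-building):
// build.sbt
scalaVersion := "2.11.8"
crossScalaVersions := Seq("2.10.6", "2.11.8")
// shared code:           src/main/scala
// version-specific code: src/main/scala-2.10 and src/main/scala-2.11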
You can try to use type aliases:
package myfoo

object mybar {
  type MyBaz = newfoo.newbar.Baz
  // val MyBaz = newfoo.newbar.Baz // if Baz is a case class/object, it needs to be
  //                               // aliased twice: as a type and as a value
}
Then you may simply import myfoo.mybar._ and replace the object mybar to switch to a different version of the library.
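For the old version of the library, the shim object would simply alias the old package instead (everything else stays untouched):
package myfoo

object mybar {
  type MyBaz = foo.bar.Baz // old package name; client code keeps importing myfoo.mybar._
}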