Use function without explicit import in Scala

I like Scala's sys.error function - but I want to distinguish two cases: internal errors (e.g. database problem) and user errors (invalid input).
I tried extending Scala - but it doesn't seem to work:
package scala

class UserException(msg: String) extends RuntimeException(msg)

package object err {
  def internal(message: String): Nothing =
    sys.error(message)
  def usr(message: String): Nothing =
    throw new UserException(message)
}
How should I define err.usr() to be able to use it without an explicit import?

You can't: only scala.Predef is imported by default, and it is not user-extensible in any useful way.

You could put these definitions in the package object of your package hierarchy. Then everything in that package will see them without an import.
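For example, a minimal sketch assuming your code lives under a top-level package named myapp (the package name and helper names are only illustrative):

// file myapp/package.scala
package object myapp {
  class UserException(msg: String) extends RuntimeException(msg)

  def internal(message: String): Nothing =
    sys.error(message)

  def usr(message: String): Nothing =
    throw new UserException(message)
}

// any file inside package myapp sees the helpers without an import
package myapp

object Validation {
  def checkAge(age: Int): Int =
    if (age < 0) usr("age must be non-negative") else age
}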

Related

Is it possible to automatically load an implicit def if included as a dependency (no importing)

I'm working on a commons library that includes a config library (https://github.com/kxbmap/configs).
This config library uses "kebab-case" when parsing configuration files by default and it can be overridden by an implicit def in scope.
However, I don't want to force that on the users of my commons library when they get access to the config library transitively.
So without me forcing users to import this implicit, like:
import CommonsConfig._
can I somehow override the naming strategy via an implicit that gets into scope merely by having my commons library on the classpath? I'm guessing no, but I just have to ask :)
So if not, is someone aware of another approach?
kxbmap/configs isn't documented well enough to explain this.
Thanks!
Implicits are resolved at compile time, so they cannot magically appear when something is on the classpath and disappear when it isn't.
The closest thing would be something like:
main library:

package my.library
// classes, traits, objects, but no package object

extension:

package my

package object library {
  // implicits
}

user's code:

import my.library._
However, that would only work if there were no package object in the main library; only one extension library could pull off this trick at a time (Scala doesn't tolerate more than one package object for the same package), and the user would still always have to wildcard-import everything in the package.
In theory you could create a wrapper around all your deps, with your own configs:
final case class MyLibConfig(configsCfg: DerivationConfig)

object MyLibConfig {
  implicit val default: MyLibConfig = ...
}
and then derive using this wrapper
def parseThings(args...)(implicit myLibConfig: MyLibConfig) = {
  implicit val config: DerivationConfig = myLibConfig.configsCfg
  // derivation
}
but in practice it would not work (parseThings would have to know the target type already, or would need the already-derived implicits passed in). Unless you are up for writing your own derivation methods... avoid it.
Some way of making the user import all the relevant stuff is the most maintainable strategy. E.g. you could pull off the same thing the library's authors did: add type aliases for all the types you use, do the same for the companion objects, and finally put some implicits there:
package my

package object library {
  type MyType = some.library.Type
  val MyType = some.library.Type
  implicit val derivationConfig: DerivationConfig = ...
}
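On the user's side, a single wildcard import then brings the aliases and the implicit configuration into scope (MyType and derivationConfig are the illustrative names from above):

import my.library._

// MyType and its companion alias resolve through the package object,
// and derivationConfig is picked up implicitly during derivation
val cfg = implicitly[DerivationConfig]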

How to use JSImport when writing scalajs facade for javascript modules

I have written a facade using JSImport, and it works. Unfortunately, I arrived at the solution through trial and error, and I don't fully understand why this particular solution works but others I tried did not.
Background: I'm starting with a working project, built with sbt, which is a single page application that implements the client side code with scala.js and the server side with scala and the Play framework. The javascript libraries were packaged with web jars and bundled into the client js file using the sbt jsDependencies variable. I wanted to implement some new features which required a library up rev, which then required an up rev of some javascript libs which were only available in npm format. So now I am including all the javascript dependencies for the client app using npmDependencies with the scalajs-bundler plugin. This broke some of the scalajs facades leading to my question.
I'll use the facade to log4javascript as an example for this question.
The variable log4javascript is the top level object used to access the rest of the api.
When the js libs were included as web jars, this is how the facade to log4javascript was implemented:
@js.native
@js.annotation.JSGlobalScope
object Log4JavaScript extends js.Object {
  val log4javascript: Log4JavaScript = js.native
}
After the change to npm:
import scala.scalajs.js.annotation.JSImport.Namespace

@JSImport("log4javascript", Namespace)
@js.native
object Log4JavaScript extends js.Object {
  def resetConfiguration(): Unit = js.native
  def getLogger(name: js.UndefOr[String]): JSLogger = js.native
  ...
}
Following the scala.js docs on importing modules, I expected the object name (Log4JavaScript in this case) would have to match the exported symbol name in order for the binding to work. However, the top level symbol in log4javascript.js is log4javascript. After experimenting, it seems the Scala object name makes no difference for the binding. It binds correctly no matter what I name the top-level Scala object.
Can someone explain what relationship exists, if any, between the scala object/class/def/val names and the names in the javascript module when using the 'Namespace' arg to JSImport?
According to the scala.js docs, it seems I should be able to provide the actual name of the js object (I also tried "Log4JavaScript")
@JSImport("log4javascript", "log4javascript")
@js.native
object SomeOtherName extends js.Object {
  def resetConfiguration(): Unit = js.native
  def getLogger(name: js.UndefOr[String]): JSLogger = js.native
  ...
}
However, this fails to bind. I will get a runtime error when I try to access any of the member functions.
Log4JavaScript.resetConfiguration()
Uncaught TypeError: Cannot read property 'resetConfiguration' of undefined
Can someone explain why this doesn't work?
log4javascript also defines some classes inside the scope of log4javascript. When the lib was included as a web jar the definition looked like:
@js.native
@JSGlobal("log4javascript.AjaxAppender")
class AjaxAppender(url: String) extends Appender {
  def addHeader(header: String, value: String): Unit = js.native
}
After switching to npm I had to put the class definition inside the top level object:
@js.native
trait Appender extends js.Object {
  ...
}

@JSImport("log4javascript", "log4javascript")
@js.native
object Log4JavaScript extends js.Object {
  ...
  class AjaxAppender(url: String) extends Appender {
    def addHeader(name: String, value: String): Unit = js.native
  }
  ...
}
This seems sensible, but from the scala.js docs it seems like it should have been possible to define it this way outside of the top level object
@JSImport("log4javascript", "log4javascript.AjaxAppender")
@js.native
class AjaxAppender(url: String) extends Appender {
  def addHeader(name: String, value: String): Unit = js.native
}
However, this also fails to bind. Could someone explain the correct way to define the class as above? Or is the definition nested inside the Log4JavaScript object the only correct way to do it?
Can someone explain what relationship exists, if any, between the scala object/class/def/val names and the names in the javascript module when using the 'Namespace' arg to JSImport?
This is explained in this part of the Scala.js documentation. The name of the Scala object defining the facade does not matter. What matters are the parameters of the @JSImport annotation. The first one indicates which module to import from, and the second one indicates what to import.
In your case, the log4javascript module is in the log4javascript.js file, in the log4javascript package directory. So, your first parameter should be:
@JSImport("log4javascript/log4javascript.js", ...)
object Log4JavaScript ...
However, log4javascript is defined as an npm module whose main file refers to the log4javascript.js file. This means that you can just use the package directory name:
@JSImport("log4javascript", ...)
object Log4JavaScript ...
(See this article for more information on how Node.js resolves modules.)
The second parameter of the @JSImport annotation indicates what to import. In your case, you want to import the whole module, not just a member of it, so you want to use Namespace:
@JSImport("log4javascript", Namespace)
object Log4JavaScript ...
This corresponds to the following EcmaScript import statement:
import * as Log4JavaScript from 'log4javascript'
Note that, although the Scala object name (Log4JavaScript, here) does not matter, the names of its members do matter, as explained in this part of the Scala.js documentation.
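For illustration, a sketch of how the binding works under Namespace (the object name and the js.Dynamic return type are assumptions made here for brevity): each member binds to the module export with the same name, unless you rename it explicitly with @JSName.

import scala.scalajs.js
import scala.scalajs.js.annotation.{JSImport, JSName}

@js.native
@JSImport("log4javascript", JSImport.Namespace)
object AnyNameWorks extends js.Object {
  // binds to the module's resetConfiguration export via the member name
  def resetConfiguration(): Unit = js.native

  // the Scala name differs, so @JSName supplies the JavaScript name
  @JSName("getLogger")
  def logger(name: js.UndefOr[String]): js.Dynamic = js.native
}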
According to the scala.js docs, it seems I should be able to provide the actual name of the js object (I also tried "Log4JavaScript")
@JSImport("log4javascript", "log4javascript")
...
However, this fails to bind. I will get a runtime error when I try to access any of the member functions.
When you write that, you try to access the log4javascript member of the log4javascript module. But that module does not have such a member.
it should have been possible to define it this way outside of the top level object
@JSImport("log4javascript", "log4javascript.AjaxAppender")
...
However, this also fails to bind.
Again, this means “import the log4javascript.AjaxAppender member from the log4javascript module”, but that module does not have such a member. The following should work:
@JSImport("log4javascript", "AjaxAppender")
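Put together, a sketch of the top-level class facade under that assumption (i.e. that the module exports AjaxAppender as a direct member; Appender is the trait from the question):

@js.native
trait Appender extends js.Object

@js.native
@JSImport("log4javascript", "AjaxAppender")
class AjaxAppender(url: String) extends Appender {
  def addHeader(name: String, value: String): Unit = js.native
}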

How could a "global implicit class" be defined in Scala?

Considering that an implicit class "must be defined inside of another trait/class/object", how can an implicit conversion be defined globally?
The case is that I'd like to add a method to all Strings (or Lists) in my application, or at least to several packages of it.
One cannot add anything to the "global" scope, in Java or in Scala.
However, in Scala one can define package objects, which can contain methods that are used all over the package, and can be easily imported by the user.
This looks something like this: in the directory foo/bar/baz one creates a file called package.scala with the following content:
package foo.bar

package object baz {
  implicit def incrediblyUsefulConversion(s: String) = ...
}
The user then can do the following in his code to activate the conversion:
import foo.bar.baz._
or maybe
import foo.bar.baz.incrediblyUsefulConversion
Of course, you can also use the conversion in your own code in other packages, just like any other user.
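A minimal sketch of the same idea with an implicit class adding a method to String (the method name shout is purely illustrative):

package foo.bar

package object baz {
  implicit class RichString(s: String) {
    // available on every String inside foo.bar.baz without an import
    def shout: String = s.toUpperCase + "!"
  }
}

// elsewhere, after import foo.bar.baz._
// "hello".shout  // "HELLO!"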

How to define a global function in scala?

I'm using the Play framework and I want to define a global function. How can I do it?
Currently I define the function in SomeFunc.scala and import it into every file in which I use it.
Is it possible to use it directly, like println, without importing SomeFunc.scala?
println is defined in the object scala.Predef, whose members are always in scope. There is no way to add to that, but as the question linked to by senia says, you can achieve much the same by defining a method in a package object, which will then be available to code inside that package.
Another solution that some libraries use is to provide an Imports object with aliases and shortcuts, just like Predef, but one that you have to import explicitly with a wildcard. For example nscala-time does this:
import com.github.nscala_time.time.Implicits._
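The same pattern is easy to provide yourself; a sketch of such an Imports object (all names here are illustrative, not taken from nscala-time):

package mylib

object Imports {
  type Duration = java.time.Duration

  implicit class IntDurationOps(n: Int) {
    def seconds: Duration = Duration.ofSeconds(n.toLong)
  }
}

// user's code
// import mylib.Imports._
// val timeout = 30.seconds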
Yes, it is possible, but only global within the same package, not absolutely global.
package com

package object myproject {
  def myGlobalFunc(..) = ...
}

Then you use it like this:

package com.myproject

object HelloWorld {
  def main(args: Array[String]) {
    myGlobalFunc(...)
  }
}

In Scala, how can I define a companion object for a class defined in Java?

I'd like to add implicit conversions to Java classes generated by a modeling tool. So I want to add them to the companion object of those classes, so that the compiler automatically finds them. But I cannot add them in a separate file, because the companion has to be defined in the same file. Is there anything I can do about this?
Of course, I can define all my implicit conversions in another object and then bring it into scope, but this requires an extra import. Any other solution?
You can define your own companion object of course, which I often do in my own project-specific Predef-like arrangement. For example:
object domain {
  type TimeUnit = java.util.concurrent.TimeUnit

  object TimeUnit {
    def valueOf(s: String) = java.util.concurrent.TimeUnit.valueOf(s)
    val Millis = java.util.concurrent.TimeUnit.MILLISECONDS
    // etc
  }
}
Then this can be used:
import my.domain._
val tu : TimeUnit = TimeUnit.valueOf("MILLISECONDS")
But note that your domain.TimeUnit is a module (i.e. a Scala object), not a true companion of the Java class.
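To connect this back to the original question, a sketch of putting an implicit conversion for a generated Java class into the same Predef-like object (GeneratedPerson, getFirstName and getLastName are hypothetical names from a modeling tool, not from the source):

object domain {
  type Person = com.example.generated.GeneratedPerson

  implicit class RichPerson(p: Person) {
    def fullName: String = p.getFirstName + " " + p.getLastName
  }
}

// still an extra import, but only one, for all conversions:
// import my.domain._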
With the Scala compiler as it stands now there is no way to define companion objects other than by putting them in the same file. The best you can do is a non-companion object with the same package and name and an extra import.
If you can think of a good way to create post-hoc companionship without breaking assumptions about encapsulation please come post on http://groups.google.com/group/scala-debate because it would clearly be a very useful feature.