How to export properties of shared case classes - scala.js

I am trying to share a case class between server and client. I am using upickle on both ends. The objects and their data are nicely available on both ends.
shared class
case class Foo(var id: Long, var title: Description)
However I need to export the fields of the case class on the client side. I could add the @JSExportAll annotation, but that means pulling the Scala.js libraries into the server project.
Is there perhaps a better way of exposing the members id and title to JavaScript?
Thanks.

The right way to export things to JavaScript is to use the @JSExportAll annotation. You cannot, and should not, pull the Scala.js libraries into the server project, though. For this use case, we have a dedicated JVM artifact, scalajs-stubs, which you can add to your JVM project like this:
libraryDependencies += "org.scala-js" %% "scalajs-stubs" % scalaJSVersion % "provided"
As a "provided" dependency, it will not be present at runtime. But it will allow you to compile the JVM project even though it references JSExportAll.
See also the ScalaDoc of scalajs-stubs.
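For illustration, a minimal sketch of what the shared source can look like once the stub artifact is in place; Description is the asker's own type, so a stand-in alias is added here just to keep the sketch self-contained:
import scala.scalajs.js.annotation.JSExportAll

object shared {
  type Description = String // stand-in for the asker's own type

  // Compiles on both platforms: Scala.js provides the real annotation,
  // while scalajs-stubs provides a no-op stand-in on the JVM.
  @JSExportAll
  case class Foo(var id: Long, var title: Description)
}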

Related

How to import Scalajs Facade with react-native? Error: $g.algoliasearch is not a function

I want to use the Algoliasearch facade in my Scala.js app. I imported it through sbt, but on the UI I see $g.algoliasearch is not a function. I assume this is because of missing JavaScript libraries. I have included the algoliasearch-client-javascript dependency in my package.json and yarn installed it.
Now I'm not sure how to provide/link those dependencies when I use the facade. If someone could provide some example code, that would be helpful.
I am talking about any one of these facades https://github.com/DefinitelyScala/scala-js-algoliasearch
I have added jsDependencies like this
jsDependencies ++= Seq(
  "org.webjars.bower" % "github-com-algolia-algoliasearch-client-javascript" % "3.18.1" / "3.18.1/reactnative.js"
)
I was hoping that the above would declare something like this in my scalajs.output.ios file:
var Algoliasearch = require(...)
but I do not see any entry like this, and I'm still not sure how to import it to avoid the following error:
$g.algoliasearch is not a function
Any idea what I'm missing here?
There are two options:
1) Simply include the JavaScript dependencies as separate <script> tags, as normal for JavaScript and HTML. Note that these must come before the <script> tag for the Scala.js application.
2) Use the jsDependencies mechanism to build a -jsDeps.js file, which collects all of your dependencies into one file, and include that with a <script> tag. (Again, before the application itself.)
Note that the facade library can provide the jsDependencies, and your application can use that, but I don't personally recommend that -- it can lead to versioning conflicts between dependencies. It tends to work best if your application simply loads the JavaScript itself.
If you are sure that your HTML file is already including the JavaScript, and you're getting this error, check to make sure that the JavaScript is being loaded before the Scala.js application runs.
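For option 2, a minimal sketch of the sbt side, assuming the Scala.js 0.6.x plugin (the generated file is named after the project, e.g. app-jsdeps.js):
// build.sbt: collect all jsDependencies into a single <name>-jsdeps.js file
skip in packageJSDependencies := false
Then include that <name>-jsdeps.js file with its own <script> tag, before the tag that loads the Scala.js application itself.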

How to shade a dependency for a non-executable Scala Library?

Spent a few hours trying to figure out how to do this. Over the course of it I have looked at a few seemingly promising questions but none of them seem to quite fit what I'm doing.
I've got three library jars, let's call them M, S, and H. Library M has things like:
case class MyModel(x: Int, s: String)
and then library S uses the play-json library, version 2.3.8, to provide implicit serializers for the classes defined by M
import play.api.libs.json._

trait MyModelSerializer {
  implicit val myModelFormat = Json.format[MyModel]
}
These are then bundled up into a convenience object for importing:
package object implicits extends MyModelSerializer with FooSerializer // etc
That way, when library H performs HTTP calls to various services, it just imports the implicits from S and I call Json.validate[MyModel] to get back the models I need from my web services. This is all well and dandy, but I'm working on an application that's running Play 2.4, and when I included H in the project and tried to use it I ran up against:
java.lang.NoSuchMethodError: play.api.data.validation.ValidationError.<init>(Ljava/lang/String;Lscala/collection/Seq;)
Which I believe is being caused by play 2.4 using play-json version 2.4.6. Unfortunately, these are a minor version apart and this means that trying to just use the old library like:
// In build.sbt
"com.typesafe.play" %% "play-json" % "2.3.8" force()
Results in all the code in the app failing to compile, because I'm using things like JsError.toJson which weren't part of play-json 2.3.8. I could change the 14 or so places trying to use that method, but given the exception above I have a feeling that even if I do that it's not going to help.
Around this point I remembered that back in my Maven days I could shade dependencies during my build process. So I got to thinking that if I could shade the play-json 2.3.8 dependency in H, that would solve the problem, since the problem seems to be that calling Json.* in H uses the Json object from play-json 2.4.6.
Unfortunately, the only thing I can find online that indicates the ability to shade is sbt-assembly. I found a great answer on how to do that for a fat jar. But I don't think I can use sbt-assembly because H isn't executable, it's just a library jar. I read through a question like my own but the answer refers to sbt-assembly so it doesn't help me.
Another question seems somewhat promising but I really can't follow how I would use it / where I would be placing the code itself. I also looked through the sbt manual, but nothing stuck out to me as being what I need.
I can't just change S to use play-json 2.4.6 because we're using H in a play 2.3 application as well. So it needs to be able to be used in both.
Right now, the only thing I can really think to do, if I can't get some kind of shading done, is to make H not use S and instead require some kind of serializer/deserializer implicitly, wiring in the appropriate JSON (de)serializer from each application (sketched below). So here I am asking how to properly shade with sbt for something that isn't an executable jar, because I only want to do a rewrite if I absolutely have to. If I missed something (like sbt-assembly being able to shade for non-executable jars as well), I'll take that as an answer if you can point me to the docs I must have missed.
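For concreteness, a hypothetical sketch of that fallback; parseModel is an invented name, and the real H would thread the implicit through its HTTP helpers:
import play.api.libs.json._

// H no longer imports S's implicits; each application passes in a Format
// built against whatever play-json version it actually ships with.
def parseModel[A](body: String)(implicit format: Format[A]): JsResult[A] =
  Json.parse(body).validate[A]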
As indicated by Yuval Itzchakov, sbt-assembly doesn't have to build an executable jar and can shade library code as well. In addition, you can package the jar without any transitive dependencies except the ones that need to be shaded; this keeps the packaged jar's size down and lets the rest of the dependencies come through as usual.
Hunting down the transitive dependencies manually is what I ended up having to do, but if anyone has a way to do that automatically, that'd be a great addition to this answer. Anyway, this is what I needed to do to the H library's build file to get it properly shading the play-json library.
1) Figure out what the dependencies are using show compile:dependencyClasspath at the sbt console
2) Grab anything play-related (since I'm only using play-json and no other play libraries, I can assume play = needs shading)
3) Also shade the S models, because they rely on play-json as well; to keep transitive dependencies from bringing a non-shaded play-json 2.3.8 back in, I have to shade my serializers too
4) Add sbt-assembly to the project and then update the build.sbt file
build.sbt
//Shade rules for all things play:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("play.api.**" -> "shade.play.api.@1").inAll
)

//Grabbed from the "publishing" section of the sbt-assembly readme, excluding the "assembly" classifier
addArtifact(artifact in (Compile, assembly), assembly)

// Only the play stuff and the "S" serializers need to be shaded since they use/introduce play:
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  val toIncludeInPackage = Seq(
    "play-json_2.11-2.3.8.jar",
    "play-datacommons_2.11-2.3.8.jar",
    "play-iteratees_2.11-2.3.8.jar",
    "play-functional_2.11-2.3.8.jar",
    "S_2.11-0.0.0.jar"
  )
  cp filter { c => !toIncludeInPackage.contains(c.data.getName) }
}
And then I don't get any exceptions anymore from trying to run it. I hope this helps other people with similar issues, and if anyone has a way to automatically grab dependencies and filter by them I'll happily update the answer with it.

How to use BLAS library in Spark?

I'm new to Scala and I'm writing a Spark application in Scala, and I need to use the axpy function from org.apache.spark.mllib.linalg.BLAS. However, it appears to be inaccessible to users. Instead I tried to import com.github.fommil.netlib and access it directly, but I couldn't do that either. I need to multiply two DenseVectors.
Right now, the BLAS class within mllib is marked private[spark] in the Spark source code. This means it is not accessible outside of Spark itself, as you seem to have figured out. In short, you can't use it in your code.
If you want to use netlib-java classes directly, you need to add the following dependency to your project
libraryDependencies += "com.github.fommil.netlib" % "all" % "1.1.2" pomOnly()
That should allow you to import the BLAS class. Note, I haven't really tried to use it, but I am able to execute BLAS.getInstance() without a problem. There might be some complexities in the installation on some Linux platforms as described here - https://github.com/fommil/netlib-java.
Add mllib to your project
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.0"

Using a custom class loader for a module dependency in SBT

I have a multi-module SBT build consisting of api, core and third-party. The structure is roughly this:
api
|- core
|- third-party
The code for third-party implements api and is copied verbatim from somewhere else, so I don't really want to touch it.
Because of the way third-party is implemented (heavy use of singletons), I can't just have core depend on third-party. Specifically, I only need to use it via the api, but I need to have multiple, isolated copies of third-party at runtime. (This allows me to have multiple "singletons" at the same time.)
If I'm running outside of my SBT build, I just do this:
def createInstance(): foo.bar.API = {
  // URLClassLoader wants an Array[URL], not a plain path string
  val urls = Array(new java.io.File("path/to/third-party.jar").toURI.toURL)
  val loader = new java.net.URLClassLoader(urls, parent)
  loader.loadClass("foo.bar.Impl").asSubclass(classOf[foo.bar.API]).newInstance()
}
But the problem is that I don't know how to figure out at runtime what I should give as an argument to URLClassLoader if I'm running via sbt core/run.
This should work, though I haven't quite tested it with your setup. The basic idea is to let sbt write the classpath into a file that you can use at runtime. sbt-buildinfo already provides a good basis for this, so I'm going to use it here, but you might extract just the relevant part and not use the plugin at all.
Add this to your project definition:
lazy val core = project enablePlugins BuildInfoPlugin settings (
  buildInfoKeys := Seq(BuildInfoKey.map(exportedProducts in (`third-party`, Runtime)) {
    case (_, classFiles) ⇒ ("thirdParty", classFiles.map(_.data.toURI.toURL))
  })
...
At runtime, use this:
def createInstance(): foo.bar.API = {
  val loader = new java.net.URLClassLoader(buildinfo.BuildInfo.thirdParty.toArray, parent)
  loader.loadClass("foo.bar.Impl").asSubclass(classOf[foo.bar.API]).newInstance()
}
exportedProducts only contains the compiled classes for the project (e.g. .../target/scala-2.10/classes/). Depending on your setup, you might want to use fullClasspath instead (which also contains the libraryDependencies and dependent projects) or any other classpath-related key.
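As a hedged variation on the snippet above, switching to fullClasspath only changes the key being mapped:
buildInfoKeys := Seq(BuildInfoKey.map(fullClasspath in (`third-party`, Runtime)) {
  // attributed classpath entries expose their files via .data, same as exportedProducts
  case (_, classpath) ⇒ ("thirdParty", classpath.map(_.data.toURI.toURL))
})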

How to share code between project and build definition project in SBT

I have written some source code in my build definition project (in /project/src/main/scala) in SBT. Now I want to use these classes in the project I am building as well. Is there a best practice? Currently I have created a custom task that copies the .scala files over.
Those seem like unnecessarily indirect mechanisms.
unmanagedSourceDirectories in Compile += baseDirectory.value / "project/src/main"
Sharing sourceDirectories as in extempore's answer is the simplest way to go about it, but unfortunately it won't work well with IntelliJ because the project model doesn't allow sharing source roots between multiple modules.
Seth Tisue's approach will work, but requires rebuilding to update sources.
To actually share the sources and have IntelliJ pick up on it directly, you can define a module within the build.
The following approach seems to only work in sbt 1.0+
Create a file project/metabuild.sbt:
val buildShared = project

val buildRoot = (project in file("."))
  .dependsOn(buildShared)
and in your build.sbt:
val buildShared = ProjectRef(file("project"), "buildShared")

val root = (project in file("."))
  .dependsOn(buildShared)
Then put your shared code in project/buildShared/src/main/scala/ and refresh; IntelliJ will pick up buildShared as a module of its own alongside your project.
Full example project: https://github.com/jastice/shared-build-sources
Can you make the following work? The source code for the classes in question should be part of your project, not part of your build definition; the "task which serializes a graph of Scala objects using Kryo and writes them as files into the classpath of the project" part sounds like a perfect job for resourceGenerators (see http://www.scala-sbt.org/0.13.2/docs/Howto/generatefiles.html). Then the only remaining problem is how to reference the compiled classes from your resource generator.
I'm not familiar with Kryo. In order to use it, do you need to have the compiled classes on the classpath at the time your generator is compiled, or do they just need to be on the classpath at runtime? If the latter is sufficient, that's easier: you can get a classloader from the testLoader in Test key, load the class, instantiate some objects via reflection, and then call Kryo.
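As a hedged illustration of the resourceGenerators route (the file name and the ??? placeholder are mine, not part of the original answer):
// build.sbt: generate a resource at build time
resourceGenerators in Compile += Def.task {
  val out = (resourceManaged in Compile).value / "graph.bin"
  val bytes: Array[Byte] = ??? // e.g. load classes via a classloader, instantiate via reflection, serialize with Kryo
  java.nio.file.Files.write(out.toPath, bytes)
  Seq(out)
}.taskValue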
If you really need the compiled classes to be on the classpath when your resource generator is compiled, then you have a chicken and egg problem where the build can't be compiled until the project has been compiled, but of course the project can't be compiled before the build definition has been compiled, either. In that case it seems to me you have no choices other than:
1) The workaround you're already doing ("best practice" in this case would consist of using sourceGenerators to copy the sources out of your build definition and into target/src_managed; a sketch follows after this list)
2) Put the classes in question in a separate project and depend on it from both your build and your project. This is the cleanest solution overall, but you might consider it too heavyweight.
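As a hedged sketch of option 1, with illustrative paths that you would adjust to your layout:
// build.sbt: copy shared sources out of the build definition into src_managed
sourceGenerators in Compile += Def.task {
  val srcDir = baseDirectory.value / "project" / "src" / "main" / "scala" // adjust to where the shared sources live
  val outDir = (sourceManaged in Compile).value / "shared"
  val pairs = (srcDir ** "*.scala").get.map(f => (f, outDir / f.getName))
  IO.copy(pairs)
  pairs.map(_._2)
}.taskValue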
Hope this helps. Interested in seeing others' opinions on this, too.