How can I include generated source files when my Play routes and views are compiled?
My Play 2.3 application uses a plugin that generates source files under a sourceManaged subdir (target/scala-2.11/src_managed/main/subdir). These source files include controllers and models that are referenced in my routes files and views. But when I compile my application, I get errors like this:
[error] myapp/conf/routes:14: object Contacts is not a member of package controllers
[error] GET /contacts controllers.Contacts.blank()
and this:
[error] myapp/app/views/contact/form.scala.html:1: not found: type Contact
[error] #(contactForm: Form[Contact])
[error] ^
These errors occur because controllers/Contacts.java and models/Contact.java reside under sourceManaged.
I've tried manually adding the appropriate managed sources subdir to sourceDirectory in Compile and javaSource in Compile in my build.sbt, but it did not improve things.
I have considered making the managed source subdir a subproject and then using aggregate(), but it does not have the necessary build.sbt or project files -- it only has Java sources. And it seemed that turning a managed source directory into a subproject might be inappropriate. Should I reconsider this?
First, make sure the plugin has a way to be added to sourceGenerators in Compile by your Play project. The sbt documentation on generating files explains how to do this. I also have an example in a plugin I wrote, but note that it uses 0.12.x syntax.
Once you've done that, be sure one of your Play project build files adds the settings. This can be as simple as adding the name you used for the settings in the plugin to the build file, as the example in my plugin shows.
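For illustration, here is a minimal sketch of wiring a generator task into sourceGenerators in Compile using 0.13.x syntax (the generator call is a hypothetical placeholder for whatever your plugin actually provides):
sourceGenerators in Compile += Def.task {
  // Write sources under the managed sources directory and return the
  // list of generated files, so sbt compiles them with the project.
  val outDir = (sourceManaged in Compile).value / "subdir"
  MyPlugin.generateSources(outDir) // hypothetical; returns Seq[File]
}.taskValue
Once the generator is registered this way, the generated controllers and models are compiled together with your routes and views.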
Related
In this question, there is discussion of how to include a jar file in an sbt project. I need both a .jar file and some .so library files.
Attempt 1:
I can move the jar file into my sbt lib/ directory, which is great, except that in this application the small jar is just a wrapper around C++ software. The functionality I want is in the .so library files, and if I move the jar file to lib/ by itself, I get linking errors:
sbt:SimpleProject> run linearSalience
[info] Running linearSalience linearSalience
[error] (run-main-0) java.lang.UnsatisfiedLinkError: no java_salience in java.library.path
[error] java.lang.UnsatisfiedLinkError: no java_salience in java.library.path
[error] at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
Attempt 2:
I have tried putting symlinks to the desired libraries into the lib/ folder, but that didn't work. I don't want to copy the entire library into the lib/ folder, even if that would work, since it is almost 2 GB; and it would be silly to copy it for each project.
Attempt 3:
I tried setting java.library.path through the javaOptions of sbt, by adding the following line to build.sbt
javaOptions in run += "-Djava.library.path=/opt/path/to/lib/:/opt/path/to/sdk/java/src/"
The first path contains the .so files, the second the .jar file. In this case, the compiler couldn't even find the packages (which was not a problem while the .jar file was in the lib/ folder of sbt):
[info] Compiling 1 Scala source to /opt/optests/sbttest/target/scala-2.10/classes ...
[error] /opt/optests/sbttest/src/main/scala/SimpleApp.scala:6:12: object lexalytics is not a member of package com
[error] import com.lexalytics.salience.{Salience, Section, Sentence, Word}
[error] ^
[error] /opt/optests/sbttest/src/main/scala/SimpleApp.scala:23:23: not found: type Salience
[error] val session = new Salience("/opt/path/to/license.v5", "/opt/path/to/data")
[error] ^
...etc
Attempt 4:
I tried setting the LD_LIBRARY_PATH environment variable (as suggested here):
[user#server ~]$ echo $LD_LIBRARY_PATH
/opt/path/to/lib/:/opt/path/to/sdk/java/src/
The result is the same error as in Attempt 3.
It seems like all the questions on this topic are resolved by either putting single jar files into lib/ or using managed dependencies (as here). But I have a local-only library that is not available in any online repository and consists of more than a single .jar file.
Ultimately, I need to get the library directory into java.library.path, but how do I do that?
Note: This is not a duplicate of any question that deals with only .jar files and has no mention of .so files.
When you use a JNI wrapper over a native library on the JVM (the language doesn't matter; it can be Java or Scala), the wrapper jar usually contains only the JNI glue code, defining how Java API calls map to the native library calls. This jar is just a regular library, so it can be dropped into sbt's lib/ folder as usual.
The native library itself must be discoverable through java.library.path at runtime, so you were quite close. As you tried in Attempt 3, you can point javaOptions in run at the native SDK while keeping the jar in the lib/ folder (removing the jar from lib/ is what caused the compile errors there), and it should work.
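A minimal build.sbt sketch of that combination, using the paths from the question (note that javaOptions only take effect when the run task forks a separate JVM):
// Keep the JNI wrapper jar in lib/ so it stays on the compile classpath.
fork in run := true // javaOptions are only applied to a forked JVM
javaOptions in run += "-Djava.library.path=/opt/path/to/lib/"
With the jar in lib/ the compiler finds the packages, and with java.library.path set the forked JVM finds the .so files.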
I have a Spark project using Scala and sbt. At one point it references a text file which I want to be packaged.
This is how it is referenced in the application source:
getClass.getResource("/myFile.txt")
This works fine when running the source code with sbt run, but I want the application packaged and deployed to a server.
In build.sbt, after some googling, I got this to work:
import NativePackagerHelper._
mappings in Universal ++= directory("src/main/resources")
Adding this means that myFile.txt appears in the resources folder of the package, which is created using
sbt universal:packageBin
Resulting folder structure:
target/
  universal/
    bin/
    lib/
    resources/
However, when I run my packaged application from bin/my-application.bat, I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Path does not exist: file:/C:/Software/my-application-0.0.1/lib/my-application-0.0.1.jar!/myFile.txt;
Bear in mind I have zero experience of deploying Scala or JVM-based things, so you may have to spoonfeed me the explanation.
EDIT: I later realised that the text file was in fact included in the .jar file.
The issue then was that getResource does not work in this case (the returned URL points inside the jar and cannot be opened as a plain file path, which is what Spark attempted), and I had to adapt my code to use getResourceAsStream.
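For reference, a minimal sketch of the adapted read, assuming the file sits at the root of src/main/resources:
import scala.io.Source

// getResourceAsStream works whether the resource is a loose file
// on disk or packed inside the application jar.
val stream = getClass.getResourceAsStream("/myFile.txt")
val lines = Source.fromInputStream(stream).getLines().toList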
This can have multiple causes.
Include files in your resulting jar
You added these lines, which are not correct:
import NativePackagerHelper._
mappings in Universal ++= directory("src/main/resources")
The src/main/resources directory is the resourceDirectory in Compile, and its contents are always present in the packaged jar file (not the zip!). So I would highly recommend removing this snippet, as otherwise you will have your files twice on the classpath.
The mappings in Universal (documentation link) define the content of the created package (with universal:packageBin, the zip file). I assume that you are using the JavaAppPackaging plugin, which configures your entire build. By default, all dependencies and your actual build artifact end up in the lib folder. Start scripts are placed in bin.
The start scripts also create a valid classpath, which includes all libraries in lib and nothing else by default.
TL;DR You simply put your files in src/main/resources and they will be available on the classpath.
How to find files on the classpath
You posted this snippet
getClass.getResource("/myFile.txt")
This will look up a file called myFile.txt at the root of your classpath. As suggested in the comments, you should open your application jar file and check that a text file myFile.txt is at its root; otherwise it won't be found.
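A quick sanity check you can run from your application (a minimal sketch):
// getResource returns null when the resource is missing from the classpath.
val url = getClass.getResource("/myFile.txt")
require(url != null, "myFile.txt is not at the classpath root")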
Hope that helps,
Muki
I have a multi-module Scala project with the following structure:
-A/
-B/
-project/ (root project)
-build.sbt (this has the build definition for all the subprojects)
I have an object declared in the project/ folder (let's call this object Dependencies) which contains various constants. Is it possible to access a variable declared in project/Dependencies.scala from Scala code inside a subproject (A or B) without creating a dependency of any of the subprojects on the root project?
Please let me know if I need to clarify further.
If you want to make some code from your build definition available for the code in the project, you can use sbt-buildinfo plugin. It's mostly adapted for setting/task keys, but you can use it for any other values defined in your build too.
The way it works is very simple: it uses sourceGenerators to generate a piece of Scala code (with the values from the build) that will be available to the rest of your project sources. So if you don't want to use sbt-buildinfo, you can also generate sources directly. See sbt documentation on Generating files.
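For example, a minimal sketch of exposing a value from project/Dependencies.scala to subproject A via sbt-buildinfo (the key and package names are illustrative, and the plugin must be added to project/plugins.sbt):
lazy val A = (project in file("A"))
  .enablePlugins(BuildInfoPlugin)
  .settings(
    // Expose a build-time constant to the subproject's sources.
    buildInfoKeys := Seq[BuildInfoKey]("someConstant" -> Dependencies.someConstant),
    buildInfoPackage := "mybuild"
  )
Code in A can then read mybuild.BuildInfo.someConstant without depending on the root project.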
Our project features a kind of ad hoc "plugin" that reads CSV files and stuffs the contents into a database.
This code is defined in /project/DataMigrate.scala
We had our own poorly implemented version of a CSV parser that isn't up to the task anymore, so I tried to add https://github.com/tototoshi/scala-csv to the libraryDependencies in /project/Build.scala, but that did not make it importable from DataMigrate.scala. I also tried putting the library dependency in /project/build.sbt, as I read something about a "build definition of a build definition", but that did not seem to help either.
Is it at all possible to add dependencies for code that lives in /project?
SBT is recursive, so just as you can define dependencies and settings of the actual project in [something].sbt or project/[something].scala, you can define dependencies and settings of the project's project (any ad hoc plugins etc.) in project/[something].sbt or project/project/[something].scala.
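So for the scala-csv case, a file like this should make the library importable from project/DataMigrate.scala (the version number is illustrative):
// project/build.sbt: dependencies for the build definition itself
libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.3.10"
After a reload, code in /project can import com.github.tototoshi.csv._ as usual.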
I am trying to follow the tutorial on compiling a simple DSL using Delite+LMS. I compiled LMS and Delite successfully. Now, following this tutorial closely: http://stanford-ppl.github.io/Delite/myfirstdsl.html, I run into problems when I try to build my profiling DSL. It seems that the compiler cannot find the Delite collection classes:
felix#felix-UX32VD:~/Documents/phd/delite/Delite$ sbt compile
Loading /home/felix/sbt/bin/sbt-launch-lib.bash
[info] Loading project definition from /home/felix/Documents/phd/delite/Delite/project
[info] Set current project to delite (in build file:/home/felix/Documents/phd/delite/Delite/)
[info] Compiling 5 Scala sources to /home/felix/Documents/phd/delite/Delite/dsls/profiling/target/scala-2.10/classes...
[error] /home/felix/Documents/phd/delite/Delite/dsls/profiling/src/example/profiling/Profile.scala:7: object DeliteCollection is not a member of package ppl.delite.framework.datastruct.scala
[error] import ppl.delite.framework.datastruct.scala.DeliteCollection
[error] ^
[error] /home/felix/Documents/phd/delite/Delite/dsls/profiling/src/example/profiling/Profile.scala:69: not found: type ScalaGenProfileArrayOps
[error] with ScalaGenDeliteOps with ScalaGenProfileOps with ScalaGenProfileArrayOps
[error] ^
Does someone have some insight into what I'm doing wrong?
From the SBT manual:
Library dependencies can be added in two ways:
unmanaged dependencies are jars dropped into the lib directory
managed dependencies are configured in the build definition and downloaded automatically from repositories (through Apache Ivy, exactly like Maven)
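To make the distinction concrete, a brief sketch (the coordinates are illustrative):
// Unmanaged: copy some-lib.jar into <project>/lib/; no build change needed.
// Managed: declare coordinates in build.sbt and let sbt resolve them:
libraryDependencies += "org.example" % "some-lib" % "1.0.0"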
In any case, adding code inside a framework project is a bad idea, because you will have to change the build process (for example, adding an extra module). In addition, you might have to recompile all the code of the framework, which would be very slow.
The right way to make your code depend on a framework is:
Reference the library as a managed dependency available in some kind of repository (best solution).
Copy the jar into the lib folder of your project and add it as an unmanaged dependency.
Since apparently Delite is not available on any Ivy repo, the best approach is to clone the Git repo and publish it locally. See http://www.scala-sbt.org/release/docs/Detailed-Topics/Publishing.html
Publishing Locally
The publishLocal command will publish to the local Ivy repository. By default, this is in ${user.home}/.ivy2/local. Other projects on the same machine can then list the project as a dependency. For example, if the SBT project you are publishing has configuration parameters like:
name := "My Project"
organization := "org.me"
version := "0.1-SNAPSHOT"
Then another project can depend on it:
libraryDependencies += "org.me" %% "my-project" % "0.1-SNAPSHOT"