how to create project files in sbt - scala

Hi, I am new to sbt and I am following this tutorial:
http://www.scala-sbt.org/0.13/tutorial/Hello.html
I followed the same steps in the shell and the program displays "hi".
I am confused: I don't have these files in my hello folder:
Sources in src/main/scala or src/main/java
Tests in src/test/scala or src/test/java
Data files in src/main/resources or src/test/resources
Jars in lib
I also don't have the build.sbt file. I am following this tutorial as it is, and I have only a hw.scala file and a target folder.
My Scala version is 2.11.1 and my sbt version is 0.13.5.
Am I doing something wrong?

Just create build.sbt and write the appropriate lines into it. The same goes for the mentioned directories: stock sbt does not create files and folders for you, but it will recognize them once they're there.
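For the tutorial's hello project, a minimal build.sbt would be a few lines like these (name and version are just the tutorial's examples; adjust scalaVersion to the one you have installed):

name := "hello"

version := "1.0"

scalaVersion := "2.11.1"

Put that file in the hello folder next to hw.scala, create the src/... directories yourself if you need them, and sbt will pick them up on the next run.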

Related

how to package and reference text file in resources folder

I have a spark project using scala and sbt. At one point it references a text file which I want to be packaged.
This is how it is referenced in the application source:
getClass.getResource("/myFile.txt")
This works fine running the source code with sbt run. But I want it to be packaged and deployed to a server.
In build.sbt, after some googling, I got this to work:
import NativePackagerHelper._
mappings in Universal ++= directory("src/main/resources")
Adding this meant that myFile.txt appears in the resources folder in the package, created using
sbt universal:packageBin
resulting folder structure:
target/
  universal/
    bin/
    lib/
    resources/
However, when I run my packaged application from bin/my-application.bat, I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Path does not exist: file:/C:/Software/my-application-0.0.1/lib/my-application-0.0.1.jar!/myFile.txt;
Bear in mind I have zero experience of deploying Scala or JVM-based things, so you may have to spoonfeed me the explanation.
EDIT: I later realised that the text file was in fact included in the .jar file.
The issue then was that getResource does not work in this case, and I had to adapt my code to use getResourceAsStream.
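Roughly, that adaptation can look like this (a sketch only, assuming the contents are read line by line rather than handed to Spark as a file path):

val stream = getClass.getResourceAsStream("/myFile.txt")   // works for a file packaged inside the jar
val lines = scala.io.Source.fromInputStream(stream).getLines().toList
stream.close()
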
This can have multiple reasons.
Include files in your resulting jar
You added this snippet, which is not correct:
import NativePackagerHelper._
mappings in Universal ++= directory("src/main/resources")
The src/main/resources directory is the resourceDirectory in Compile, and its contents are always present in the packaged jar file (not the zip!). So I would highly recommend removing this snippet, as you will otherwise have your files twice on your classpath.
The mappings in Universal (documentation link) define the content of the created package (with universal:packageBin, the zip file). I assume that you are using the JavaAppPackaging plugin, which configures your entire build. By default, all dependencies and your actual build artifact end up in the lib folder. Start scripts are placed in bin.
The start scripts also create a valid classpath, which by default includes all libraries in lib and nothing else.
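If you are on sbt-native-packager with JavaAppPackaging, the relevant wiring is roughly this (a sketch; the plugin version is only an example):

// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")

// build.sbt
enablePlugins(JavaAppPackaging)
// no Universal mapping for src/main/resources is needed:
// its contents are already packaged inside the application jar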
TL;DR You simply put your files in src/main/resources and they will be available on the classpath.
How to find files on the classpath
You posted this snippet
getClass.getResource("/myFile.txt")
This will look up a file called myFile.txt at the root of your classpath. As suggested in the comments, you should open your application jar file and find the text file myFile.txt at its root; otherwise it won't be found.
hope that helps,
Muki

How to add external jar files to a spark scala project

I am trying to use an LSH implementation in Scala (https://github.com/marufaytekin/lsh-spark) in my Spark project. I cloned the repository with some changes to the sbt file (added Organisation).
To use this implementation, I compiled it using sbt compile, moved the jar file to the "lib" folder of my project, and updated the sbt configuration file of my project, which looks like this:
Now when I try to compile my project using sbt compile, it fails to load the external jar file, showing the error message "unresolved dependency: com.lendap.spark.lsh.LSH#lsh-scala_2.10;0.0.1-SNAPSHOT: not found".
Am I following the right steps for adding an external jar file?
How do I solve the dependency issue?
As an alternative, you can build the lsh-spark project yourself and add the jar to your Spark application.
To add external jars, the addJar option can be used when executing the Spark application. Refer to Running Spark applications on YARN.
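For illustration, the programmatic equivalent is SparkContext.addJar (a sketch; the path and app name are hypothetical, and the spark-submit --jars flag achieves the same at launch time):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("lsh-example"))
// make the locally built lsh-spark jar available to the executors
sc.addJar("/path/to/lsh-scala_2.10-0.0.1-SNAPSHOT.jar")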
This issue isn't related to Spark but to the sbt configuration.
Make sure you followed the correct folder structure imposed by sbt and added your jar in the lib folder, as explained here - the lib folder should be at the same level as build.sbt (cf. this post).
You might also want to check out this SO post.
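For reference, a minimal sketch of the unmanaged-jar setup these answers describe (names are examples): the jar goes into lib/ next to build.sbt, and no libraryDependencies entry is declared for it, since everything in lib/ is picked up automatically.

// build.sbt, sitting next to the lib/ folder that contains lsh-scala_2.10-0.0.1-SNAPSHOT.jar
name := "my-spark-app"

scalaVersion := "2.10.4"   // should match the _2.10 suffix of the jar

// note: no libraryDependencies line for lsh-scala -- the jar in lib/ is an unmanaged dependency

An "unresolved dependency" error like the one in the question usually means a libraryDependencies entry for that artifact is still present in build.sbt even though the jar is only available in lib/.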

sbt dependency root catalog differs from intellij libraries after import

I'm using Scala plugin version 1.3.3 for IntelliJ 14.0.3 on my work computer and have started a new sbt project from scratch without any hassle.
But my problems start right here: the build.sbt file has compile errors in IntelliJ; it cannot resolve any line of code. I can, however, auto-import by changing the build.sbt file and adding a library dependency.
So I tested adding scalatest, which gets downloaded to an .ivy2 directory, but a totally different one from the one IntelliJ is using.
This is, however, how my project structure looks; every lib has an error due to a wrong path.
And here is where the SBT plugin is locating all of its dependencies and Scala libs.
I know how to change where the .ivy2 directory is stored by adding these two parameters to
Settings -> Build,Execution, Deployment -> Build Tools -> SBT -> JVM Options -> VM parameters
-Dsbt.ivy.home=c:/.ivy2
-Dsbt.home=c:/.ivy2
But it only works for the .ivy2 folders and not the .sbt folders, which are also in the wrong place. I believe that's the cause of why I can't resolve the symbols in the build.sbt script.
Does anyone know why this is happening, and how I can have just one directory for both the Scala plugin and the IntelliJ project files?

Folders/packages in sbt ./project folder

This looks strange to me, and maybe I'm doing something wrong, but when I try to launch sbt it can't find/compile files from folders/packages inside the ./project folder, like:
root/
  project/
    deploy/
      DeployModule.scala
      DeployConfig.scala
    Build.scala
SBT can't resolve the import build.deploy.DeployModule._, but if I move the files from the deploy folder into the project folder it works. So it looks like SBT can't resolve files in nested folders inside the project folder?
Sbt's meta build uses the default sbt build (with a few minor extras). As such, root-level .scala/.java files are picked up, but if you want things in sub-directories, you'll need to place them like so:
root/
  project/
    src/main/scala/deploy/
      DeployModule.scala
      DeployConfig.scala
    Build.scala
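To make that concrete, a rough sketch of the file contents (the package name comes from the import in the question; the body is hypothetical):

// project/src/main/scala/deploy/DeployModule.scala
package build.deploy

object DeployModule {
  val deployEnv: String = "staging"   // placeholder for whatever the module actually defines
}

With the file under project/src/main/scala, an import build.deploy.DeployModule._ in project/Build.scala resolves as expected.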

sbt eclipse command changed files and packages

I created a new Scala project in Eclipse, then added a package and a Scala object.
So far so good...
I want to add an external library, so I added a project folder with build.properties and plugins.sbt files, and another file, build.sbt, in the project root.
In the terminal I successfully compiled the project with the sbt compile task.
The problem is that after the sbt eclipse command the Eclipse project changed from a Scala project to something else... all the packages changed to plain folders and the Scala project is ruined.
Scala IDE: Build id: 3.0.3-20140327-1716-Typesafe
Scala version: 2.10.4
sbt version: 0.13.0
You can see it in the image.
Where did you get the eclipse command from? I'm guessing you're using sbteclipse.
I created a new Scala project in eclipse then added a package and Scala object , So far so good ...
If I understand correctly, this is exactly the opposite of what the plugin is intended to do. I think you're supposed to create a plain sbt project, and then let the plugin generate the Eclipse project.
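For reference, the usual sbteclipse workflow (assuming that is the plugin in use; the version below is only an example) is to declare it in the build and let it generate the Eclipse metadata:

// project/plugins.sbt
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")

Then running sbt eclipse from the project root writes the .project and .classpath files, and the project is imported into Eclipse as an existing project rather than created there first.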