Building Scala from source - scala

I'm trying to figure out how to build Scala from a source code archive. I see a build.sbt file, but I don't have Scala installed, so how do I build Scala?
I also see a Gemfile, which implies there are Ruby bindings. I checked the README.md, but sadly there isn't any information there.
I don't know where to start with the build.

I'm assuming that you're talking about Scala 2 (https://github.com/scala/scala) and not Scala 3 (https://github.com/lampepfl/dotty), since you mentioned the Gemfile (https://github.com/scala/scala/blob/2.13.x/Gemfile).
The Gemfile is for Travis CI, so you can ignore it.
If you can see a build.sbt, then in order to build the project you need a JVM and sbt installed, not Scala:
https://docs.scala-lang.org/getting-started/index.html
I checked the README.md but there isn't any information there sadly.
Actually, all the necessary information is in the README:
https://github.com/scala/scala#using-the-sbt-build
sbt dist/mkBin generates runner scripts (scala, scalac, etc) in build/quick/bin
sbt dist/mkPack creates a build in the Scala distribution format in build/pack
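A typical session, sketched from the README's instructions (assuming you have a JVM and sbt on your PATH), might look like this:
$ git clone https://github.com/scala/scala.git
$ cd scala
$ sbt dist/mkBin
$ build/quick/bin/scala -version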

This tool (https://scala-cli.virtuslab.org/) will let you easily download a working Scala compiler and its associated tools. They, in turn, will make it possible to build a compiler from the source tree. I'm assuming you've cloned https://github.com/lampepfl/dotty (Scala 3)?
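For example, a minimal smoke test that the downloaded toolchain works (hello.sc is just a hypothetical scratch script):
$ scala-cli version
$ echo 'println("hello")' > hello.sc
$ scala-cli run hello.sc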

Related

What is the purpose of the `scala-2.11` folder in IntelliJ IDEA

The scala-2.11 folder appeared after a recent update of IDEA and the Scala plugin.
What should it be used for?
Usually such directories are used for binary version-dependent code. For example, macros in 2.10 are not source-compatible with macros in 2.11, so if you're building your project for different binary versions and you're using macros, it makes sense to put the code that is only valid for a specific version in different source roots. SBT will then use the appropriate directory when compiling for 2.10 or 2.11.
If you're using SBT, though, you need to set such a thing up manually in the build definition. If you're not using SBT, then the IDEA plugin was probably updated to handle this by itself.
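For illustration, a sketch of such a manual setup in build.sbt (sbt 0.13+ syntax; the directory names are just examples):
unmanagedSourceDirectories in Compile += {
  // pick a version-specific source root based on the Scala binary version
  val sourceDir = (sourceDirectory in Compile).value
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => sourceDir / "scala-2.11"
    case _             => sourceDir / "scala-2.10"
  }
}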

What is Scala's Simple Build Tool (sbt) and why is it used?

I am new to Scala and I have to learn Scala and SBT. I read the sbt tutorial, but I was unable to understand what sbt is used for. After reading the tutorial
I am still confused. Can anyone explain it in simple words, and also suggest a tutorial for the simple build tool?
When you write small programs that consist of only one, or just two or three source files, then it's easy enough to compile those source files by typing scalac MyProgram.scala in the command line.
But when you start working on a bigger project with dozens or maybe even hundreds of source files, then it becomes too tedious to compile all those source files manually. You will then want to use a build tool to manage compiling all those source files.
sbt is such a tool. There are other tools too; well-known build tools from the Java world include Ant and Maven.
How it works is that you create a project file that describes what your project looks like; when you use sbt, this file is called build.sbt. It contains information about your project, such as its name, the Scala version to use, and the libraries it depends on, while the source files themselves are found by convention (under src/main/scala). Sbt reads the file and then knows what to do to compile the complete project.
Besides managing your project, some build tools, including sbt, can automatically manage dependencies for you. This means that if you need to use some libraries written by others, sbt can automatically download the right versions of those libraries and include them in your project for you.
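For example, a minimal build.sbt might look like this (the name and version numbers are just placeholders):
// basic project metadata
name := "my-app"

version := "0.1.0"

scalaVersion := "2.11.8"

// a third-party library that sbt downloads and puts on the classpath for you
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.6" % "test"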

Scala dependency on Spark installation

I am just getting started with Spark, so I downloaded the "for Hadoop 1 (HDP1, CDH3)" binaries from here and extracted them on an Ubuntu VM. Without installing Scala, I was able to execute the examples in the Quick Start guide from the Spark interactive shell.
Does Spark come included with Scala? If yes, where are the libraries/binaries?
For running Spark in other modes (distributed), do I need to install Scala on all the nodes?
As a side note, I observed that Spark has some of the best documentation among open source projects.
Does Spark come included with Scala? If yes, where are the libraries/binaries?
The project configuration is placed in the project/ folder. In my case it is:
$ ls project/
build.properties plugins.sbt project SparkBuild.scala target
When you do sbt/sbt assembly, it downloads the appropriate version of Scala along with the other project dependencies. Check out the target/ folder, for example:
$ ls target/
scala-2.9.2 streams
Note that the Scala version is 2.9.2 for me.
For running Spark in other modes (distributed), do I need to install Scala on all the nodes?
Yes. You can create a single assembly jar as described in the Spark documentation:
If your code depends on other projects, you will need to ensure they are also present on the slave nodes. A popular approach is to create an assembly jar (or “uber” jar) containing your code and its dependencies. Both sbt and Maven have assembly plugins. When creating assembly jars, list Spark itself as a provided dependency; it need not be bundled since it is already present on the slaves. Once you have an assembled jar, add it to the SparkContext as shown here. It is also possible to submit your dependent jars one-by-one when creating a SparkContext.
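For example, with sbt and the sbt-assembly plugin, marking Spark as provided is one line in the build definition (the artifact and version here match the 0.8.0-incubating era and are only illustrative):
// Spark is already present on the cluster, so keep it out of the assembly jar
libraryDependencies += "org.apache.spark" % "spark-core_2.9.3" % "0.8.0-incubating" % "provided"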
Praveen -
I checked the fat master jar now.
/SPARK_HOME/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar
This jar includes all the Scala binaries plus the Spark binaries.
You are able to run because this file is added to your CLASSPATH when you run spark-shell.
Check here: run spark-shell > http://<machine>:4040 > Environment > Classpath Entries.
If you downloaded a pre-built Spark, then you don't need to have Scala on the nodes; having this file in the CLASSPATH on the nodes is enough.
Note: I deleted the last answer I posted, because it may have misled someone. Sorry :)
You do need Scala to be available on all nodes. However, with the binary distribution created via make-distribution.sh, there is no longer a need to install Scala on every node. Keep in mind the distinction between installing Scala, which is necessary to run the REPL, and merely packaging Scala as just another jar file.
Also, as mentioned in the file:
# The distribution contains fat (assembly) jars that include the Scala library,
# so it is completely self contained.
# It does not contain source or *.class files.
So Scala does indeed come along for the ride when you use make-distribution.sh.
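For instance, a sketch of producing and inspecting such a distribution (run with no arguments here; check the script's header for the flags your Spark version supports):
$ ./make-distribution.sh
$ ls dist/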
From Spark 1.1 onwards, there is no SparkBuild.scala.
You have to make your changes in pom.xml and build using Maven.

How to set up a Scala project in IntelliJ IDEA that uses git libraries

I would like to give IntelliJ IDEA a try, but I have no idea how to get going.
I am simply trying to create a new project that uses the Finagle Echo Server, hosted on GitHub, as a starting point.
Assume I'm starting with a clean install on a Mac. I installed IDEA and added the Scala and SBT plugins. What steps should I take to create a project that uses Finagle and run the code in the HTTP server example?
PLEASE help. I realize my question sounds like a stupid one, but there are so many different approaches to working with Scala projects (the SBT command line, Scala-IDE, IDEA, etc.) that I simply don't know how to get a comfortable development environment going.
A manual solution that doesn't require you to use SBT for your project might be more straightforward, given the SBT versioning issues. You'll still use SBT to build finagle, though.
Install the SBT runner script per step 1 of the SBT-based procedure below. (It can handle SBT 0.7 projects too.)
Manually git clone git://github.com/twitter/finagle.git.
cd to the finagle directory and type "sbt package". Its dependencies should end up under lib_managed, and it should produce the finagle jars themselves under target/ or some such (just note the locations in the command output).
Create an IDEA project from scratch, and manually add dependencies to the finagle jars and their dependencies (under Project Structure->Dependencies).
This answer assumes that you want to use SBT. Also, I should qualify that this is my usual procedure, but I haven't confirmed that it works with finagle in particular.
0. Install IDEA, with Scala and SBT plugins. (Done by the OP; here for others)
1. Install SBT (automatic method). Copy this handy sbt runner script to a convenient location (or, if you want to keep it up to date, git clone https://github.com/paulp/sbt-extras.git and symlink the script into ~/bin), and make sure it's executable. It will automatically download whatever it needs based on the sbt.version specified in your build.properties.
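For example, the clone-and-symlink route sketched in words above might look like this (paths are up to you):
$ git clone https://github.com/paulp/sbt-extras.git
$ mkdir -p ~/bin
$ ln -s "$PWD/sbt-extras/sbt" ~/bin/sbt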
2. Install sbt-idea. sbt-idea is an SBT plugin (not an IDEA plugin) that generates IDEA module files from an SBT project. It's convenient to install this globally since it's not project-specific. You don't have to download anything manually; just add this to ~/.sbt/plugins/build.sbt:
resolvers += "sbt-idea-repo" at "http://mpeltonen.github.com/maven/"
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "0.11.0")
3. Create SBT project. Create a directory for your project, and a "project" directory within it. Create project/Build.scala as follows:
import sbt._

object MyBuild extends Build {
  lazy val root = Project("root", file(".")) dependsOn finagle
  lazy val finagle = RootProject(uri("git://github.com/twitter/finagle.git"))
}
See the SBT documentation for lots more options regarding configuring your project. Note we have to use the Full Configuration here (not just build.sbt) in order to express the GitHub dependency.
It's also a good idea to create project/build.properties:
sbt.version=0.11.2
project.version=0.1
build.scala.versions=2.9.1
4. Generate IDEA project. cd to the directory containing the sbt-based project and type "sbt gen-idea". If all goes well, the directory will now have ".idea" and ".idea_modules" subdirectories.
5. Open the project in IDEA. It may be necessary to fix the target JDK version in the project settings. Aside from that, the project should be ready to go, with all source paths, library dependencies, etc. properly configured.

sbt and antlr, got a simple example?

Does anyone have an example of how to set up sbt to build an ANTLR file (to Scala) and then compile the resulting code?
My file layout
src/main/scala/Test.scala // scala test rig
src/main/scala/Test.g // antlr grammar
build/antlr/TestParser.scala // antlr output files
build/antlr/TestLexer.scala
What should my sbt contain? I know there's a plugin out there for pulling in the rules for ANTLR, but I haven't been able to make it work. (I'm still a newbie to this world.)
I've written an sbt plugin to generate the parser and lexer code from a provided ANTLR grammar file. You can download the code on my github page http://github.com/stefri/sbt-antlr. It's also listed in the sbt plugin list https://github.com/harrah/xsbt/wiki/sbt-0.10-plugins-list. The latest snapshot uses ANTLR 3.3 and is available via my github maven repository for the sbt 0.11.x series. If you need another ANTLR version, it's easy to change and rebuild; I'm still working on the configuration options.
The usage is quite simple: just include the plugin and my maven repository in ./project/plugins/build.sbt:
resolvers += "stefri" at "http://stefri.github.com/repo/snapshots"
addSbtPlugin("com.github.stefri" % "sbt-antlr" % "0.2-SNAPSHOT")
Then place your ANTLR3 grammar files in src/main/antlr3. They will be included in your next build.
Make sure you also include the plugin's settings, sbtantlr.SbtAntlrPlugin.antlrSettings, in your project settings, e.g. if you are using the simple configuration approach, add the following line
seq(sbtantlr.SbtAntlrPlugin.antlrSettings: _*)
to your build.sbt file. Note that sbt-antlr generates the source code only once; as long as your grammar file doesn't change, it does not re-generate the Java source files.
The generated Java files are spit out to target/scala-2.9.1/src_managed/main/antlr3, so make sure you include that path in your IDE's build path. The plugin is still a work in progress, but it already works quite nicely with my grammars.
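If you use the full configuration approach instead of build.sbt, a sketch of the equivalent wiring in project/Build.scala might look like this (the project name is a placeholder):
import sbt._

object MyBuild extends Build {
  // mix in the sbt-antlr settings so grammars under src/main/antlr3 get compiled
  lazy val root = Project("root", file(".")).settings(sbtantlr.SbtAntlrPlugin.antlrSettings: _*)
}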