IntelliJ: 'sbt' is not recognized

I have installed the sbt plugin in IntelliJ, but when I try to run an sbt command, it is not recognized:
>sbt clean compile
'sbt' is not recognized as an internal or external command,
operable program or batch file.
Shouldn't sbt plugin take care of this?
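A likely explanation, assuming the Windows setup shown in the error: the IntelliJ sbt plugin runs sbt inside the IDE (the sbt shell tool window) but does not put an sbt command on the system PATH. To run sbt from a regular terminal, install sbt itself and add its bin directory to PATH; after that the same command works from any shell:
>sbt clean compile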

Compiling with scalac does not find sbt dependencies

I tried running my Scala code in the VSCode editor. I am able to run my script via the spark-submit command, but when I try to compile with scalac, I get:
.\src\main\scala\sample.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.{SQLContext,SparkSession}
I have already added the corresponding library dependencies to build.sbt.
Have you tried running sbt compile?
Running scalac directly means you're compiling only one file, without the benefits of sbt, especially the dependencies you have added in your build.sbt file.
In an sbt project, there's no reason to use scalac directly; it defeats the purpose of sbt.
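For reference, a minimal build.sbt sketch of this setup (the project name and versions are illustrative, not taken from the question):

// build.sbt -- the dependency that scalac alone does not see
name := "sample"
scalaVersion := "2.12.18"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0"

With this in place, sbt compile resolves the org.apache.spark packages, whereas a bare scalac invocation has nothing on its classpath.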

How to run a Scala .jar with external jar files in the terminal

I have a .jar built from my Scala classes, and it has an external dependency on another jar. Please suggest how I should run my jar files in the terminal. The command I tried is
$scala my_scala.jar external.jar
It works the same way as running a Java program. Try this:
scala -classpath <your_scala_jar>:<external_jar> <package.MainClass>
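On Windows, separate the classpath entries with ; instead of :. For concreteness, a sketch with hypothetical names of what <package.MainClass> refers to:

// src/main/scala/com/example/Main.scala -- hypothetical entry point inside my_scala.jar
package com.example

object Main {
  def main(args: Array[String]): Unit =
    println("running with external.jar on the classpath")
}

which would be launched with scala -classpath my_scala.jar:external.jar com.example.Main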

Compile a single Scala file with Typesafe Activator

I have Activator installed, which means I have a full sbt on my system. I don't want to create a brand new Activator project. All I want to do is compile a single Scala file, as we used to do with the scalac command. How can I do this, please? Thanks.
Go into the directory containing your Scala file and type "sbt compile" on the command line.
To run the program, type "sbt run".
See also http://www.scala-sbt.org/0.13/tutorial/Hello.html
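As a minimal sketch (the file name and contents are illustrative), a single Scala file in the current directory is all sbt needs:

// Hello.scala -- sbt picks up .scala files in the project's base directory
object Hello {
  def main(args: Array[String]): Unit = println("Hello from sbt")
}

sbt compile builds it and sbt run executes it, with no extra project files required.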

Running a Spark sbt project without sbt?

I have a Spark project which I can run from the sbt console. However, when I try to run it from the command line, I get Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext. This is expected, because the Spark libs are listed as provided in build.sbt.
How do I configure things so that I can run the JAR from the command line, without having to use sbt console?
To run Spark stand-alone, you need to build a Spark assembly.
Run sbt/sbt assembly in the Spark root dir. This will create assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
Then build your job jar with its dependencies (either with sbt-assembly or the maven-shade-plugin).
You can use the resulting binaries to run your Spark job from the command line:
ADD_JARS=job-jar-with-dependencies.jar SPARK_LOCAL_IP=<IP> java -cp spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar:job-jar-with-dependencies.jar com.example.jobs.SparkJob
Note: if you need a different HDFS version, you have to follow additional steps before building the assembly; see "About Hadoop Versions".
Using the sbt-assembly plugin, we can create a single jar. After doing that, you can simply run it with the java -jar command.
For more details, refer to the sbt-assembly documentation.
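A minimal sketch of that setup, assuming the sbt-assembly plugin and illustrative versions (not taken from the question):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

// build.sbt -- Spark stays "provided", so it must be on the classpath at run time
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.0" % "provided"

sbt assembly then produces a single fat jar under target/. If Spark is marked provided, as in the question, supply the Spark jars on the classpath (or use spark-submit) when running it; if Spark is bundled into the jar instead, java -jar on the fat jar is enough.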

Error message with Scala installation

I would like to install Scala 2.8.1. I followed exactly the steps from
http://www.scala-lang.org/node/310
but when I type 'scala' in the command prompt, I get the following error:
'java' is not recognized as an internal or external command, operable program or batch file.
Do you know what's going wrong?
As the error message points out, java is missing. You have to install a JRE or JDK and make Scala find it, for example by setting JAVA_HOME to the JRE/JDK installation.
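On Windows, where this error comes from, that means roughly the following (the JDK path is illustrative, not from the question):
set JAVA_HOME=C:\Program Files\Java\jdk1.6.0_45
set PATH=%PATH%;%JAVA_HOME%\bin
Set these permanently through the system environment variables dialog so that both java and scala are found in every new command prompt.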
Have fun with Scala