Use SBT to launch an application task/batch?

Can SBT be used to launch Scala application batch jobs?
I mean, application-level Scala code that would, for example, decrement all the users' balances.
I can imagine a dedicated SBT project for that, with a build dependency on the application code, so that some SBT tasks of that project can launch the application code.
Would you do such a thing? Why?
What are the alternatives for making it easy to launch application batches in Scala? With Maven I used to use the appassembler plugin, which would generate sh scripts and handle all the classpath-related stuff.

A simple way is to use sbt run.
object Main {
  def main(args: Array[String]): Unit = {
    // decrement all user balances
  }
}
Then run sbt run from a shell, Jenkins, etc.
Use command-line arguments to extend the functionality.
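For instance, a minimal sketch that dispatches on the first argument (the batch name here is hypothetical):
object Main {
  def main(args: Array[String]): Unit = args.headOption match {
    case Some("decrement-balances") => decrementAllBalances()
    case Some(other)                => sys.error(s"unknown batch: $other")
    case None                       => sys.error("usage: sbt \"run <batch-name>\"")
  }

  // hypothetical batch job
  def decrementAllBalances(): Unit = {
    // decrement all user balances here
  }
}
Invoke it as sbt "run decrement-balances" (the quotes keep the argument attached to the run command).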


Running multiple Scala apps from one jar on JVM

I have a Scala application that successfully runs on the JVM using an uber jar via the command: java -jar myapp.jar. I need to create a separate, but related Scala job that utilizes many of the same objects/functions/dependencies as the first, making it a great candidate to keep in the same code repository & uber jar. Please note that these Scala jobs do not utilize Spark, so spark2-submit is out of the equation.
Question: How can I run 2 separate Scala jobs from the same uber jar on the JVM? (I am using Scala 2.11.8 and SBT for jar assembly)
Additional Context:
I've already looked into related StackOverflow discussions, namely this post about specifying Java classes using java -cp myapp.jar MyClass and this post, which only presented the solution of running the Scala equivalent using scala -classpath myapp.jar MyClass.
While the scala -classpath solution may have worked for the OP of the second linked discussion, I'll be deploying my code to an environment that doesn't have executables for scala or sbt, only java.
Let's say these are the 2 Scala jobs I want to run:
// MyClass.scala
package mypackage

object MyClass {
  def main(args: Array[String]): Unit = {
    println("Hello, World!")
  }
}

// MyClass2.scala
package mypackage

object MyClass2 {
  def main(args: Array[String]): Unit = {
    println("Hello, World! This is the second job!")
  }
}
Is there a way to run Scala code using java -cp myapp.jar MyClass?
I've tried this and receive the following error:
Error: Could not find or load main class MyClass
The main alternative I can think of would be to create a Scala object that serves as a main entry point and takes a parameter to determine which job gets run. I'd like to avoid that solution if possible, but it would allow me to continue using java -jar myapp.jar, which has been working fine.
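For reference, a rough sketch of what that dispatcher alternative might look like (the job names here are hypothetical):
// Dispatcher.scala (sketch)
package mypackage

object Dispatcher {
  def main(args: Array[String]): Unit = args.headOption match {
    case Some("job1") => MyClass.main(args.tail)
    case Some("job2") => MyClass2.main(args.tail)
    case _            => sys.error("usage: java -jar myapp.jar <job1|job2>")
  }
}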
You need to use the fully qualified name of the main class:
java -cp myapp.jar mypackage.MyClass
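Both jobs can then be run from the same uber jar:
java -cp myapp.jar mypackage.MyClass
java -cp myapp.jar mypackage.MyClass2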

Possible to compile and run scala-akka scripts from command-line without build tools like sbt?

I am a beginner learning to program with Scala and Akka, and I have had no problems running my scripts from the IntelliJ IDE or with 'sbt run'. However, I can't seem to find any resources that teach me how to manually use scalac and the Akka jar dependencies to compile and run just from the command line. Can anyone point me in the right direction?
Let's assume you have Scala and Akka installed somewhere under /home/leo/apps/ and the Scala binaries are on your path (e.g. export PATH=$PATH:/home/leo/apps/scala-2.11.8/bin).
Next, let's say you have a Scala main app Tweets.scala along with a few supplementary classes packaged in akkastreams under /home/leo/myproject/:
akkastreams/
Tweets.scala
Author.scala
HashTag.scala
Message.scala
...
Here's how you'll compile and run the app:
cd /home/leo/myproject/
# Compile all files in package akkastreams:
scalac -cp "/home/leo/apps/akka-2.4.9/lib/akka/*" akkastreams/*.scala
# Run the main app Tweets (object Tweets extends App):
# Note that the classpath also includes the current directory '.'
scala -cp "/home/leo/apps/akka-2.4.9/lib/akka/*:." akkastreams.Tweets
A few notes:
You could include only the specific Akka jars you need instead of all of them.
Without sbt managing dependencies and versions for you, you'll need to manually keep the Akka jars bundled with the Scala distribution consistent with the standalone Akka ones you compile against.
While it's a good exercise to see how things work under the hood, it's obviously unproductive to do this on a regular basis.
In my opinion you should run scalac and scala with the classpath parameter pointing at the selected library jar files.
That said, it's still more convenient to use sbt.
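For example, a sketch using just the actor jar (the jar file name below is an assumption about Akka 2.4.9's bundled layout):
# compile against a single selected Akka jar
scalac -cp "/home/leo/apps/akka-2.4.9/lib/akka/akka-actor_2.11-2.4.9.jar" akkastreams/*.scala
# run with the same jar plus the current directory on the classpath
scala -cp "/home/leo/apps/akka-2.4.9/lib/akka/akka-actor_2.11-2.4.9.jar:." akkastreams.Tweets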

How do I distribute a Scala macro as a project?

Suppose I have a Scala compile-time macro that I find useful and would like to share (I do). How do I create a JAR file that, when loaded into another project, would execute the macro when compiling the new project?
Specifically, I've made a StaticAnnotation that rewrites the AST of the class it wraps at compile time. This works in my Maven build (macro defined in the main directory, running on test cases in the test directory) because I have
<compilerPlugins>
  <compilerPlugin>
    <groupId>org.scalamacros</groupId>
    <artifactId>paradise_2.10.5</artifactId>
    <version>2.1.0-M5</version>
  </compilerPlugin>
</compilerPlugins>
in my scala-maven-plugin. (I'm starting with a Scala 2.10 project and if it works, will provide both 2.10 and 2.11.)
But if I put the resulting JAR on a Scala console classpath, in a Scala script, or into another Maven project (without special compiler plugins), it simply ignores the macro: the AST does not get overwritten and my compile-time println statements don't execute. If I use the @compileTimeOnly annotation on my macro (new in Scala 2.11), then it complains with the @compileTimeOnly error message.
Do I really need to tell my users to add compiler plugins in their pom.xml files, as well as alternate instructions for SBT and other build tools? Other packages containing macros (MacWire, Log4s) don't come with complicated build instructions: they just say, "point to this dependency in Maven Central." I couldn't find the magic in their build process that makes this work. What am I missing?
If you're relying on a macro-paradise-only feature then yes, you do need to tell your users to add compiler plugins. See http://docs.scala-lang.org/overviews/macros/annotations.html . The projects you mention are only using the scala compiler's built-in (non-paradise) macro features, not macro annotations.
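For sbt users, the equivalent of the Maven compiler plugin above is an addCompilerPlugin line in the build, for example (version matching the one in the question):
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0-M5" cross CrossVersion.full)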

How do I set up databinder dispatch to use in Eclipse?

I want to write some simple HTTP requests in Scala, but the Databinder Dispatch library only has instructions for sbt. As I'm a relative Eclipse newbie, can someone provide instructions on how I use it in my Scala project in Eclipse?
I'm using Scala 2.9.0final. If it's incompatible with Dispatch, is there an alternative HTTP request library?
http://dispatch.databinder.net/Try+Dispatch.html
Thanks!
The page you have linked to has instructions for trying out Dispatch using the sbt console. It is much easier to just do that on the command line, although if you are set on doing this through Eclipse you can read up on your integration options.
If you want to set up a project and write compilable code that uses Dispatch, you should follow this guide, which shows you how to pull in Dispatch as a dependency with either Maven or sbt.
The main thing is that you want the Dispatch jars and their dependencies on your project classpath in Eclipse before you can start playing with it; sbt makes this easy for Scala, as Maven does for Java. So you should look around for how to do that in Eclipse to see your options.
Dispatch is built on top of Apache HttpClient, which is a pure Java library (so it can be used from Scala directly). But if you want to use Dispatch, you can:
Git clone the example
Install sbt
Run sbt update
Look into lib_managed/scala_${version}/compile dir
Write a sample scala script Script.scala:
import dispatch._

val h = new Http
val req = url("http://www.scala-lang.org/")
// >>> streams the response body to the given OutputStream
val handler = req >>> System.out
h(handler)
// or, equivalently, inline:
h(url("http://www.scala-lang.org/") >>> System.out)
Run the script with the proper classpath. On Linux you can do
scala -cp `echo lib_managed/scala_${version}/compile/*.jar | sed 's/ /:/g'` Script.scala
Enjoy!
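Alternatively, if you'd rather pull Dispatch in as a managed dependency in an sbt 0.10+ build.sbt, something like the following should work (the exact artifact name and version for Scala 2.9.0 are assumptions; check Maven Central):
libraryDependencies += "net.databinder" %% "dispatch-http" % "0.8.3" // version is an assumption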

Run tests in broken project using SBT

When doing a serious refactor in a Java Eclipse project I will often break the build, but focus on getting one test to pass at a time. When running the tests Eclipse warns that the project cannot be compiled, but it will still run the tests it can compile.
Now I'm using SBT and would like to achieve the same thing with test-only, but it tries to compile the whole project, fails, and doesn't run the tests. How can I tell it to just compile the bits it can and run the tests?
You should add the following task to your project definition:
import sbt._

class Project(info: ProjectInfo) extends DefaultProject(info) {
  lazy val justTest = testTask(testFrameworks, testClasspath, testCompileConditional.analysis, testOptions)
}
This is the same as the ordinary test task, but has no dependencies attached at the end. If you'd like it to have dependencies, call dependsOn on the testTask(...) expression and provide the tasks you want it to depend on.
testTask(testFrameworks, testClasspath, testCompileConditional.analysis, testOptions).dependsOn(testCompile, copyResources, copyTestResources)
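Then, assuming sbt 0.7's convention of turning camelCase task names into hyphenated actions, you can run it from the sbt shell:
> just-test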