how to import a class in scala

I have recently been learning Scala, and packages in Scala confuse me.
I have a file named StockPriceFinder.scala:
// StockPriceFinder.scala
object StockPriceFinder {
  def getTickersAndUnits() = {
    val stocksAndUnitsXML = scala.xml.XML.load("stocks.xml")
    (Map[String, Int]() /: (stocksAndUnitsXML \ "symbol")) {
      (map, symbolNode) =>
        val ticker = (symbolNode \ "#ticker").toString
        val units = (symbolNode \ "units").text.toInt
        map ++ Map(ticker -> units)
    }
  }
}
Then I want to use StockPriceFinder in test.scala, which is in the same folder:
val symbolAndUnits = StockPriceFinder.getTickersAndUnits
But when I run it with scala test.scala, I get the error: not found: value StockPriceFinder. In Java, if these two source files are in the same folder, I do not need any import and can use the class directly, so how can I import StockPriceFinder correctly in Scala?
I have tried to use import StockPriceFinder in test.scala, but it still does not work.

You don't need to import StockPriceFinder if the files are in the same package (not folder).
But you do need to compile StockPriceFinder.scala first and pass the correct classpath to scala.
scalac StockPriceFinder.scala
scala -cp . test.scala
should work (the exact commands might be a bit off). However, you shouldn't do this manually, since it becomes unmanageable very quickly; use sbt or another build tool (Maven, Gradle).
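For completeness, here is one way the calling side could look as a compiled program rather than a script; the Main object name and the printing loop are just an illustration, not part of your code:

// Main.scala (hypothetical driver; compile and run alongside StockPriceFinder.scala):
//   scalac StockPriceFinder.scala Main.scala
//   scala -cp . Main
object Main {
  def main(args: Array[String]): Unit = {
    val symbolAndUnits = StockPriceFinder.getTickersAndUnits()
    symbolAndUnits.foreach { case (ticker, units) => println(s"$ticker: $units") }
  }
}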


Autofix directory structure based on package in scala

I have a file src/main/scala/foo.scala which needs to be inside package bar. Ideally the file should be inside src/main/scala/bar/foo.scala.
// src/main/scala/foo.scala
package bar
// ...
How can I auto-fix this issue throughout my project such that the folder structure matches the package structure?
Is there any SBT plugin etc that can help me fix this issue?
As far as I am aware, there are no such tools, though AFAIR IntelliJ can warn about package-directory mismatches.
The best I can think of is a custom scalafix (https://scalacenter.github.io/scalafix/) rule: scalafix/scalameta would be used to check a file's actual package, translate it to an expected directory, and, if they differ, move the file.
I suggest scalafix/scalameta because there are corner cases, for example:
you are allowed to write your packages like:
package a
package b
package c
and it is almost like package a.b.c, except that it automatically imports everything from a and b (see the sketch after this list)
you can have a package object in your file, and then if you have
package a.b
package object c
this file should be in the a/b/c directory
so I would prefer to check whether a file falls under any of those cases using some existing tooling.
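To illustrate the first corner case, a minimal sketch (all names made up):

// Hedged illustration of chained package clauses; this file would be expected
// under a/b/c/ even though no single "package a.b.c" line appears in it.
package a
package b
package c

object Chained {
  // Definitions from packages a and a.b are visible here without imports.
  val where: String = "a.b.c"
}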
If you are certain that you don't have such cases (I wouldn't be without checking), you could:
match the first line with a regexp (^package (.*))
translate a.b.c into a/b/c (matched.split('.').map(_.trim).mkString(File.separator))
compare the generated location to the actual location (I suggest resolving absolute file locations)
move the file if necessary
If there is a possibility of a more complex case than that, I would replace the first step by querying the scalafix/scalameta utilities. A rough sketch of the simple approach follows below.
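Something along these lines should work for the simple case (an untested sketch in plain Scala; it deliberately looks only at the first package line of each file, precisely because of the corner cases above; the object name and the src/main/scala default are made up):

import java.io.File
import java.nio.file.{Files, StandardCopyOption}
import scala.io.Source

object FixPackageDirectories {
  private val packageRegex = """^package\s+([\w.]+)\s*$""".r

  // Recursively collect .scala files under a directory.
  private def scalaFilesUnder(dir: File): Seq[File] = {
    val children = Option(dir.listFiles).getOrElse(Array.empty[File]).toSeq
    val (dirs, files) = children.partition(_.isDirectory)
    files.filter(_.getName.endsWith(".scala")) ++ dirs.flatMap(scalaFilesUnder)
  }

  def fix(sourceBase: File): Unit =
    scalaFilesUnder(sourceBase).foreach { file =>
      val source = Source.fromFile(file)
      val firstPackage =
        try source.getLines().collectFirst { case packageRegex(pkg) => pkg }
        finally source.close()

      firstPackage.foreach { pkg =>
        // Translate a.b.c into a/b/c under the source base.
        val expectedDir = new File(sourceBase, pkg.split('.').map(_.trim).mkString(File.separator))
        if (expectedDir.getCanonicalPath != file.getParentFile.getCanonicalPath) {
          expectedDir.mkdirs()
          Files.move(file.toPath, new File(expectedDir, file.getName).toPath,
            StandardCopyOption.REPLACE_EXISTING)
        }
      }
    }

  def main(args: Array[String]): Unit = fix(new File("src/main/scala"))
}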
Here is an sbt plugin providing a packageStructureToDirectoryStructure task that reads package statements from source files, creates the corresponding directories, and then moves the files into them:
import sbt._
import sbt.Keys._
import better.files._

object PackagesToDirectories extends AutoPlugin {
  object autoImport {
    val packageStructureToDirectoryStructure = taskKey[Unit]("Make directory structure match package structure")
  }
  import autoImport._

  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    packageStructureToDirectoryStructure := {
      val log = streams.value.log
      log.info(s"Refactoring directory structure to match package structure...")

      val sourceFiles = (Compile / sources).value
      val sourceBase = (Compile / scalaSource).value

      def packageStructure(lines: Traversable[String]): String = {
        val packageObjectRegex = """package object\s(.+)\s\{""".r
        val packageNestingRegex = """package\s(.+)\s\{""".r
        val packageRegex = """package\s(.+)""".r

        lines
          .collect {
            case packageObjectRegex(name) => name
            case packageNestingRegex(name) => name
            case packageRegex(name) => name
          }
          .flatMap(_.split('.'))
          .mkString("/")
      }

      sourceFiles.foreach { sourceFile =>
        val packagePath = packageStructure(sourceFile.toScala.lines)
        val destination = file"$sourceBase/$packagePath"
        destination.createDirectoryIfNotExists(createParents = true)
        val result = sourceFile.toScala.moveToDirectory(destination)
        log.info(s"$sourceFile moved to $result")
      }
    }
  )
}
WARNING: Make sure to backup the project before running it.
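If I remember correctly, to use it you drop the file under project/ (e.g. project/PackagesToDirectories.scala) and make better-files available to the build itself, with something like this in project/plugins.sbt (the version number is illustrative):

// project/plugins.sbt - make better-files available to the plugin code above
libraryDependencies += "com.github.pathikrit" %% "better-files" % "3.8.0"

Since the plugin triggers on allRequirements, the task should then be runnable as sbt packageStructureToDirectoryStructure.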

List sbt `Task` tags

Is there a way to list the tags associated with a Task in sbt?
inspect and show don’t seem to have anything there.
The cool and powerful aspect of sbt is that the build definition it generates is a regular Scala application, which means we can inspect its objects, just as we would in any other Scala application, by simply invoking member methods to query their state. Executing sbt starts the REPL for the special build DSL; however, we can drop to a lower level by executing
sbt consoleProject
to start a true Scala REPL:
starts the Scala interpreter with access to your project definition
and to sbt... consoleProject can be useful for creating and
modifying your build in the same way that the Scala interpreter is
normally used to explore writing code. Note that this gives you raw
access to your build.
There exists a public tags method:
final case class Task[T](info: Info[T], work: Action[T]) {
  ...
  def tags: TagMap = info get tagsKey getOrElse TagMap.empty
}
so there must be a way to invoke it (even if there might not be a ready-made top-level command for it, such as inspect). Say we have the following tagged task definition in build.sbt:
lazy val hello = taskKey[Unit]("Vulcan greeting")
hello := Def.task(println("Live long and prosper")).tag(Tags.CPU, Tags.Compile).value
After executing consoleProject, our build definition is imported:
scala> import _root_.scala.xml.{TopScope=>$scope}
import _root_.sbt._
import _root_.sbt.Keys._
import _root_.sbt.nio.Keys._
import _root_.sbt.ScriptedPlugin.autoImport._
import _root_.sbt.plugins.IvyPlugin
import _root_.sbt.plugins.JvmPlugin
import _root_.sbt.plugins.CorePlugin
import _root_.sbt.ScriptedPlugin
import _root_.sbt.plugins.SbtPlugin
import _root_.sbt.plugins.SemanticdbPlugin
import _root_.sbt.plugins.JUnitXmlReportPlugin
import _root_.sbt.plugins.Giter8TemplatePlugin
import $d408b7d79eabe42459a4.root
import currentState._
import extracted._
import cpHelpers._
Now we can make use of Extracted#get to get the TaskKey and explore it like so
scala> extracted.get(hello).tags
res1: sbt.ConcurrentRestrictions.TagMap = Map(Tag(cpu) -> 1, Tag(compile) -> 1)
Furthermore, note the import $d408b7d79eabe42459a4. We can use this object to access regular val/def members. For example, say we had defined the following in build.sbt:
def helloTask = Def.task { println("Live long and prosper") } tag(Tags.CPU, Tags.Compile)
then we could access helloTask like so
scala> $d408b7d79eabe42459a4.helloTask.evaluate(structure.data).tags
res0: sbt.ConcurrentRestrictions.TagMap = Map(Tag(cpu) -> 1, Tag(compile) -> 1)
Both approaches show the required Map(Tag(cpu) -> 1, Tag(compile) -> 1).
Addressing the comment: the compile task does not seem to be tagged, thus
scala> get(Compile/compile).tags
res8: sbt.ConcurrentRestrictions.TagMap = Map()
However, the updateFull task, for example, is indeed tagged:
updateFull := (updateTask tag (Tags.Update, Tags.Network)).value
hence
scala> get(updateFull).tags
res9: sbt.ConcurrentRestrictions.TagMap = Map(Tag(update) -> 1, Tag(network) -> 1)

main class not found in spark scala program

//package com.jsonReader
import play.api.libs.json._
import play.api.libs.json._
import play.api.libs.json.Reads._
import play.api.libs.json.Json.JsValueWrapper
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SQLContext
//import org.apache.spark.implicits._
//import sqlContext.implicits._

object json {

  def flatten(js: JsValue, prefix: String = ""): JsObject = js.as[JsObject].fields.foldLeft(Json.obj()) {
    case (acc, (k, v: JsObject)) => {
      val nk = if (prefix.isEmpty) k else s"$prefix.$k"
      acc.deepMerge(flatten(v, nk))
    }
    case (acc, (k, v: JsArray)) => {
      val nk = if (prefix.isEmpty) k else s"$prefix.$k"
      val arr = flattenArray(v, nk).foldLeft(Json.obj())(_ ++ _)
      acc.deepMerge(arr)
    }
    case (acc, (k, v)) => {
      val nk = if (prefix.isEmpty) k else s"$prefix.$k"
      acc + (nk -> v)
    }
  }

  def flattenArray(a: JsArray, k: String = ""): Seq[JsObject] = {
    flattenSeq(a.value.zipWithIndex.map {
      case (o: JsObject, i: Int) =>
        flatten(o, s"$k[$i]")
      case (o: JsArray, i: Int) =>
        flattenArray(o, s"$k[$i]")
      case a =>
        Json.obj(s"$k[${a._2}]" -> a._1)
    })
  }

  def flattenSeq(s: Seq[Any], b: Seq[JsObject] = Seq()): Seq[JsObject] = {
    s.foldLeft[Seq[JsObject]](b) {
      case (acc, v: JsObject) =>
        acc :+ v
      case (acc, v: Seq[Any]) =>
        flattenSeq(v, acc)
    }
  }

  def main(args: Array[String]) {
    val appName = "Stream example 1"
    val conf = new SparkConf().setAppName(appName).setMaster("local[*]")
    //val spark = new SparkContext(conf)
    val sc = new SparkContext(conf)
    //val sqlContext = new SQLContext(sc)
    val sqlContext = new SQLContext(sc);
    //val spark=sqlContext.sparkSession
    val spark = SparkSession.builder().appName("json Reader")
    val df = sqlContext.read.json("C://Users//ashda//Desktop//test.json")
    val set = df.select($"user", $"status", $"reason", explode($"dates")).show()
    val read = flatten(df)
    read.printSchema()
    df.show()
  }
}
I'm trying to use this code to flatten a highly nested JSON. For this I created a project and converted it to a Maven project. I edited the pom.xml and included the libraries I needed, but when I run the program it says "Error: Could not find or load main class".
I tried converting the code to an sbt project and then running it, but I get the same error. I tried packaging the code and running it through spark-submit, which gives me the same error. Please let me know what I am missing here. I have tried everything I could.
Thanks
Hard to say, but maybe you have many classes that qualify as main, so the build tool does not know which one to choose. Maybe try to clean the project first with sbt clean.
Anyway, in Scala the preferred way to define a main class is to extend the App trait.
object SomeApp extends App
Then the whole object body will become your main method.
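For illustration (the object name and message are made up):

// The object body is the program; no explicit main method is needed.
object SomeApp extends App {
  println("Live long and prosper")
}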
You can also define the main class in your build.sbt. This is necessary if you have many objects that extend the App trait.
mainClass in (Compile, run) := Some("io.example.SomeApp")
I am answering this question for sbt configurations. I ran into the same issue, which I resolved recently, and I made some basic mistakes which I would like you to note:
1. Configure your sbt file
Go to the build.sbt file and check that the Scala version you are using is compatible with Spark. As per version 2.4.0 of Spark (https://spark.apache.org/docs/latest/), the required Scala version is 2.11.x and not 2.12.x. So, even though your IDE (Eclipse/IntelliJ) shows the latest version of Scala or the version you downloaded, change it to a compatible version. Also, include this line of code:
libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.6"
where 2.11.x is your Scala version (a minimal build.sbt sketch is shown after these steps).
2. File Hierarchy
Make sure your Scala files are under src/main/scala only.
3. Terminal
If your IDE allows you to launch a terminal within it, launch it (IntelliJ does; I am not sure about Eclipse or others), OR go to a terminal and change directory to your project directory,
then run:
sbt clean
This will clear any libraries loaded previously and folders created after compilation.
sbt package
This will pack your files into a single jar file under the target/scala-<version>/ directory.
Then submit to Spark:
spark-submit --class "<ClassName>" (in your case, com.jsonReader.json) --jars target/scala-<version>/<jar file> --master local[*] target/scala-<version>/<jar file>
Note here that the master, if specified in the program, isn't required here.
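As mentioned in step 1, a minimal build.sbt for this kind of project could look something like this (all version numbers are illustrative; check the compatibility notes for your Spark release):

// Hypothetical minimal build.sbt; adjust names and versions to your environment.
name := "jsonReader"
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core" % "2.4.0" % "provided",
  "org.apache.spark"  %% "spark-sql"  % "2.4.0" % "provided",
  "com.typesafe.play" %% "play-json"  % "2.6.10"
)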

Scala script in 2.11

I have found example code for Scala runtime scripting in an answer to Generating a class from string and instantiating it in Scala 2.10; however, the code seems to be obsolete for 2.11 - I cannot find any function corresponding to build.setTypeSignature. Even if it worked, the code seems hard to read and follow to me.
How can Scala scripts be compiled and executed in Scala 2.11?
Let us assume I want the following:
define several variables (names and values)
compile script
(optional improvement) change variable values
execute script
For simplicity, consider the following example:
I want to define the following variables (programmatically, from the code, not from the script text):
val a = 1
val s = "String"
I want the following script to be compiled and, on execution, to return the String value "a is 1, s is String":
s"a is $a, s is $s"
What should my functions look like?
def setupVariables() = ???
def compile() = ???
def changeVariables() = ???
def execute() : String = ???
Scala 2.11 adds a JSR-223 scripting engine. It should give you the functionality you are looking for. Just as a reminder, as with all of these sorts of dynamic things, including the example listed in the description above, you will lose type safety. You can see below that the return type is always Object.
Scala REPL Example:
scala> import javax.script.ScriptEngineManager
import javax.script.ScriptEngineManager
scala> val e = new ScriptEngineManager().getEngineByName("scala")
e: javax.script.ScriptEngine = scala.tools.nsc.interpreter.IMain#566776ad
scala> e.put("a", 1)
a: Object = 1
scala> e.put("s", "String")
s: Object = String
scala> e.eval("""s"a is $a, s is $s"""")
res6: Object = a is 1, s is String
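To cover the optional "change variable values" step from the question, the same engine instance can simply be given a new binding and the script evaluated again; continuing the session above, I would expect something like:

scala> e.put("a", 2)
a: Object = 2

scala> e.eval("""s"a is $a, s is $s"""")
res7: Object = a is 2, s is String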
An additional example, as an application running under Scala 2.11.6:
import javax.script.ScriptEngineManager

object EvalTest {
  def main(args: Array[String]) {
    val e = new ScriptEngineManager().getEngineByName("scala")
    e.put("a", 1)
    e.put("s", "String")
    println(e.eval("""s"a is $a, s is $s""""))
  }
}
For this application to work, make sure to include the library dependency:
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value

Scala Presentation Compiler

Hi, I've been trying to get the presentation compiler to work, but I'm getting the following error. Any help regarding this would be appreciated. I've already seen other questions and a few projects where it has been implemented, but everyone uses Global.Run, which is not being recognized in the REPL. The code and the error are below. I've installed Scala 2.10.3 on Windows 8.1.
import scala.tools.nsc.{Global,Settings}
import scala.tools.nsc.reporters._

object Test {
  def main(args: Array[String]) {
    val settings = new Settings;
    val global = new Global(settings, new ConsoleReporter(settings));
    val compiler = global.Run;
  }
}
The error is
Sample.scala:8: error: value Run is not a member of scala.tools.nsc.Global
Try this:
import scala.tools.nsc.{Global,Settings}
import scala.tools.nsc.reporters._

object Test {
  def main(args: Array[String]) {
    val settings = new Settings
    val global = new Global(settings, new ConsoleReporter(settings))
    val compiler = new global.Run
  }
}
Notice new Run instead of Run. There is no companion object for class Run; maybe there was one in earlier Scala versions. Checked on Scala v2.10.3; works in the REPL.
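One more note from my side: when embedding the compiler in a standalone application (as opposed to the REPL), it usually also needs to be told about a classpath; a common trick, as far as I remember, is the usejavacp setting. A hedged sketch based on the code above:

import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.reporters.ConsoleReporter

object Test {
  def main(args: Array[String]) {
    val settings = new Settings
    settings.usejavacp.value = true // reuse the JVM classpath for the compiler
    val global = new Global(settings, new ConsoleReporter(settings))
    val compiler = new global.Run
  }
}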