I'm trying to play with Scala's PlayFramework and am running into an issue with my build.sbt file. Specifically:
Pattern matching in val statements is not supported
which comes from the obvious:
val env = sys.props.getOrElse("ENV", default = "local")
val (someVal, otherVal) = env match {
  case "local" => ("x", "a")
  case _ => // etc
}
Is there a way to use a match statement in the build.sbt file at all? The error says that it's not supported in val statements. Where is it actually supported?
Edit:
I've tried adding a method to a build.scala object as well, but even when I use plain if statements I still get the same "Pattern matching in val statements is not supported" error.
Build.scala:
import sbt._
import Keys._
object ExampleBuild extends Build {
  def getEnvData(env: String) = {
    if (env == "local") {
      ("c", "q")
    } else if (env == "other") {
      ("b", "v")
    } else {
      ("x", "a")
    }
  }
}
And updated build.sbt:
val env = sys.props.getOrElse("ENV", default = "local")
val (someVar, otherVar) = ExampleBuild.getEnvData(env)
But to no avail.
The error is not caused by the match statement but by this:
val (someVar, otherVar) = ...
which is a form of pattern matching (on tuples) not supported by sbt.
Here's a relevant comment from the SbtParser implementation
// Check No val (a,b) = foo or val a,b = foo as these are problematic to range positions and the WHOLE architecture.
You can work around this limitation by using a case class instead of a tuple.
in Build.scala
case class EnvData(someVar: String, otherVar: String)
in build.sbt
val envData = env match {
  case "local" => EnvData("x", "a")
  case _ => // etc
}
and then use it like envData.someVar, envData.otherVar and so on.
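Putting it together, a minimal build.sbt sketch (assuming EnvData lives in a file under project/, e.g. project/EnvData.scala, so that it is visible to build.sbt; the values and the name setting are just placeholders):

val env = sys.props.getOrElse("ENV", default = "local")

val envData = env match {
  case "local" => EnvData("x", "a")
  case "other" => EnvData("b", "v")
  case _       => EnvData("c", "q")
}

name := s"example-${envData.someVar}" // hypothetical setting that consumes the values

Single-name vals like env and envData are fine; only the destructuring forms val (a, b) = ... and val a, b = ... are rejected.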
Related
I have a task that will generate an output file based on several input files as well as some sbt settings. It's basically like a compiler where the result is determined by both input files and settings such as compiler flags. Therefore I need to regenerate the result if either the input files or the settings have changed.
This seems like a very common thing to me, and while I do have a solution that works, it seems ugly and long-winded. Is there a better (simpler and/or shorter) way to do it?
import java.nio.file.Path
import sbt.nio.Keys._
import sbt._
import sbt.Keys._
import sbt.nio.file.FileTreeView
import sjsonnew._

class StuffConfig
object StuffConfig {
  implicit val fmt: JsonFormat[StuffConfig] = ??? // omitted
}

object StuffPlugin extends AutoPlugin {
  object autoImport {
    val generateStuff = taskKey[File]("generate Stuff")
    val stuffConfig = settingKey[StuffConfig]("stuff config")
    val stuffInputFiles = settingKey[Seq[Glob]]("stuff inputs")
    val stuffOutput = settingKey[File]("stuff outputs")
  }

  def doTheThings(cfg: StuffConfig, files: Seq[Path]) = ??? // omitted

  override def projectSettings: Seq[Def.Setting[_]] = {
    import autoImport._
    Seq(
      // register the globs as the task's file inputs so sbt tracks them
      generateStuff / fileInputs := stuffInputFiles.value,
      generateStuff := {
        val store = streams.value.cacheStoreFactory.make("stuffCache")
        // true if any of the registered file inputs changed since the last run
        val filesChanged = generateStuff.inputFileChanges.hasChanges
        val output = stuffOutput.value
        // Tracked.inputChanged caches the previous config (via its JsonFormat)
        // and tells the body whether it differs from the current one
        Tracked.inputChanged[StuffConfig, File](store) { (configChanged, config) =>
          if (configChanged || filesChanged || !output.exists) {
            doTheThings(config, FileTreeView.default.list(stuffInputFiles.value).map(_._1))
          }
          stuffOutput.value
        }.apply(stuffConfig.value)
      }
    )
  }
}
I have the following code:
@mymacro @imports
val _1 = { import scala.collection.mutable.ListBuffer }

@mymacro
val _2 = { val yy: ListBuffer[Int] = ListBuffer.empty }

@mymacro is a Scala macro annotation that checks whether the definition has also been annotated with the @imports annotation. Part of the implementation is as follows:
case (cc @ q"${mods: Modifiers} val $tname: ${tpt: Tree} = ${expr: Tree}") :: Nil =>
  if (tname.toString().startsWith("_"))
    if (checkImports(mods, expr)) {
      q"import scala.collection.mutable.ListBuffer"
    }
    else
      q"{$expr}"
Currently the macro is able to transform the whole val _1 = ... statement to import scala.collection.mutable.ListBuffer (without the {} brackets!) But when the compilation continues, I keep getting the not found: type ListBuffer compilation error. Now I wonder if it is possible to fix this error somehow without having to define the import statement at the top of the file.
I am using the Scala 2.10 macro paradise plugin
I wrote a macro that parses JSON into a matching case class.
def parse(jsonTree: JsValue): BaseType = macro parserImpl
def parserImpl(c: blackbox.Context)(jsonTree: c.Tree) = {
  import c.universe._
  val q"$json" = jsonTree
  val cases = List("X", "Y").map { caseClassName =>
    val caseClass = c.parse(caseClassName)
    val reader = c.parse(s"JSONHelp.${caseClassName}_reads")
    val y = cq"""$caseClassName => (($json \ "config").validate[$caseClass]($reader)).get"""
    println(showCode(y))
    y
  }.toList
  val r =
    q"""
      import play.api.libs.json._
      import JSONHelp._
      println($json)
      ($json \ "type").as[String] match { case ..$cases }
    """
  println(showCode(r))
  r
}
The following is the code it generates (printed by the last println):
{
  import play.api.libs.json._;
  import JSONHelp._;
  println(NodeParser.this.json);
  NodeParser.this.json.\("type").as[String] match {
    case "X" => NodeParser.this.json.\("config").validate[X](JSONHelp.X_reads).get
    case "Y" => NodeParser.this.json.\("config").validate[Y](JSONHelp.Y_reads).get
  }
}
The compilation of the subproject containing the macro definition works fine. But when I compile the project that uses the macro (using sbt 0.13.11 and Scala 2.11.8), I get the following error:
java.lang.NullPointerException
at play.routes.compiler.RoutesCompiler$GeneratedSource$.unapply(RoutesCompiler.scala:37)
at play.sbt.routes.RoutesCompiler$$anonfun$11$$anonfun$apply$2.isDefinedAt(RoutesCompiler.scala:180)
at play.sbt.routes.RoutesCompiler$$anonfun$11$$anonfun$apply$2.isDefinedAt(RoutesCompiler.scala:179)
at scala.Option.collect(Option.scala:250)
at play.sbt.routes.RoutesCompiler$$anonfun$11.apply(RoutesCompiler.scala:179)
at play.sbt.routes.RoutesCompiler$$anonfun$11.apply(RoutesCompiler.scala:178)
I'm not a Play user, but it seems the routes compiler wants tree positions that have a source file:
val routesPositionMapper: Position => Option[Position] = position => {
  position.sourceFile collect {
    case GeneratedSource(generatedSource) => {
It's typical to use atPos(pos)(tree). You might assume the incoming tree.pos for synthetic trees.
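For instance, a sketch of what the end of parserImpl above could look like (an assumption on my part: internal.setPos from the Scala 2.11 reflection API as the public counterpart of the compiler-internal atPos(pos)(tree)):

// give the synthetic tree the position of the incoming tree so that
// downstream tools (like the Play routes compiler above) see a source file
val positioned = internal.setPos(r, jsonTree.pos)
println(showCode(positioned))
positioned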
I'm trying to wrap some blocking calls in a Future. The return type is Seq[User], where User is a case class. The following just wouldn't compile, with complaints about various overloaded versions being present. Any suggestions? I tried almost all the variations of Source.apply without any luck.
// All I want is Seq[User] => Future[Seq[User]]
def findByFirstName(firstName: String) = {
  val users: Seq[User] = userRepository.findByFirstName(firstName)
  val sink = Sink.fold[User, User](null)((_, elem) => elem)
  val src = Source(users) // doesn't compile
  src.runWith(sink)
}
First of all, I assume that you are using version 1.0 of akka-http-experimental, since the API may have changed from previous releases.
The reason your code does not compile is that akka.stream.scaladsl.Source.apply() requires a scala.collection.immutable.Seq, whereas the plain Seq in your code is scala.collection.Seq, the general supertype that also covers mutable sequences.
Therefore you have to convert it to an immutable sequence using the to[T] method.
Document: akka.stream.scaladsl.Source
Additionally, as you can see in the documentation, Source.apply() also accepts a () => Iterator[T], so you can pass () => users.iterator as the argument instead.
Since Sink.fold(...) materializes the final accumulated value, you can give an empty Seq() as the zero element, append each user to the sequence as it passes through, and finally get the whole sequence as the result.
There might be a built-in Sink that collects every element into a Seq directly, but I could not find one.
The following code works.
import akka.actor._
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Source, Sink}
import scala.concurrent.ExecutionContext.Implicits.global

case class User(name: String)

object Main extends App {
  implicit val system = ActorSystem("MyActorSystem")
  implicit val materializer = ActorMaterializer()

  val users = Seq(User("alice"), User("bob"), User("charlie"))

  val sink = Sink.fold[Seq[User], User](Seq())((seq, elem) => {
    println(s"elem => ${elem} \t| seq => ${seq}")
    seq :+ elem
  })

  val src = Source(users.to[scala.collection.immutable.Seq])
  // val src = Source(() => users.iterator) // this also works

  val fut = src.runWith(sink) // Future[Seq[User]]
  fut.onSuccess({
    case x => println(s"result => ${x}")
  })
}
The output of the code above is
elem => User(alice) | seq => List()
elem => User(bob) | seq => List(User(alice))
elem => User(charlie) | seq => List(User(alice), User(bob))
result => List(User(alice), User(bob), User(charlie))
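As an aside on the "better Sink" point above: newer Akka Streams releases (an assumption on my part: 2.x, not the 1.0 experimental API used here) ship a built-in collector, so the fold becomes a one-liner:

val fut = src.runWith(Sink.seq) // materializes a Future[immutable.Seq[User]]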
If you just need a Future[Seq[User]], don't use Akka Streams; plain futures are enough:
import scala.concurrent._
import ExecutionContext.Implicits.global

val session = socialNetwork.createSessionFor("user", credentials)
val f: Future[List[Friend]] = Future {
  session.getFriends()
}
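Adapted to the method from the question, that is essentially (a sketch; userRepository is the blocking repository the question already has):

import scala.concurrent.{ExecutionContext, Future}

def findByFirstName(firstName: String)(implicit ec: ExecutionContext): Future[Seq[User]] =
  Future {
    userRepository.findByFirstName(firstName) // the blocking call now runs on the execution context
  }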
I am working on a small GUI application written in Scala. There are a few settings that the user will set in the GUI and I want them to persist between program executions. Basically I want a scala.collections.mutable.Map that automatically persists to a file when modified.
This seems like it must be a common problem, but I have been unable to find a lightweight solution. How is this problem typically solved?
I do a lot of this, and I use .properties files (it's idiomatic in Java-land). I keep my config pretty straightforward by design, though. If you have nested config constructs you might want a different format like YAML (if humans are the main authors) or JSON or XML (if machines are the authors).
Here's some example code for loading props, manipulating as Scala Map, then saving as .properties again:
import java.io._
import java.util._
import scala.collection.JavaConverters._
val f = new File("test.properties")
// test.properties:
// foo=bar
// baz=123
val props = new Properties
// Note: in real code make sure all these streams are
// closed carefully in try/finally
val fis = new InputStreamReader(new FileInputStream(f), "UTF-8")
props.load(fis)
fis.close()
println(props) // {baz=123, foo=bar}
val map = props.asScala // Get to Scala Map via JavaConverters
map("foo") = "42"
map("quux") = "newvalue"
println(map) // Map(baz -> 123, quux -> newvalue, foo -> 42)
println(props) // {baz=123, quux=newvalue, foo=42}
val fos = new OutputStreamWriter(new FileOutputStream(f), "UTF-8")
props.store(fos, "")
fos.close()
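On the note in the comment above about closing streams, a small sketch of the load step with the reader closed in try/finally:

def loadProps(f: File): Properties = {
  val p = new Properties
  val in = new InputStreamReader(new FileInputStream(f), "UTF-8")
  try p.load(in) finally in.close()
  p
}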
Here's an example of using XML and a case class for reading a config. A real class can be nicer than a map. (You could also do what sbt and at least one project do: take the config as Scala source and compile it in; saving it is less automatic. Or treat it as a REPL script. I haven't googled it, but someone must have done that.)
Here's the simpler code.
This version uses a case class:
case class PluginDescription(name: String, classname: String) {
  def toXML: Node = {
    <plugin>
      <name>{name}</name>
      <classname>{classname}</classname>
    </plugin>
  }
}

object PluginDescription {
  def fromXML(xml: Node): PluginDescription = {
    // extract one field
    def getField(field: String): Option[String] = {
      val text = (xml \\ field).text.trim
      if (text == "") None else Some(text)
    }
    def extracted = {
      val name = "name"
      val claas = "classname"
      val vs = Map(name -> getField(name), claas -> getField(claas))
      if (vs.values exists (_.isEmpty)) fail()
      else PluginDescription(name = vs(name).get, classname = vs(claas).get)
    }
    def fail() = throw new RuntimeException("Bad plugin descriptor.")
    // check the top-level tag
    xml match {
      case <plugin>{_*}</plugin> => extracted
      case _ => fail()
    }
  }
}
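A quick round trip with the class above, using scala.xml.XML for the file I/O (a sketch; the file name and values are made up):

import scala.xml.XML

val pd = PluginDescription("Greeter", "com.example.Greeter")
XML.save("plugin.xml", pd.toXML)                                     // write the descriptor
val restored = PluginDescription.fromXML(XML.loadFile("plugin.xml")) // read it back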
This code reflectively calls the apply of a case class. The use case is that fields missing from config can be supplied by default args. No type conversions here. E.g., case class Config(foo: String = "bar").
// isn't it easier to write a quick loop to reflect the field names?
import scala.reflect.runtime.{currentMirror => cm, universe => ru}
import ru._

def fromXML(xml: Node): Option[PluginDescription] = {
  def extract[A]()(implicit tt: TypeTag[A]): Option[A] = {
    // extract one field
    def getField(field: String): Option[String] = {
      val text = (xml \\ field).text.trim
      if (text == "") None else Some(text)
    }
    val apply = ru.newTermName("apply")
    val module = ru.typeOf[A].typeSymbol.companionSymbol.asModule
    val ts = module.moduleClass.typeSignature
    val m = (ts member apply).asMethod
    val im = cm reflect (cm reflectModule module).instance
    val mm = im reflectMethod m
    def getDefault(i: Int): Option[Any] = {
      val n = ru.newTermName("apply$default$" + (i + 1))
      val m = ts member n
      if (m == NoSymbol) None
      else Some((im reflectMethod m.asMethod)())
    }
    def extractArgs(pss: List[List[Symbol]]): List[Option[Any]] =
      pss.flatten.zipWithIndex map (p => getField(p._1.name.encoded) orElse getDefault(p._2))
    val args = extractArgs(m.paramss)
    if (args exists (!_.isDefined)) None
    else Some(mm(args.flatten: _*).asInstanceOf[A])
  }
  // check the top-level tag
  xml match {
    case <plugin>{_*}</plugin> => extract[PluginDescription]()
    case _ => None
  }
}
XML has loadFile and save, it's too bad there seems to be no one-liner for Properties.
$ scala
Welcome to Scala version 2.10.0-RC5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_06).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import reflect.io._
import reflect.io._
scala> import java.util._
import java.util._
scala> import java.io.{StringReader, File=>JFile}
import java.io.{StringReader, File=>JFile}
scala> import scala.collection.JavaConverters._
import scala.collection.JavaConverters._
scala> val p = new Properties
p: java.util.Properties = {}
scala> p load new StringReader(
| (new File(new JFile("t.properties"))).slurp)
scala> p.asScala
res2: scala.collection.mutable.Map[String,String] = Map(foo -> bar)
As it all boils down to serializing a map / object to a file, your choices are:
classic Java serialization to bytecode (see the sketch after this list)
serialization to XML
serialization to JSON (easy using Jackson, or Lift-JSON)
use of a properties file (ugly, no UTF-8 support)
serialization to a proprietary format (ugly, why reinvent the wheel)
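For the first option, a minimal sketch using plain JDK serialization (assuming the keys and values in your settings map are themselves serializable, as Strings are):

import java.io._
import scala.collection.mutable

val settings = mutable.Map("theme" -> "dark", "fontSize" -> "12")

// write the map to a file (mutable.HashMap is Serializable)
val out = new ObjectOutputStream(new FileOutputStream("settings.bin"))
try out.writeObject(settings) finally out.close()

// read it back
val in = new ObjectInputStream(new FileInputStream("settings.bin"))
val restored =
  try in.readObject().asInstanceOf[mutable.Map[String, String]] finally in.close()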
I suggest converting the Map to Properties and vice versa. "*.properties" files are the standard way of storing configuration in the Java world, so why not use them for Scala as well?
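A sketch of that conversion (the earlier .properties answer does the same through a live asScala view; these helpers just make the round trip explicit):

import java.util.Properties
import scala.collection.JavaConverters._

def toProperties(m: Map[String, String]): Properties = {
  val p = new Properties
  m.foreach { case (k, v) => p.setProperty(k, v) }
  p
}

def toMap(p: Properties): Map[String, String] = p.asScala.toMap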
The common ways are *.properties and *.xml files; since Scala supports XML natively, using an XML config is easier than it is in Java.