I am trying to convert the tutorial here http://atnos-org.github.io/eff/org.atnos.site.Introduction.html into a running Scala program inside IntelliJ IDEA. The code runs in the command-line REPL, but not in the IDE.
I simply copied and pasted all the code into one file and added an object Intro extends App.
Here is the code:
class Tutorial{
}
object Intro extends App {
import cats._
import cats.data._
import org.atnos.eff._
type ReaderInt[A] = Reader[Int, A]
type WriterString[A] = Writer[String, A]
type Stack = Fx.fx3[WriterString, ReaderInt, Eval]
import org.atnos.eff.all._
import org.atnos.eff.syntax.all._
// useful type aliases showing that the ReaderInt and the WriterString effects are "members" of R
// note that R could have more effects
type _readerInt[R] = ReaderInt |= R
type _writerString[R] = WriterString |= R
def program[R: _readerInt : _writerString : _eval]: Eff[R, Int] = for {
// get the configuration
n <- ask[R, Int]
// log the current configuration value
_ <- tell("the required power is " + n)
// compute the nth power of 2
a <- delay(math.pow(2, n.toDouble).toInt)
// log the result
_ <- tell("the result is " + a)
} yield a
println(program[Stack].runReader(6).runWriter.runEval.run)
}
The reported error is Cannot resolve symbol run on the last line.
Here is my build.sbt file, following the instructions for the library:
name := "Tutorial"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.typelevel" %% "cats" % "0.9.0"
libraryDependencies += "org.atnos" %% "eff" % "3.1.0"
// to write types like Reader[String, ?]
addCompilerPlugin("org.spire-math" %% "kind-projector" % "0.9.3")
// to get types like Reader[String, ?] (with more than one type parameter) correctly inferred
// this plugin is not necessary with Scala 2.12
addCompilerPlugin("com.milessabin" % "si2712fix-plugin_2.11.8" % "1.2.0")
Edmund Noble answered my question here: https://github.com/atnos-org/eff/issues/80#issuecomment-287667353
The reason why IntelliJ cannot figure out your code in particular looks to be because IntelliJ is not capable of simulating implicit-directed type inference; the Member implicits have a type member, Out, inside them which represents the remaining effect stack. If the IDE cannot figure it out, it subs in a fresh type variable and thus the run ops constructor cannot be called because the inferred type according to IntelliJ is Eff[m.Out, A] and not Eff[NoFx, A].
FIX: I was able to compile the file separately and then run it, even though the error is still highlighted in the IDE.
Unless there is some feature in IntelliJ IDEA that I haven't enabled, this looks like a limitation and/or bug in IntelliJ IDEA.
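One thing that sometimes helps with IDE-only inference failures is to ascribe the expected types explicitly, so the presentation compiler has less to infer. Here is a sketch of that idea (the result type assumes eff 3.1.0, where runWriter pairs the value with the log); it may not remove the highlight, since the failure is in IntelliJ's implicit resolution, but it at least documents what scalac actually infers:
// same as the println above, with the fully-run type spelled out:
// runReader removes ReaderInt, runWriter removes WriterString (pairing
// the value with the log), runEval removes Eval, leaving Eff[NoFx, ...]
val result: (Int, List[String]) =
  program[Stack].runReader(6).runWriter.runEval.run
println(result)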
Related
I'm trying to follow this example https://github.com/sgodbillon/reactivemongo-demo-app and then implement it in my project,
but I ran into many difficulties because I'm using "org.reactivemongo" %% "play2-reactivemongo" % "0.11.9", not the same version used in the tutorial. I solved some problems with different imports, but this one has me completely blocked:
type arguments [play.api.libs.json.JsObject,Articles.this.JSONReadFile] do not conform to method find's type parameter bounds [S,T <: reactivemongo.api.gridfs.ReadFile[reactivemongo.play.json.JSONSerializationPack.type, _]](line 97)
result <- maybeArticle.map { article =>
97 gridFS.find[JsObject, JSONReadFile](
98 Json.obj("article" -> article.id.get)).collect[List]().map { files =>
99 val filesWithId = files.map { file =>
Any help is appreciated!
The sbt task documentation shows an example of task dependencies. It is very simple and artificial, but it works! So I reproduced it in my project/build.scala without a problem.
Note that I chose the global scope to make the tasks available for any project and any configuration:
import sbt._
import Keys._
object TestBuild extends Build {
lazy val sampleTask = taskKey[Int]("A sample task")
lazy val intTask = taskKey[Int]("An int task")
override lazy val settings = super.settings ++ Seq(
intTask := 1 + 2 ,
sampleTask := intTask.value + 1
)
}
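For reference, intTask evaluates to 1 + 2 = 3, so sampleTask evaluates to 4; running show in the sbt shell should print something like:
> show sampleTask
[info] 4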
Now I'm trying to do something useful: enrich the existing sbt key definitions with a task that collects compiled class names.
import sbt._
import Keys._
import sbt.inc.Analysis
import xsbti.api.ClassLike
import xsbt.api.Discovery.{isConcrete, isPublic}
object TestBuild extends Build {
lazy val debugAPIs = taskKey[List[String]]("list of all top-level definitions")
override lazy val settings = super.settings ++ Seq(
debugAPIs := getAllTop( compile.value )
)
private def getAllTop(analysis : Analysis) : List[String] =
Tests.allDefs(analysis).toList collect {
case c : ClassLike if isConcrete(c) && isPublic(c) => c.name
}
}
Now I get an error from sbt:
Reference to undefined setting:
{.}/*:compile from {.}/*:debugAPIs (/home/sbt/project/build.scala:11)
So I have two questions:
How should I define debugAPIs properly so that the task is available for all projects and all configurations?
How can I reproduce this error in a synthetic configuration?
I'm actually more interested in the second question. I'm looking for a deep understanding of how sbt works, because I'd like to write a plugin for it.
The problem is that you are trying to access a key's value without a proper Scope.
The documentation gives us a hint here:
By default, all the keys associated with compiling, packaging, and
running are scoped to a configuration and therefore may work
differently in each configuration. The most obvious examples are the
task keys compile, package, and run; but all the keys which affect
those keys (such as source-directories or scalac-options or
full-classpath) are also scoped to the configuration.
Let's first focus on a very simple example, which maybe doesn't make much sense but illustrates the problem. Let's assume that you want to redefine the compile task in terms of itself.
override lazy val settings = super.settings ++ Seq (
compile := { compile.value }
)
Running this in SBT will give you an error that looks more or less like this:
[error] {.}/*:compile from {.}/*:compile (/tmp/q-23723818/project/Build.scala:12)
[error] Did you mean compile:compile ?
We didn't specify the scope, so SBT picked some defaults: the project axis was set to ThisBuild (meaning no specific project) and the configuration to Global. The setting was undefined in that context. It's important to understand that a key is not a setting: a key can exist without a scope, but the value of a key is always attached to a scope. Note also that if SBT doesn't find a value in the requested scope, it can delegate to other scopes, but that is another topic.
How can we check this? It turns out to be quite simple. Let's ignore the error and let SBT start.
If you type inspect compile, you'll see that inspect looks in compile:compile, where the value is defined. We can force it to look in a specific scope; e.g. inspect {.}/*:compile will look in the scope that gave us the error.
> inspect {.}/*:compile
[info] No entry for key.
Indeed it's undefined.
How do we solve the issue? You have to give SBT the scope you're looking for. Naively, you could try to add a configuration scope:
// this will NOT work
override lazy val settings = super.settings ++ Seq (
compile in Compile := { (compile in Compile).value }
)
But there is no global compile; there is only a compile per project. You can overcome the issue by overriding not the global settings but the settings of a specific project, specifying the Compile configuration there:
lazy val root = project.in(file(".")).settings(Seq(
compile in Compile := {(compile in Compile).value}
): _*)
This would work, but what if you want to get the compile value regardless of where it is defined? This is where ScopeFilter comes in handy. Back to your original example: I assume you want to get compile's Analysis object from all the projects.
import sbt._
import Keys._
import sbt.inc.Analysis
import xsbti.api.ClassLike
import xsbt.api.Discovery.{isConcrete, isPublic}
object TestBuild extends Build {
val debugAPIs = taskKey[Seq[String]]("list of all top-level definitions")
val compileInAnyProject = ScopeFilter(inAnyProject, inConfigurations(Compile))
override lazy val settings = super.settings ++ Seq(
debugAPIs := {
getAllTop(compile.all(compileInAnyProject).value)
}
)
private def getAllTop(analyses : Seq[Analysis]) : Seq[String] =
analyses.flatMap { analysis =>
Tests.allDefs(analysis) collect { case c : ClassLike if isConcrete(c) && isPublic(c) => c.name }
}
}
What we created is a ScopeFilter matching any project, and within those projects the Compile configuration. Then we looked up all the compile values.
You can configure the ScopeFilter to match your needs, filtering only for specific projects/configurations or even tasks. But the key to understanding the problem is to remember that in SBT, settings are always scoped.
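For example, a narrower filter might look like this (a sketch, assuming the build defines projects named core and util):
lazy val core = project.in(file("core"))
lazy val util = project.in(file("util"))
// match only the Test configuration of those two projects
val testInCoreAndUtil = ScopeFilter(inProjects(core, util), inConfigurations(Test))
// e.g. aggregate the test sources of both projects into a single task
lazy val allTestSources = taskKey[Seq[File]]("test sources of core and util")
allTestSources := sources.all(testInCoreAndUtil).value.flatten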
Edit
You asked how it is that compile is not defined globally yet is available in every project. This is because Defaults.defaultSettings defines it, and each project includes those defaults. If you removed super.settings from your Build definition, you'd see that compile, among others, becomes undefined.
As for whether you should do it this way: overriding settings in your plugin is in general discouraged by Plugin Best Practices. I recommend that you read it, together with the Plugins chapter; they should give you an idea of how to proceed.
You can also get multiple values from multiple scopes by defining a new task that returns them. For example, to pair each Analysis with its project, you could use the following piece of code:
object TestBuild extends Build {
val debugAPIs = taskKey[Seq[(String, String)]]("list of all top-level definitions")
val compileInAnyProject = ScopeFilter(inAnyProject, inConfigurations(Compile))
override lazy val settings = super.settings ++ Seq(
debugAPIs := {
getAllTop(analysisWithProject.all(compileInAnyProject).value)
}
)
lazy val analysisWithProject = Def.task { (thisProject.value, compile.value) }
private def getAllTop(analyses : Seq[(ResolvedProject, Analysis)]) : Seq[(String, String)] =
analyses.flatMap { case (project, analysis) =>
Tests.allDefs(analysis) collect { case c : ClassLike if isConcrete(c) && isPublic(c) => (project.id, c.name) }
}
}
How can you redefine a key in SBT (as opposed to extend or define)?
I currently have the following in my build script (project/build.scala):
fullClasspath in Runtime <<= (fullClasspath in Runtime, classDirectory in Compile) map { (cp, classes) => (cp.files map {
f: File =>
if (f.getName == classes.getName) {
val result = new File(f.getParent + File.separator + "transformed-" + f.getName)
if (result.exists) result else f
} else f
}).classpath }
It extends the classpath in Runtime by adding, for each directory in Compile, a new directory with the same name but with transformed- prepended to the front.
(If you are wondering why, I have a plugin which performs program transformation on the bytecode after compilation but before packaging, and selective recompilation gets very confused if you overwrite the original files.)
My problem is the following: This extends the original key, and therefore the classpath contains the original directories from Compile, plus the renamed copies, but I only want the renamed ones from Compile.
I tried to do something along the lines of
fullClasspath in Runtime := ...
but I don't know what to put on the right-hand side.
I've marked the answer since it led me directly to the solution, but my final solution was to modify the above code snippet into the following:
fullClasspath in Runtime := (fullClasspath in Runtime).value.files.map {
f: File =>
if (f.getName == (classDirectory in Compile).value.getName) {
val result = new File(f.getParent + File.separator + "transformed-" + f.getName)
if (result.exists) result else f
} else f
}.classpath
which does exactly what I wanted, and is slightly better style.
Here's a little experiment I did just now at the sbt command line showing that it's definitely possible to remove something unwanted from fullClasspath:
% sbt
> show runtime:fullClasspath
[info] List(Attributed(.../target/classes),
Attributed(.../jars/scala-library-2.10.4.jar),
Attributed(.../jars/asm-all-3.3.1.jar))
> set fullClasspath in Runtime := (fullClasspath in Runtime).value.files.filterNot(_.getName.containsSlice("asm")).classpath
> show runtime:fullClasspath
[info] List(Attributed(.../target/classes),
Attributed(.../jars/scala-library-2.10.4.jar))
Voila — the "asm-all-3.3.1" entry is gone.
Note that this isn't about <<= vs :=. The latter is just macro-based sugar for the former. The experiment turns out the same if I avoid := and do this instead:
set fullClasspath in Runtime <<= (fullClasspath in Runtime) map
{_.files.filterNot(_.getName.containsSlice("asm")).classpath}
Quasiquotes simplify many things when writing macros in Scala. However, I noticed that macros containing quasiquotes can be recompiled every time a compilation in SBT is triggered, even though neither the macro implementation nor any of its call sites have changed and need recompilation.
This doesn't seem to happen if the code in the quasiquotes is fairly simple; it seems to happen only if there's a dependency on another class. I noticed that rewriting everything with reify seems to solve the recompilation problem, but I can't manage to rewrite the last part without quasiquotes.
My macro avoids reflection on startup by creating wrapper functions during compilation.
I have the following classes:
object ExportedFunction {
def apply[R: Manifest](f: Function0[R], fd: FunctionDescription): ExportedExcelFunction = new ExcelFunction0[R] {
def apply: R = f()
val functionDescription = fd
}
def apply[T1: Manifest, R: Manifest](f: Function1[T1, R], fd: FunctionDescription): ExportedExcelFunction = new ExcelFunction1[T1, R] {
def apply(t1: T1): R = f(t1)
val functionDescription = fd
}
... and so on... until Function17...
}
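For orientation, what the macro generates for a one-argument method boils down to a call like this (a sketch; someDescription stands in for the FunctionDescription the macro builds from its reflective lookups):
// hypothetical hand-written equivalent of the generated code
def square(x: Int): Int = x * x
val exported: ExportedExcelFunction =
  ExportedFunction(square _, someDescription) // resolves to the Function1 overload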
I then analyze an object and export any member function using the described interface like so:
def export(registrar: FunctionRegistrar,
root: Object,
<...more args...>) = macro exportImpl
def exportImpl(c: Context)(registrar: c.Expr[FunctionRegistrar],
root: c.Expr[Object],
<...>): c.Expr[Any] = {
import c.universe._
<... the following is simplified ...>
root.typeSignature.members.flatMap {
case x if x.isMethod =>
val method = x.asMethod
val callee = c.Expr(method)
val desc = q"""FunctionDescription(<...result from reflective lookup...>)"""
val export = q"ExportedFunction($callee _, $desc)"
q"$registrar.+=({$export})"
I can rewrite the first and last lines with reify, but I can't manage to rewrite the second line; my best shot still uses a quasiquote:
val export = reify {
...
ExportedFunction(c.Expr(q"""$callee _"""), desc)
...
}.tree
But this results in:
overloaded method value apply with alternatives... cannot be applied to (c.Expr[Nothing], c.universe.Expr[FunctionDescription])
I think the compiler is missing the implicits, or maybe this code can only work for a function with a fixed number of arguments, since it would need to know at macro compile time how many arguments the method has. However, it all works if everything is written using quasiquotes.
From the description of the SBT problem I can assume that you're using macro paradise for 2.10.x and are facing https://github.com/scalamacros/paradise/issues/11. I was planning to address this issue this week, so the fix should arrive really soon. In the meantime you can use the workaround described on the issue page.
As for the reify problem, not all quasiquotes can be rewritten using reify. Limitations such as the one you have faced here were a very strong motivator towards shifting our focus to quasiquotes in Scala 2.11.
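To make that limitation concrete, here is a minimal sketch (assuming a plain 2.10-style def macro): reify can only splice statically typed Exprs, so overload resolution inside its body happens when the macro itself is compiled, whereas a quasiquote splices bare Trees whose types are only known while the macro runs.
import scala.language.experimental.macros
import scala.reflect.macros.Context

object IncMacro {
  def inc(x: Int): Int = macro incImpl

  def incImpl(c: Context)(x: c.Expr[Int]): c.Expr[Int] = {
    import c.universe._
    // Fine: x is a typed Expr[Int], so the body type-checks statically.
    reify { x.splice + 1 }
    // Not expressible with reify: a bare Tree (like callee above) has no
    // static type, so the compiler cannot pick an ExportedFunction.apply
    // overload for it; q"ExportedFunction($callee _, $desc)" has no analogue.
  }
}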
For the record, these SBT settings (upgrading to a newer version) fixed it:
...
settings = Seq(
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value,
libraryDependencies += "org.scalamacros" % "quasiquotes" % "2.0.0-M3" cross CrossVersion.full,
autoCompilerPlugins := true,
addCompilerPlugin("org.scalamacros" % "paradise" % "2.0.0-M3" cross CrossVersion.full)
)
I have the following example build.sbt that uses sbt-assembly. (My assembly.sbt and project/assembly.sbt are set up as described in the readme.)
import AssemblyKeys._
organization := "com.example"
name := "hello-sbt"
version := "1.0"
scalaVersion := "2.10.3"
val hello = taskKey[Unit]("Prints hello")
hello := println(s"hello, ${assembly.value.getName}")
val hey = taskKey[Unit]("Prints hey")
hey <<= assembly map { (asm) => println(s"hey, ${asm.getName}") }
//val hi = taskKey[Unit]("Prints hi")
//hi <<= assembly { (asm) => println(s"hi, $asm") }
Both hello and hey are functionally equivalent, and when I run either task from sbt, they run assembly first and print a message with the same filename. Is there a meaningful difference between the two? (It seems like the definition of hello is "slightly magical", since the dependency on assembly is only implied there, not explicit.)
Lastly, I'm trying to understand why hey needs the map call. Obviously it results in a different object getting passed into asm, but I'm not quite sure how to fix this type error in the definition of hi:
sbt-hello/build.sbt:21: error: type mismatch;
found : Unit
required: sbt.Task[Unit]
hi <<= assembly { (asm) => println(s"hi, $asm") }
^
[error] Type error in expression
It looks like assembly here is an sbt.TaskKey[java.io.File], but I don't see a map method defined there, so I can't quite figure out what's happening in the type of hey above.
sbt 0.12 syntax vs sbt 0.13 syntax
Is there a meaningful difference between the two?
By meaningful difference, if you mean semantic difference as in observable difference in the behavior of the compiled code, they are the same.
If you mean any intended difference in the code, it's about the style difference between sbt 0.12 syntax and sbt 0.13 syntax. Conceptually, I think the sbt 0.13 syntax makes it easier to learn and code, since you deal with T directly instead of Initialize[T]. Using a macro, sbt 0.13 expands x.value into the sbt 0.12 equivalent.
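Side by side, using the two tasks from the question:
hey <<= assembly map { asm => println(s"hey, ${asm.getName}") } // sbt 0.12 style
hello := println(s"hello, ${assembly.value.getName}")           // sbt 0.13 style; .value hides the map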
anatomy of <<=
I'm trying to understand why hey needs the map call.
That's actually one of the differences the macro is now able to handle automatically.
To understand why map is needed in the sbt 0.12 style, you need to understand the type of an sbt DSL expression, which is Setting[_]. As the Getting Started guide puts it:
Instead, the build definition creates a huge list of objects with type Setting[T] where T is the type of the value in the map. A Setting describes a transformation to the map, such as adding a new key-value pair or appending to an existing value.
For tasks, the type of the DSL expression is Setting[Task[T]]. To turn a setting key into a Setting[T], or a task key into a Setting[Task[T]], you use the <<= method defined on the respective keys. This is implemented in Structure.scala (the sbt 0.12 code base has a simpler implementation of <<=, so I'll be using that as the reference):
sealed trait SettingKey[T] extends ScopedTaskable[T] with KeyedInitialize[T] with Scoped.ScopingSetting[SettingKey[T]] with Scoped.DefinableSetting[T] with Scoped.ListSetting[T, Id] { ... }
sealed trait TaskKey[T] extends ScopedTaskable[T] with KeyedInitialize[Task[T]] with Scoped.ScopingSetting[TaskKey[T]] with Scoped.ListSetting[T, Task] with Scoped.DefinableTask[T] { ... }
object Scoped {
sealed trait DefinableSetting[T] {
final def <<= (app: Initialize[T]): Setting[T] = setting(scopedKey, app)
...
}
sealed trait DefinableTask[T] { self: TaskKey[T] =>
def <<= (app: Initialize[Task[T]]): Setting[Task[T]] = Project.setting(scopedKey, app)
...
}
}
Note the types of the app parameters. The setting key's <<= takes an Initialize[T], whereas the task key's <<= takes an Initialize[Task[T]]. In other words, depending on the type of the lhs of an <<= expression, the type of the rhs changes. This requires sbt 0.12 users to be aware of the setting/task difference in the keys.
Suppose you have a setting key like description on the lhs, and suppose you want to depend on the name setting to create a description. To create a setting dependency expression you use apply:
description <<= name { n => n + " is good." }
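For comparison, in sbt 0.13 the macro writes this plumbing for you, and the equivalent is simply:
description := name.value + " is good."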
apply for a single key is implemented in Settings.scala:
sealed trait Keyed[S, T] extends Initialize[T]
{
def transform: S => T
final def apply[Z](g: T => Z): Initialize[Z] = new GetValue(scopedKey, g compose transform)
}
trait KeyedInitialize[T] extends Keyed[T, T] {
final val transform = idFun[T]
}
Next, instead of description, suppose you want to create a setting for jarName in assembly. This is a task key, so the rhs of <<= takes an Initialize[Task[T]], and apply won't do. This is where map comes in:
jarName in assembly <<= name map { n => n + ".jar" }
This is implemented in Structure.scala as well:
final class RichInitialize[S](init: Initialize[S]) {
def map[T](f: S => T): Initialize[Task[T]] = init(s => mktask(f(s)) )
}
Because a setting key extends KeyedInitialize[T], which is Initialize[T], and because there's an implicit conversion from Initialize[T] to RichInitialize[T], the above is available to name. This is an odd way of defining map, since map usually preserves the structure.
It might make more sense if you look at the similar enrichment class for task keys:
final class RichInitializeTask[S](i: Initialize[Task[S]]) extends RichInitTaskBase[S, Task] {...}
sealed abstract class RichInitTaskBase[S, R[_]] {
def map[T](f: S => T): Initialize[R[T]] = mapR(f compose successM)
}
So for tasks, map maps from a task of type S to T. For settings, we can think of it as: map is not defined on a setting, so it implicitly converts itself into a task and maps that. In any case, this lets sbt 0.12 users think: use apply for settings, map for tasks. Note that apply never goes away for task keys, since they extend Keyed[Task[T], Task[T]]: in hi, the function is passed to apply, so asm is a Task[File] and the body would have to produce a Task[Unit], but println returns Unit. This should explain:
sbt-hello/build.sbt:21: error: type mismatch;
found : Unit
required: sbt.Task[Unit]
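Concretely, the commented-out hi from the question type-checks once map is used, which makes it identical in shape to hey:
hi <<= assembly map { asm => println(s"hi, ${asm.getName}") }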
Then there's the tuple issue. So far I've discussed dependencies on a single setting. If you want to depend on more, sbt implicitly adds apply and map to Tuple2..N to handle it. Now it's expanded to 15, but it used to go up only to Tuple9. Seen from a new user's point of view, the idea of invoking map on a Tuple9 of settings so that it generates a task-like Initialize[Task[T]] would appear alien. Without changing the underlying mechanism, sbt 0.13 provides a much cleaner surface to get started with.
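For illustration, here is a dependency on two settings at once in both styles (a sketch reusing the jarName in assembly key from above):
// sbt 0.12: apply and map are implicitly added to the tuple of keys
jarName in assembly <<= (name, version) map { (n, v) => n + "-" + v + ".jar" }
// sbt 0.13: the .value macro makes the tuple machinery invisible
jarName in assembly := name.value + "-" + version.value + ".jar"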