I am trying to generate some boilerplate with SBT (a tool which is totally new to me). I am using the shapeless sbt files as my main reference for the task. I have seen that this project generates code from scratch, but my case is slightly different, since I would like to generate some classes from existing ones. I intend to use the new Scala 2.10.0-M4 reflection capabilities for doing so. What basic configuration is needed to have reflection available from an SBT build?
Right now, sbt is unable to find the scala.reflect.runtime.universe package, and I do not know whether the problem comes from the new Scala jar division or from a bad configuration. Besides, my sbt about says:
[info] This is sbt 0.13.0-20120530-052139
[info] The current project is {file:/home/jlg/sandbox/abc/}abc
[info] The current project is built against Scala 2.10.0-SNAPSHOT
[info]
[info] sbt, sbt plugins, and build definitions are using Scala 2.9.2
By the way, does anybody know of other projects using SBT to generate source code?
Current SBT releases are based on Scala 2.9, and source code generation runs within SBT, against the same libraries. There are basically two choices:
be extremely bleeding-edge: get an SBT release running on Scala 2.10 (not even the 0.13 branch does), or wait for one. The biggest problem is not just that you'd have to recompile SBT yourself, it's recompiling every single SBT plugin you'll need for Scala 2.10. In the long term, this is maybe the best strategy to do what you ask, but it might be a lot of effort for now. However, beware that you cannot use reflection on your compiled code without evil tricks, since code generation is supposed to happen before compilation. If you need to do that, consider instead generating code at compile time within the program using macros (a minimal macro sketch follows at the end of this answer). This excludes SBT and is much more standard, but I'm not sure you can generate complete classes in this release (that is, I think, planned for the future).
go with the old: stick with Scala 2.9 and use scalap's capabilities (ScalaSigParser) for compile-time reflection. The problem is that the API is different (not sure how deeply) and not really supported for public use, although various people have been using it for ages. For a project I'm running, a colleague implemented this approach and I integrated it within SBT for my project (https://github.com/ps-mr/LinqOnSteroids/); on top of that, I use Scalate to write the templates used for code generation, which is quite powerful.
See in particular build.sbt, which invokes
project/Generator.scala and project/src/main/scala/ivm/generation/ScalaSigHelpers.scala (some non-fully-generic wrappers for ScalaSigParser). Scalate templates for the generated code are in
src/main/resources; the most relevant here is src/main/resources/WrappedClassInlined.ssp.
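For orientation, the generic sbt hook for generated sources looks roughly like this (a minimal sketch in sbt 0.12-style syntax; the file name and its contents are purely illustrative, and the linked project wires things up through project/Generator.scala instead):

// build.sbt -- sketch of registering a source generator
sourceGenerators in Compile <+= (sourceManaged in Compile) map { outDir =>
  val out = outDir / "generated" / "Generated.scala"
  // write whatever code your generator produces; here just a stub object
  IO.write(out, "package generated\n\nobject Generated { val answer = 42 }\n")
  Seq(out)
}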
Even more stuff is involved; I fear you'll practically need to check it out and play with it to see what it does exactly, but feel free to ask questions.
Please note that the code is protected by a BSD license, so you need to keep the original copyright if you copy the code.
Note: all the links (except the license) are to the current HEAD for stability, so that they won't disappear so easily even if the files are moved/removed in future versions.
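Regarding the macro route mentioned in the first option, a minimal 2.10-style def macro looks roughly like this (only a sketch; it expands a call into inline code at compile time rather than producing new top-level classes, and the macro must live in a separate compilation unit from its call sites):

import scala.language.experimental.macros
import scala.reflect.macros.Context

object HelloMacro {
  // calls to hello() are rewritten at compile time into the println below
  def hello(): Unit = macro helloImpl

  def helloImpl(c: Context)(): c.Expr[Unit] = {
    import c.universe._
    reify { println("hello from a macro") }
  }
}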
If you're using 2.10.0-SNAPSHOT, then you should go for scala.reflect.runtime.universe. Take a look at http://dcsobral.blogspot.ch/2012/07/json-serialization-with-reflection-in.html for more information.
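Note that since 2.10 the reflection API lives in a separate scala-reflect jar, which is likely why sbt cannot find scala.reflect.runtime.universe. Something along these lines adds the dependency (a sketch; adapt to your sbt version):

// build.sbt -- scala-reflect is no longer part of scala-library
libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-reflect" % _)

With that on the classpath, runtime reflection is available, for example:

import scala.reflect.runtime.universe._

case class Person(name: String, age: Int)

object ReflectDemo extends App {
  // print every member of Person together with its type signature
  typeOf[Person].members.foreach { m =>
    println(m.name + " : " + m.typeSignature)
  }
}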
Related
So I'm learning Scala and using the IntelliJ IDE to make my projects. When I click on New Project and then Scala I get the choice of
Scala
SBT
When I hover the mouse over them I get additional info:
Sample module with attached Scala SDK
SBT based Scala Project
Now, I played around with SBT before I downloaded IntelliJ and used it to compile and run some Scala code, so I kind of know what it is.
But I just don't know which one I should be choosing, 1 or 2, and why someone would choose 1 over 2 or vice versa.
Use the SBT project. This will lead to IntelliJ doing the auto-build using SBT without you having to rework the build each time.
The first time IntelliJ runs sbt it will take some time to set itself up, but eventually it will be far more rewarding.
Also, as mentioned by Boris, for portability you would want a standard build/compile tool.
I don't think there is a single right answer to your problem. If you just want to try out Scala and the Scala SDK, the first choice is fine because you don't have any need for an automated build. In my opinion this is your choice if you want to play around a bit without any overhead.
If you want to do a more serious project, I suggest using sbt, because it will build your project and manage your dependencies. This makes your project more flexible and easier for somebody else to build.
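For reference, a minimal build.sbt for an sbt-based project might look like this (the name and versions are only an example); IntelliJ's SBT import picks these settings up and re-imports them when the file changes:

name := "my-project"

version := "0.1.0-SNAPSHOT"

scalaVersion := "2.11.8"

// dependencies are declared here instead of being configured in the IDE
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.6" % "test"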
While iterating on my compiler plugin's code, I publish the plugin to my local Ivy repository after each compilation (via publishLocal), and then run my other project, where a dependency on this plugin is defined via addCompilerPlugin. Is there a more concise practice for developing a compiler plugin?
Of course, I could aggregate the two into a multi-project build definition. But it might be nice to learn of more lightweight practices for iterating plugin code...
Could I at the very least depend on the compiler plugin without turning it into a library? From the syntax permitted by addCompilerPlugin, it looks like a library must (?) be created and added, rather than allowing a dependency on mere class files.
Look at what I do in the scapegoat plugin, where I create a 'test' compiler. I use this to compile code snippets in the form of unit tests.
This way you can write code and run your tests, as you would normally, without needing to publish externally.
https://github.com/sksamuel/scalac-scapegoat-plugin/blob/master/src/test/scala/com/sksamuel/scapegoat/PluginRunner.scala
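The idea, very roughly, is to instantiate a compiler inside the test suite with your plugin registered directly, so no jar or publishLocal is needed. This is only a sketch in the spirit of that file; MyPlugin stands in for your plugin class:

import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.plugins.Plugin
import scala.tools.nsc.reporters.ConsoleReporter
import scala.reflect.internal.util.BatchSourceFile

class PluginTestCompiler {
  private val settings = new Settings
  settings.usejavacp.value = true // compile against the test classpath

  private val reporter = new ConsoleReporter(settings)

  val compiler: Global = new Global(settings, reporter) {
    // register the plugin without packaging it as a jar / using -Xplugin
    override protected def loadRoughPluginsList(): List[Plugin] =
      new MyPlugin(this) :: super.loadRoughPluginsList()
  }

  def compile(code: String): Unit = {
    val run = new compiler.Run
    run.compileSources(List(new BatchSourceFile("test.scala", code)))
  }
}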
A scala-2.11 folder appeared after a recent update of IDEA and the Scala plugin.
What should it be used for?
Usually such directories are used for binary-version-dependent code. For example, macros in 2.10 are not source-compatible with macros in 2.11, so if you're building your project for different binary versions and you're using macros, it makes sense to put the code that is only valid for a specific version in different source roots. SBT will then use the appropriate directory when compiling for 2.10 or 2.11.
If you're using SBT, though, you would need to set such a thing up manually in the build definition (see the sketch below). If you're not using SBT, then probably the IDEA plugin was updated to handle such things by itself.
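For the SBT case, one way to add such a binary-version-specific source root looks roughly like this (sbt 0.13 syntax; the versions are only an example):

crossScalaVersions := Seq("2.10.4", "2.11.2")

// adds src/main/scala-2.10 or src/main/scala-2.11 depending on the version being built
unmanagedSourceDirectories in Compile += {
  (sourceDirectory in Compile).value / ("scala-" + scalaBinaryVersion.value)
}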
I occasionally play with Scala forks and sometimes need to debug these forks on SBT projects. In general, scalaHome works great, but there are a few things that I'd like to find better ways to achieve.
1) Is it possible to have SBT pick up custom scalac class files produced by the ant quick build rather than jar files emitted by the ant pack build? The latter implies 5-10 seconds of additional delay per build, so it'd be great to avoid it.
2) Even in big projects, problems exhibited by scalac usually manifest themselves when compiling single files. Is there a way to tell sbt to bypass its change-tracking heuristics and recompile just a single file? What I would particularly like to prevent is recompilation of the whole world when I recompile scalaHome or change scalac flags.
3) Would it be possible to have sbt hot reload scalac classes coming from scalaHome, when scalaHome gets recompiled? Currently I have to shutdown and restart sbt to apply the changes.
1) No, this would make sbt depend on the details of the Scala build. If Scala were built with sbt, you might be able to depend on Scala as a source dependency, or at least this could probably be supported without too many changes.
2) No, see https://github.com/sbt/sbt/issues/604
3) sbt 0.13 should check the last modified times of the jars coming from scalaHome and use a new class loader. It is a bug if it does not.
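For reference, pointing a build at a locally built compiler is just a setting like the following (a sketch; the path is wherever your ant pack build ends up, so adjust it to your checkout):

// build.sbt -- use a locally built Scala instead of a released version
scalaHome := Some(file("/path/to/scala/build/pack"))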
I have recently completed the Scala course on Coursera, and since then I have been looking forward to getting my hands dirty with Scala again. I have written code for some years, but I was neither educated as a programmer nor do I work as one, so it took me a while to get a good opportunity. Now that I have some time to invest and a good project to work on, it's time...
Except I can't seem to get things set up properly, which I find really frustrating. I have OpenJDK 1.7.0_25 running on my Linux machine. I have downloaded and installed the bundled Scala IDE build for Eclipse (just like we used in the course), and I got ScalaTest both as a jar file and as the Eclipse plug-in.
I have a simple project (so far) and no matter what I do I can't seem to get my builds and tests in order. First off, how exactly am I supposed to set up my project so that my classes and tests actually run properly? All the assignments we got were projects that had the same structure, so do I have to have the:
project
|--src
   |--main
   |  |--scala
   |--test
      |--scala
structure? If so, why is it not the default way the project is set up when I create a new project? Do I create these folders manually, as packages, or as source folders? The whole thing gets pretty murky...
I should mention that I tried to "Mavenize" the project using the contextual menu in Eclipse and added my ScalaTest dependency. The first thing that happened is that I got compile errors at every point of dependency in my code. So clearly the library is not visible; in other words, Maven does not seem to be doing much managing. I thought the whole point of Maven was to fetch and maintain dependencies as the project evolves. I concluded that I do not fully understand the way Maven works, and thus I eventually gave up on Maven, once again, and went back to doing things manually.
Secondly, I can't seem to run my tests; the Run As... menu does not include ScalaTest, as mentioned in the documentation of the ScalaTest Eclipse plug-in. I have double-checked that the plugin is installed. If I instead try to run using JUnitRunner, my tests are not recognized as valid tests. I have JUnit and ScalaTest on my build path, so it's got to be something else.
I suppose my overarching question is as follows:
given the Scala IDE build of Eclipse and ScalaTest, exactly how am I supposed to set up my project (in Eclipse) so that I can just focus on writing my code and testing it, and hopefully not have any other headaches?
I work alone, and this project is not a product I need to deliver to some client. In other words I do not need to adhere to strict professionalism here. Honestly I just want to be able to code, get better acquainted with Scala and hopefully build a small data analysis tool that I will be using from time to time.
Thanks in advance!
Try using the sbt eclipse plugin:
https://github.com/typesafehub/sbteclipse
This of course assumes that you use sbt as your build tool. If you don't at the moment, you can find instructions on installation and usage here: http://www.scala-sbt.org/
Personally, I've been using the Typesafe giter8 template (https://github.com/typesafehub/scala-sbt.g8) to set up my Scala projects, and then I use the sbt plugin mentioned above to generate Eclipse project files.
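For example, with sbt installed, the plugin is enabled with one line in project/plugins.sbt (check the sbteclipse README for the current version number; the one below is only an example):

// project/plugins.sbt
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")

After that, running sbt eclipse generates the .project and .classpath files that Eclipse/Scala IDE can import.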
Scala tooling is somewhat Maven-based (sometimes implicitly); that's why you use that structure.
The easiest way, I think, is to create a simple sbt build or Maven POM and generate the Eclipse project configuration from it (like with sbt eclipse). There you can set the dependencies (like the actual versions of JUnit and ScalaTest to use), so you can use the ScalaTest plugin easily.
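Once the dependencies are in place, a minimal suite under src/test/scala is enough to check that the setup works (a sketch; with the ScalaTest Eclipse plugin installed it should show up under Run As > ScalaTest):

import org.scalatest.FunSuite

// a minimal ScalaTest suite living in src/test/scala
class ArithmeticSuite extends FunSuite {

  test("addition works") {
    assert(1 + 1 === 2)
  }
}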
In case of other issues, feel free to ask on the ScalaTest mailing list; Chee Seng and Bill Venners can help you a lot there.
The Scala IDE website has full documentation on how to run unit testing frameworks with the IDE, so have a look! If you find missing elements, the bug tracker of the Scala IDE project is here.