Maven dependency for GeneratedMockFactory - scala

What are the correct Maven dependencies to declare if I want to use org.scalamock.generated.GeneratedMockFactory?
I have a Scala project which depends on scalatest_2.10 version 2.0.M5B and scalamock-scalatest version 3.0.1, and it looks like the org.scalamock.generated package is in neither of them.
Kind regards,
Marco

org.scalamock.generated.GeneratedMockFactory is a trait generated by the ScalaMock compiler plugin in ScalaMock 2 (for Scala 2.9 or older). In ScalaMock 3 (for Scala 2.10/2.11), the compiler plugin has been replaced by macros, so ScalaMock now supports the following two types of mocks:
Macro mocks, using org.scalamock.scalatest.MockFactory
Proxy mocks, using org.scalamock.scalatest.proxy.MockFactory
Please note that macro mocks may fail (at compile time) when trying to mock some complex traits, but they are fully type-checked and have nicer syntax - so according to ScalaMock's author, it's a good idea to use macro mocks as much as possible and fall back to proxy mocks when they don't work. He also has a nice step-by-step guide to using ScalaMock 3 (with macro mocks) here.
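
To make the macro-mock flavour concrete, here is a minimal sketch of a ScalaMock 3 macro mock used from ScalaTest. The Greeter trait and the expectations are invented for illustration; only the mock[...] / expects / returning calls are the documented ScalaMock 3 API:

    import org.scalatest.FlatSpec
    import org.scalamock.scalatest.MockFactory

    // A made-up collaborator to mock; not part of the original question.
    trait Greeter {
      def greet(name: String): String
    }

    class GreeterSpec extends FlatSpec with MockFactory {
      "A Greeter" should "be mockable with a macro mock" in {
        val greeter = mock[Greeter]   // macro mock, fully type-checked
        (greeter.greet _).expects("Marco").returning("Hello, Marco")
        assert(greeter.greet("Marco") == "Hello, Marco")
      }
    }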


Compilation / Code Generation of External Scala DSL

My understanding is that it is quite simple to create & parse an external DSL in Scala (e.g. representing rules). Is my assumption correct that the DSL can only be interpreted at runtime, and that there is no support for code generation (like ANTLR) for achieving better performance?
EDIT: To be more precise, my question is whether I could achieve this (create an external domain-specific language and generate Java/Scala code) with built-in Scala tools/libraries (e.g. http://www.artima.com/pins1ed/combinator-parsing.html), rather than writing a whole parser / code generator completely by myself in Scala. It's also clear that you can achieve this with third-party tools, but then you have to learn additional stuff and take on additional dependencies. I'm new to the area of implementing DSLs, so I have no gut feeling yet for when to use external tools like ANTLR and what you can (with reasonable effort) do with Scala's on-board facilities.
Is my assumption correct that the DSL can only be interpreted at runtime, and that there is no support for code generation (like ANTLR) for achieving better performance?
No, this is wrong. It is possible to write a compiler in Scala; after all, Scala is Turing-complete (i.e. you can write anything), and you don't even need Turing-completeness for a compiler.
Some examples of compilers written in Scala include
the Scala compiler itself (in all its variations, Scala-JVM, Scala.js, Scala-native, Scala-virtualized, Typelevel Scala, the abandoned Scala.NET, …)
the Dotty compiler
Scalisp
Scalispa
… and many others …
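
To illustrate that the built-in tooling mentioned in the question (combinator parsing) can drive code generation rather than pure runtime interpretation, here is a minimal sketch. The rule syntax, the Rule case class and the generated source are all invented for the example; only scala.util.parsing.combinator is standard library:

    import scala.util.parsing.combinator.JavaTokenParsers

    // Hypothetical external rule, e.g. "discount 10 when total > 100"
    case class Rule(percent: Int, field: String, op: String, limit: Int)

    object RuleParser extends JavaTokenParsers {
      def rule: Parser[Rule] =
        "discount" ~> wholeNumber ~ ("when" ~> ident) ~ (">" | "<") ~ wholeNumber ^^ {
          case pct ~ field ~ op ~ limit => Rule(pct.toInt, field, op, limit.toInt)
        }

      def parseRule(input: String): Rule = parseAll(rule, input).get
    }

    object RuleCodeGen {
      // Emit Scala source from the parsed rule instead of interpreting it at runtime.
      def generate(r: Rule): String =
        s"""object GeneratedRule {
           |  def discount(${r.field}: Int): Int =
           |    if (${r.field} ${r.op} ${r.limit}) ${r.percent} else 0
           |}""".stripMargin
    }

    object Demo extends App {
      val rule = RuleParser.parseRule("discount 10 when total > 100")
      println(RuleCodeGen.generate(rule))   // compile this output ahead of time
    }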

Multiple scala versions in the same project

Apologies if this is a duplicate, I didn't hit on the magic keyword while searching.
I have a project where I pull in various dependencies. One of them (jooq) depends on Scala 2.10, whereas my application depends on Scala 2.11.x.
Although everything "works", I would like to understand better what are the runtime implications of doing something like this? How will the JVM resolve the different dependencies, and what type of overhead could I be looking at?
I am trying to determine if it's worthwhile to fork jooq, and compile it against 2.11 (assuming it will compile and work under 2.11).
Scala is not binary compatible between major versions (2.10 to 2.11 for example). This means that there are no guarantees that a library that is compiled for Scala 2.10 will work in a project using 2.11. You might be lucky enough that it works, but I would definitely not depend on that luck for any important codebase.
This is the reason why Scala libraries always have the Scala version in their name, and why sbt has special syntax for dependencies to pick the right library build for the Scala version in use.
On a side note, Martin Odersky (Scala's "father") has been proposing a solution to this problem over the past year: storing an intermediate representation along with the bytecode to allow automagic recompilation for a newer Scala version.
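
As a concrete (illustrative) example of that sbt syntax: %% appends the Scala binary version to the artifact name, while a plain % uses the artifact name verbatim. The version numbers below are placeholders, not recommendations:

    libraryDependencies ++= Seq(
      // resolves to scalatest_2.11 when scalaVersion is 2.11.x
      "org.scalatest" %% "scalatest" % "2.2.4" % "test",
      // plain Java artifact, no Scala version suffix appended
      "org.jooq" % "jooq" % "3.6.2"
    )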
There is a possible danger of runtime exceptions.
As Scala 2.10 and 2.11 are quite similar, the danger is not as big as it was from 2.9 to 2.10 or 2.8 to 2.9, but it is still there, and if this is meant to be production code, you should definitely try to bump jooq to 2.11.

Testing Scala compile-time behavior with sbt

Testing runtime behavior is very well documented, but with the advent of powerful type systems and macro systems, one might be interested in testing compile-time behavior.
For instance when writing a library that provides compile-time guarantees. Say I'm building a set of test matchers and I want to make sure a matcher is as type-safe as I claim it to be.
List(1,2) must beEqualTo(Set(1,2)) // should fail at compile-time
I can see in the Scala compiler project that most of the tests are functional tests where the compiler output is asserted by comparing it with a reference file.
Is there a convention for such tests? An SBT plugin?
Thanks
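
For illustration (not from the original thread): recent ScalaTest versions (2.2+) provide an assertTypeError macro that takes the code under test as a string literal and passes only if that code fails to type-check, which is one way to express such a check without a reference-file harness:

    import org.scalatest.FunSuite

    class CompileTimeBehaviorSpec extends FunSuite {
      test("assigning a String to an Int should not type-check") {
        // passes only if the quoted snippet fails type-checking
        assertTypeError("val x: Int = \"not an int\"")
      }
    }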

Whether it is possible to find (at runtime) all subclasses (which mix in some trait) using Scala 2.10

I need to find all subclasses which mix in some trait (I want to do this at runtime). I know of a tool written in Scala (ClassUtil), but this tool is slow. I also know of a tool written in Java (faster than ClassUtil), but given the choice I would rather not use external libraries - so my question is: does Scala 2.10 have support for resolving my problem?

Scala and Aspects

Can Scala and Aspects be used together? Are there benefits in this case?
Thanks
Scala is just like Java in this respect; if you mean, for example, Spring-like AOP, I'm sure that annotations work in Scala just as they do in Java.
On the other hand, the fact that Scala has closures (and Java doesn't) makes AOP less interesting.
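
A minimal sketch of that point (not from the original answer): a cross-cutting concern such as logging can be expressed with an ordinary higher-order function instead of an aspect. The names here are invented for the example:

    object LoggingAdvice {
      // Wrap any function with "around" advice implemented as a closure.
      def logged[A, B](name: String)(f: A => B): A => B = { a =>
        println(s"entering $name with $a")
        val result = f(a)
        println(s"leaving $name with $result")
        result
      }
    }

    object ClosureDemo extends App {
      val square: Int => Int = x => x * x
      val loggedSquare = LoggingAdvice.logged("square")(square)
      println(loggedSquare(4))   // logs entry/exit, then prints 16
    }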
In fact the Scala IDE for Eclipse uses Aspects (because the JDT assumes Java):
From Scala Support in Eclipse - Monkey-patching the JDT for fun and profit?, p16 by Miles Sabin
AspectJ and Equinox Aspects
A collection of aspects is effectively a patch
AspectJ was used to retrofit the desired extensibility features to the JDT and expose them via public API
The key modification:
The JDT's CompilationUnit is the entry point to its internal model, but it assumes Java source
An aspect can turn its constructor into a factory method
So the answer is Yes, it is possible. I have to agree with Pablo that it's less attractive than in Java.
Fakod has some examples for AspectJ here
Real-World Scala: Managing Cross-Cutting Concerns using Mixin Composition and AOP
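
For completeness, a sketch of the mixin-composition side referenced in that article: a cross-cutting concern expressed as a stackable trait. The Service/CoreService names are made up for the example:

    trait Service {
      def process(input: String): String
    }

    class CoreService extends Service {
      def process(input: String): String = input.toUpperCase
    }

    // The cross-cutting concern, layered in via abstract override.
    trait Logging extends Service {
      abstract override def process(input: String): String = {
        println(s"processing: $input")
        val result = super.process(input)
        println(s"produced: $result")
        result
      }
    }

    object MixinDemo extends App {
      val service = new CoreService with Logging
      println(service.process("hello"))   // logs, then prints HELLO
    }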