What are configurations in Gradle?

When working with dependency resolution in Gradle, you usually see something like this:
configurations {
    optional
    compile
    runtime.extendsFrom compile
    testCompile.extendsFrom runtime
}
and I wanted to know what type optional or compile is. Is it a class? A string? What methods can I call on it?
Besides all this, is there a way to find out these things automatically, similar to ctrl+space when on something in eclipse?

They are classes that implement org.gradle.api.artifacts.Configuration. The Gradle DSL documentation also contains more information about the Configuration core type.
To find out more about internal classes, which is useful when looking up classes and methods in the Gradle javadoc, it is often as simple as printing out the class names. Quite often you will end up with an internal implementing class instead of the API interface you're interested in, but regardless, it's a way to get started on what to search for. I tend to keep the source code of all the open source projects we use available in the IDE. That way it's easy to jump into the relevant class (even when it's not reachable through context shortcuts) and look around.
To get more information about configurations in your case, you could add a task that simply prints out the relevant info. E.g. something like:
task configInfo << {
    println "configurations.class: ${configurations.class}"
    println "configurations.compile class: ${configurations.compile.class}"
    println "implements ${Configuration} interface? ${configurations.compile instanceof Configuration}"
}
which in my case results in the following output
$ gradle configInfo
:configInfo
configurations.class: class org.gradle.api.internal.artifacts.configurations.DefaultConfigurationContainer_Decorated
configurations.compile class: class org.gradle.api.internal.artifacts.configurations.DefaultConfiguration_Decorated
implements interface org.gradle.api.artifacts.Configuration interface? true

I am no Gradle expert, but this seems like a simple getter delegated to another object in a DSL fashion. You could write the same with something like this:
class MyDsl {
    def config = [:].withDefault { false }

    void configure(closure) {
        closure.delegate = this
        closure()
    }

    def getOptional() { config.optional = true }
    def getCompile() { config.compile = true }
    def getTest() { config.test = true }
}
dsl = new MyDsl()
dsl.configure {
    optional
    compile
}
dsl.config.with {
    assert optional
    assert compile
    assert !test
}
You could return some specific object from those getters to pass to the runtime.extendsFrom() method.
For auto-complete, IIRC that's what Groovy-Eclipse DSLDs (DSL descriptors) are for. You may want to try the Gradle DSLD that comes with the eclipse-integration-gradle plugin; as per this ticket, it was added long ago.

The question "what type is optional or compile" isn't really valid. That is kind of like asking what type does "instanceof" have. The instanceof keywword doesn't have a type.
When writing code like the snippet you cited, you are taking advantage of a DSL. Treat words like compile and optional as keywords in that DSL. Unless you are writing your own DSL (as opposed to taking advantage of an existing one, which is what this question is about), don't think of types as being associated with those things.
As for the question about ctrl+space, Eclipse won't do anything special with that in this context unless you are using a plugin which provides support for that. Even with plugin support there will still be limitations because you can define your own configurations. If you were going to define a configuration named "jeffrey" and you typed "jeff" followed by ctrl+space, there is no way for the IDE to know you want it to turn that into "jeffrey".
I hope that helps.

Related

Scalastyle "Public method must have explicit type" in Play Framework

We've started experimenting with Scala and the Play framework at my work. We set up our auto-linting and testing framework as the first thing, and have deployed Scalastyle to handle the former.
That has been very useful, except that we are getting this specific lint error that we are finding difficult to resolve in a good way. A simple example is this:
def helloWorld = Action { req =>
  Ok("Hello World!")
}
Though often it can be much more complex, of course (to the point where it can be difficult to figure out what the type actually is).
In either case, this gives us the "Public method must have explicit type" error from Scalastyle.
Unfortunately, setting the expected explicit type here seems typically to cause a syntax error.
Any suggestions on a good solution for this? Or do we just have to turn off this check for Play projects?
I'd suggest either turning the org.scalastyle.scalariform.PublicMethodsHaveTypeChecker rule off completely for your project, or marking your controllers as ignored by this rule (here you'll find info on how to do this).
In the end this check benefits mostly people who write libraries (as it helps to be more explicit about the API one provides). I found that when you're working on "real" projects a check like this does nothing but add boilerplate and stop you from leveraging type inference.
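If you prefer not to disable the rule project-wide, Scalastyle also understands comment filters that switch a rule off for a region of code. A rough sketch (the controller name is made up, and the rule id public.methods.have.type should be double-checked against your Scalastyle version's documentation):
package controllers

import play.api.mvc._

// Hypothetical controller; the scalastyle:off/on comments disable the named
// rule only for the enclosed region instead of for the whole project.
class HelloController extends Controller {
  // scalastyle:off public.methods.have.type
  def helloWorld = Action { req =>
    Ok("Hello World!")
  }
  // scalastyle:on public.methods.have.type
}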
I hope this helps. Go to Settings -> Editor -> Scala -> Type Annotations. Change the value to 'Add' instead of 'Add & Check' for public values and methods. Then the IDE will not show that warning anymore.
I've found a better way of removing the "Public method must have explicit type" message without turning the check off.
When defining these methods, the return type and the (implicit) request type may be given explicitly; as Action[JsValue] and implicit RequestHeader, for example.
Code example:
def helloWorld: Action[JsValue] = Action(parse.json) {
  // parse.json gives a JsValue body, matching the declared Action[JsValue]
  implicit req: RequestHeader =>
    Ok("Hello World!")
}
or
def helloWorld: Action[AnyContent] = Action {
  implicit req: RequestHeader =>
    Ok("Hello World!")
}

Does Scala have a global object or class?

I know programmers are supposed to wrap their code in an application object:
object Hello extends App {
  println("Hello, World")
}
It is required in Eclipse if I ever want to get any output. However, when I tried to write some code (very casually) in Emacs, I wrote it like this:
class Pair[+T](val first: T, val second: T)

trait Friend[-T] {
  def befriend(someone: T)
}

def makeFriendWith(s: Student, f: Friend[Student]) {
  f.befriend(s)
}
It seems like there is no universal object or class that wraps over the function makeFriendWith. Is Scala like JavaScript, where everything is attached to a global object? If not, what is this function attached to?
Also, why does this work in the console (I compiled it with the scala command and it worked) but not in Eclipse? What's the use of the Application object?
Scala doesn't have top-level defs, but your script can be run by either the REPL or the scala script runner.
The precise behavior of your script depends on which way you run it.
The REPL can run scripts line-by-line or whole hog. (Compare :paste and :paste -raw versus :load or -i init.script and the future option -I init.script.)
There is an issue about sensitive scripting. The script runner should realize if you're trying to run an App.
There is another effort to make scripting a compiler phase that is easily customized. Scroll to Scripter.scala for code comments about its current heuristics.
In short, your defs must be wrapped in a top-level entity, but exactly how that happens is context-dependent.
There was a recent effort to make an alternative baked-in wrapping scheme available for the REPL.
None of this is mandated by the language spec, any more than special rules pertaining to sbt build files are defined by the language.
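For illustration, here is a minimal sketch of one such wrapping: the defs from the question placed inside a top-level object so the file compiles as a regular source file (Student is assumed to be defined elsewhere in the asker's code, so a stub is included here):
class Student(val name: String)

class Pair[+T](val first: T, val second: T)

trait Friend[-T] {
  def befriend(someone: T): Unit
}

// Defs cannot be top-level, so they live in an object (or a package object).
object Friends {
  def makeFriendWith(s: Student, f: Friend[Student]): Unit = f.befriend(s)
}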
You can define methods like this only in the console, which (behind the scenes) automatically wraps them in an anonymous class for you.
Outside of the console, there's no such luxury.
As a JVM language, Scala cannot truly create any top-level entities other than classes and interfaces.
It does, however, have the notion of a "package object", which creates the illusion of value entities (val, var and def) not enclosed in a class or trait.
See http://www.scala-lang.org/docu/files/packageobjects/packageobjects.html for information on package objects.
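For example, a minimal sketch of a package object (the package name mylib is made up for illustration):
// File: src/main/scala/mylib/package.scala
package object mylib {
  // These look like top-level vals/defs, but compile to members of a
  // synthetic object named `package` inside the mylib package.
  val greeting: String = "Hello, World"
  def shout(s: String): String = s.toUpperCase
}
Client code can then call mylib.shout(mylib.greeting) as if those members were free-standing.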
You can run code like this directly in Eclipse: use a Scala worksheet. The IntelliJ IDEA Scala plugin supports worksheets as well.

Groovy referencing variable without declaration

Why doesn't eclipse show an error when I use a variable without declaring it?
Edit:
AFAIK, being dynamic only means that the type of a variable is not known until run time. Variables must still be defined (explicitly or implicitly) before being used. For example, Python, which is also a dynamic language, reports this as an error.
Edit2:
How does Groovy interpret this code so that it still isn't an error?
Because in dynamic languages like Groovy, one could have implemented methodMissing() / propertyMissing(). So although such a variable or method does not actually exist, it may still not be an error until the program is actually run. Such errors can usually only be detected at runtime, and hence IDEs usually don't complain about them.
As a hint, though, Eclipse underlines such variables when it cannot statically resolve them.
EDIT :
To explain the concept with a code example, just check the method test below. The IDE can't know whether something, that, etc. might actually be resolved by this class at runtime.
This vastly helps in building DSLs in groovy.
class TestClass {
    def test() {
        def a = something.that.didnt.exist()
        // or how about some random statements that make no sense at all, e.g.
        // a = new Parser().doSomething()
    }

    def propertyMissing(String name) { println "$name"; return this }
    def methodMissing(String name, args) { println "$name with $args"; return this }
}

new TestClass().test()
I think you may try using the @CompileStatic annotation on the method. Then Eclipse will compile it statically and report errors at compile time, i.e. while you develop. I don't have Eclipse at hand to check this right now, so this is just a proposal.

How do you do dependency injection with the Cake pattern without hardcoding?

I just read and enjoyed the Cake pattern article. However, to my mind, one of the key reasons to use dependency injection is that you can vary the components being used by either an XML file or command-line arguments.
How is that aspect of DI handled with the Cake pattern? The examples I've seen all involve mixing traits in statically.
Since mixing in traits is done statically in Scala, if you want to vary the traits mixed in to an object, create different objects based on some condition.
Let's take a canonical cake pattern example. Your modules are defined as traits, and your application is constructed as a simple object with a bunch of functionality mixed in:
val application =
  new Object
    with Communications
    with Parsing
    with Persistence
    with Logging
    with ProductionDataSource

application.startup
Now all of those modules have nice self-type declarations which define their inter-module dependencies, so that line only compiles if all your inter-module dependencies exist, are unique, and are well-typed. In particular, the Persistence module has a self-type which says that anything implementing Persistence must also implement DataSource, an abstract module trait. Since ProductionDataSource inherits from DataSource, everything's great, and that application construction line compiles.
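For reference, a minimal sketch of what such a self-type declaration might look like (the names follow the example above; the bodies are just stubs):
trait DataSource {
  def query(sql: String): List[String]
}

trait ProductionDataSource extends DataSource {
  def query(sql: String): List[String] = Nil // talk to the real database here
}

// The self-type: anything mixing in Persistence must also be a DataSource.
trait Persistence { this: DataSource =>
  def save(record: String): Unit = {
    query("INSERT INTO records VALUES ('" + record + "')")
  }
}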
But what if you want to use a different DataSource, pointing at some local database for testing purposes? Assume further that you can't just reuse ProductionDataSource with different configuration parameters, loaded from some properties file. What you would do in that case is define a new trait TestDataSource which extends DataSource, and mix it in instead. You could even do so dynamically based on a command line flag.
val application =
  if (test)
    new Object
      with Communications
      with Parsing
      with Persistence
      with Logging
      with TestDataSource
  else
    new Object
      with Communications
      with Parsing
      with Persistence
      with Logging
      with ProductionDataSource

application.startup
Now that looks a bit more verbose than we would like, particularly if your application needs to vary its construction on multiple axes. On the plus side, you usually only have one chunk of conditional construction logic like that in an application (or at worst one per identifiable component lifecycle), so at least the pain is minimized and fenced off from the rest of your logic.
Scala is also a scripting language, so your configuration "XML" can simply be a Scala script. It is type-safe and not a different language.
Simply look at startup:
scala -cp first.jar:second.jar startupScript.scala
is not so different than:
java -cp first.jar:second.jar com.example.MyMainClass context.xml
You can always use DI, but you have one more tool.
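For illustration, a hedged sketch of what such a startup script might contain (the trait names and application.startup are borrowed from the cake example above; args is provided by the script runner):
// startupScript.scala -- run with: scala -cp first.jar:second.jar startupScript.scala [--test]
val application =
  if (args.contains("--test"))
    new Object with Communications with Parsing with Persistence with Logging with TestDataSource
  else
    new Object with Communications with Parsing with Persistence with Logging with ProductionDataSource

application.startup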
The short answer is that Scala doesn't currently have any built-in support for dynamic mixins.
I am working on the autoproxy-plugin to support this, although it's currently on hold until the 2.9 release, when the compiler will have new features making it a much easier task.
In the meantime, the best way to achieve almost exactly the same functionality is by implementing your dynamically added behavior as a wrapper class, then adding an implicit conversion back to the wrapped member.
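A rough sketch of that wrapper-plus-implicit-conversion idea, under made-up names (this is not the autoproxy-plugin itself):
object WrapperSketch {
  trait Module { def foo: Int }
  class Impl extends Module { def foo = 1 }

  // The "dynamically added" behaviour lives in a plain wrapper class...
  class AuditedModule(val underlying: Module) {
    def fooAudited: Int = { println("calling foo"); underlying.foo }
  }

  // ...with implicit conversions to the wrapper and back to the wrapped member.
  implicit def audit(m: Module): AuditedModule = new AuditedModule(m)
  implicit def unwrap(w: AuditedModule): Module = w.underlying

  def demo(): Int = {
    val m: Module = new Impl
    m.fooAudited // the implicit conversion supplies the extra method
  }
}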
Until the AutoProxy plugin becomes available, one way to achieve the effect is to use delegation:
trait Module {
  def foo: Int
}

trait DelegatedModule extends Module {
  var delegate: Module = _
  def foo = delegate.foo
}

class Impl extends Module {
  def foo = 1
}

// later
val composed: Module with ... with ... = new DelegatedModule with ... with ...
composed.delegate = choose() // choose is linear in the number of `Module` implementations
But beware, the downside of this is that it's more verbose, and you have to be careful about the initialization order if you use vars inside a trait. Another downside is that if there are path dependent types within Module above, you won't be able to use delegation that easily.
But if there is a large number of different implementations that can be varied, it will probably cost you less code than listing cases with all possible combinations.
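For completeness, a script-style sketch of what a choose() along these lines might look like, building on the definitions above (the system property name and TestImpl are made up for illustration):
class TestImpl extends Module {
  def foo = 2
}

// Picks a concrete Module implementation at runtime; one case per implementation.
def choose(): Module =
  sys.props.get("module.impl") match {
    case Some("test") => new TestImpl
    case _            => new Impl
  }

val composed = new Object with DelegatedModule
composed.delegate = choose()
println(composed.foo) // 1 by default, 2 when run with -Dmodule.impl=test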
Lift has something along those lines built in. It's mostly in scala code, but you have some runtime control. http://www.assembla.com/wiki/show/liftweb/Dependency_Injection

dynamically create class in scala, should I use interpreter?

I want to create a class at run time in Scala. For now, just consider a simple case where I want to make the equivalent of a Java bean with some attributes that I only know at run time.
How can I create the Scala class? I am willing to create it from a Scala source file if there is a way to compile and load it at run time; I may want to, as I sometimes have some complex functions I want to add to the class. How can I do it?
I worry that the Scala interpreter which I read about sandboxes the interpreted code that it loads, so that it won't be available to the general application hosting the interpreter. If this is the case, then I wouldn't be able to use the dynamically loaded Scala class.
Anyway, the question is: how can I dynamically create a Scala class at run time and use it in my application? The best case is to load it from a Scala source file at run time, something like interpreterSource("file.scala") with the result loaded into my current runtime; the second best case is creating it by calling methods, i.e. createClass(...), at runtime.
Thanks, Phil
There's not enough information to know the best answer, but do remember that you're running on the JVM, so any techniques or bytecode engineering libraries valid for Java should also be valid here.
There are hundreds of techniques you might use, but the best choice depends totally on your exact use case, as many aren't general purpose. Here's a couple of ideas though:
- For a simple bean, you may as well just use a map, or look into the DynaBean class from Apache Commons.
- For more advanced behaviour you could invoke the compiler explicitly and then grab the resulting .class file via a classloader (this is largely how JSPs do it).
- A parser and custom DSL fit well in some cases, as does BeanShell scripting.
Check out the ScalaDays video here: http://days2010.scala-lang.org/node/138/146
which demonstrates the use of Scala as a JSR-223 compliant scripting language.
This should cover most scenarios where you'd want to evaluate Scala at runtime.
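As a rough sketch of the JSR-223 route (this assumes a Scala script engine is registered on the classpath under the name "scala"; how that is packaged depends on your Scala version and setup, and eval typically returns the value of the last expression):
import javax.script.ScriptEngineManager

object DynamicBeanDemo {
  def main(args: Array[String]): Unit = {
    val engine = new ScriptEngineManager().getEngineByName("scala")
    if (engine == null) sys.error("No JSR-223 Scala engine found on the classpath")

    // Evaluate Scala source at runtime; here it defines and instantiates a bean-like class.
    val bean = engine.eval(
      """case class Person(name: String, age: Int)
        |Person("Phil", 42)
      """.stripMargin)

    println(bean) // Person(Phil,42)
  }
}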
You'll also want to look at the email thread here: http://scala-programming-language.1934581.n4.nabble.com/Compiler-API-td1992165.html#a1992165
This contains the following sample code:
// We currently call the compiler directly
// To reduce coupling, we could instead use ant and the scalac ant task
import java.lang.reflect.{Method, Modifier}
import java.net.{URL, URLClassLoader}

import scala.tools.nsc.{Global, Settings}
import scala.tools.nsc.reporters.ConsoleReporter
{
  // called in the event of a compilation error
  def error(message: String): Nothing = ...

  val settings = new Settings(error)
  settings.outdir.value = classesDir.getPath
  settings.deprecation.value = true // enable detailed deprecation warnings
  settings.unchecked.value = true // enable detailed unchecked warnings

  val reporter = new ConsoleReporter(settings)
  val compiler = new Global(settings, reporter)
  (new compiler.Run).compile(filenames)

  reporter.printSummary
  if (reporter.hasErrors || reporter.WARNING.count > 0) {
    ...
  }
}
val mainMethod: Method = {
  val urls = Array[URL]( classesDir.toURL )
  val loader = new URLClassLoader(urls)
  try {
    val clazz: Class[_] = loader.loadClass(...)
    val method: Method = clazz.getMethod("main", classOf[Array[String]])
    if (Modifier.isStatic(method.getModifiers)) {
      method
    } else {
      ...
    }
  } catch {
    case cnf: ClassNotFoundException => ...
    case nsm: NoSuchMethodException => ...
  }
}

mainMethod.invoke(null, args)