Mill: How to add additional Resources to a module - scala

I have some files outside my module that I need on the classpath for testing.
Listing all possibilities (mill resolve tests._), I think extending resources is the way to go.
I have tried a lot; here is my last attempt:
object test extends Tests {
  override def resources =
    new Sources(
      {
        super.resources.self.map(_ :+ (millSourcePath / up / 'data / 'global / 'bpmn))
      },
      super.resources.ctx
    )
  ...
}
Is overriding resources the way to go?
How is it done correctly?

resources is a "task of sources" as defined here. Thus, in order to add something to the resources path you can do
override def resources = T.sources {
  super.resources() :+ PathRef(millSourcePath / up / 'data / 'global / 'bpmn)
}
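Putting it together, a minimal sketch of the whole test module could look like this (the data/global/bpmn path is taken from the question; depending on your Mill version, path segments may be written as plain strings, as shown here, instead of symbol literals):
object test extends Tests {
  // Append a directory outside the module to the test resources.
  override def resources = T.sources {
    super.resources() :+ PathRef(millSourcePath / up / "data" / "global" / "bpmn")
  }
  ... // the rest of your test configuration (test framework, dependencies, ...)
}
You can verify the result with Mill's show command (e.g. mill show yourModule.test.resources, with the module path adjusted to your build); it should list the extra path alongside the default resources folder.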

Related

How can I print all the settings in Test configuration for a project build using SBT?

I have a Scala.js project where adding a particular library dependency affects the way the test cases run. Without the library dependency everything is fine; the moment I add it, the tests don't execute. I want to check whether any sbt settings are affected. Is there any way I can print all settings and check?
BuildStructure.data seems to give access to all the settings by scope. We could access it by defining a custom command printAllTestSettings like so:
def printAllTestSettings = Command.command("printAllTestSettings") { state =>
val structure = Project.extract(state).structure
val testScope =
Scope(
Select(ProjectRef(new File("/home/mario/sandbox/hello-world-scala/"), "root")),
Select(ConfigKey("test")),
Zero,
Zero
)
structure
.data
.keys(testScope)
.foreach(key => println(s"${key.label} = ${structure.data.get(testScope, key).get}"))
state
}
commands ++= Seq(printAllTestSettings)
Here is an output snippet:
...
managedSourceDirectories = List(/home/mario/sandbox/hello-world-scala/target/scala-2.12/src_managed/test)
managedResourceDirectories = List(/home/mario/sandbox/hello-world-scala/target/scala-2.12/resource_managed/test)
testLoader = Task((taskDefinitionKey: ScopedKey(Scope(Select(ProjectRef(file:/home/mario/sandbox/hello-world-scala/,root)), Select(ConfigKey(test)), Zero, Zero),testLoader)))
packageBin = Task((taskDefinitionKey: ScopedKey(Scope(Select(ProjectRef(file:/home/mario/sandbox/hello-world-scala/,root)), Select(ConfigKey(test)), Zero, Zero),packageBin)))
...
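If you would rather not hard-code the project path, a variation of the same idea (a sketch, not tested against every sbt version) can take the currently selected project from the command state instead:
def printCurrentTestSettings = Command.command("printCurrentTestSettings") { state =>
  val extracted = Project.extract(state)
  val structure = extracted.structure
  // Scope on whatever project is currently selected instead of a fixed ProjectRef.
  val testScope = Scope(Select(extracted.currentRef), Select(ConfigKey("test")), Zero, Zero)
  structure
    .data
    .keys(testScope)
    .foreach(key => println(s"${key.label} = ${structure.data.get(testScope, key).get}"))
  state
}
commands ++= Seq(printCurrentTestSettings)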

Remove or Exclude WatchSource in sbt 1.0.x

Overview
After looking around the internet for a while, I have not found a good way to omit certain folders from being watched by sbt 1.0.x in a Play Framework application.
Solutions posted for older versions of sbt:
How to exclude a folder from compilation
How to not watch a file for changes in Play Framework
There are a few more, but all more or less the same.
And the release notes for 1.0.2 show that the += and ++= behavior was maintained, but everything else was dropped.
https://www.scala-sbt.org/1.x/docs/sbt-1.0-Release-Notes.html
Source code verifies: https://www.scala-sbt.org/1.0.4/api/sbt/Watched$.html
Would love to see if anyone using sbt 1.0.x has found a solution or workaround to this issue. Thanks!
Taking the same approach sbt uses to exclude managedSources from watchSources, I was able to omit a custom folder from being watched like so:
watchSources := {
val directoryToExclude = "/Users/mgalic/sandbox/scala/scala-seed-project/src/main/scala/dirToExclude"
val filesToExclude = (new File(directoryToExclude) ** "*.scala").get.toSet
val customSourcesFilter = new FileFilter {
override def accept(pathname: File): Boolean = filesToExclude.contains(pathname)
override def toString = s"CustomSourcesFilter($filesToExclude)"
}
watchSources.value.map { source =>
new Source(
source.base,
source.includeFilter,
source.excludeFilter || customSourcesFilter,
source.recursive
)
}
},
Here we use PathFinder to get all the *.scala sources from directoryToExclude:
val filesToExclude = (new File(directoryToExclude) ** "*.scala").get.toSet
Then we create customSourcesFilter using filesToExclude, which we then add to every current WatchSource:
watchSources.value.map { source =>
new Source(
...
source.excludeFilter || customSourcesFilter,
...
)
}
Note that the above solution is just something that worked for me; I do not know what the recommended approach to this problem is.
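If you prefer to exclude everything under a directory rather than enumerating its files once at load time (so files added to the folder later are excluded as well), a sketch along the following lines should also work; the dirToExclude location is just a placeholder:
watchSources := {
  // Hypothetical directory to exclude; adjust to your own layout.
  val excludedDir = baseDirectory.value / "src" / "main" / "scala" / "dirToExclude"
  val excludeByPrefix = new FileFilter {
    // Reject anything whose path lies under excludedDir, even files created later.
    override def accept(pathname: File): Boolean =
      pathname.toPath.startsWith(excludedDir.toPath)
    override def toString = s"ExcludeDirFilter($excludedDir)"
  }
  watchSources.value.map { source =>
    new Source(
      source.base,
      source.includeFilter,
      source.excludeFilter || excludeByPrefix,
      source.recursive
    )
  }
}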

How to write dependsOn in Custom plugin

I have a task in my build.gradle :
task makeJar(type: Copy) {
delete('dist/')
from('build/intermediates/bundles/release')
into('dist/')
include('classes.jar')
def jarName = new VersionName().getNameWithVersion() + '.jar'
rename('classes.jar', jarName)
}
makeJar.dependsOn('clearTask', build)
Now I want to remove this from my build.gradle and create a custom plugin instead, like this in MakeJarTask.groovy (this is in an Eclipse project):
class MakeJarPluginTask extends Copy {
  @TaskAction
  def makeJar() {
    logger.lifecycle("creating a jar *********************")
    delete('dist/')
    from('build/intermediates/bundles/release')
    into('dist/')
    include('classes.jar')
    def jarName = new VersionName().getNameWithVersion() + '.jar'
    rename('classes.jar', jarName)
  }
}
I call this task in callGroovy.class (which implements Plugin):
project.tasks.create("makeJarPlugin1", MakeJarPluginTask.class){
dependsOn("clearDist", "build")
}
But this doesn't give the correct output.
The error is in the last part; I think this is not the correct way to use dependsOn, but I am not able to figure out how to use it. Any help will be highly appreciated.
Task task = project.tasks.create("makeJarPlugin1", MakeJarPluginTask.class);
task.dependsOn("clearDist", "build")

Potential flaw with SBT's IO.zip method?

I'm working on an SBT plugin where I'd like to zip up a directory. This is possible due to the following method in IO:
def zip(sources: Traversable[(File,String)], outputZip: File): Unit
After tinkering with this method, it seems that simply passing it a directory and expecting the resulting zip file to have the same file and folder structure is wrong. Passing a directory (empty or otherwise) results in the following:
[error]...:zipper: java.util.zip.ZipException: ZIP file must have at least one entry
Therefore, it appears that the way to use the zip method is to step through the directory and add each file individually to the Traversable object.
Assuming my understanding is correct, this strikes me as very odd: very rarely do users need to cherry-pick what is to be added to an archive.
Any thoughts on this?
It seems like you can use this to compose a zip with files from multiple places. I can see the use of that in a build system.
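For example, the (File, String) pairs let entries come from anywhere and be placed at arbitrary paths inside the archive. A minimal sketch, assuming build.sbt scope where sbt._ is available (the file names are made up, and newer sbt versions may add an optional timestamp parameter to IO.zip):
// Each pair is (file on disk, path inside the zip), so entries can be
// gathered from several places and rebased freely within the archive.
val entries: Seq[(File, String)] = Seq(
  file("README.md")      -> "docs/README.md",
  file("target/app.jar") -> "lib/app.jar"
)
IO.zip(entries, file("target/bundle.zip"))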
A bit late to the party, but this should do what you need:
val parentFolder: File = ???
val folderName: String = ???
val src: File = parentFolder / folderName
val tgt: File = parentFolder / s"$folderName.zip"
IO.zip(allSubpaths(src), tgt)
Here is some code for zipping directories using sbt's IO class:
IO.withTemporaryDirectory(base => {
  val dirToZip = new File(base, "lib")
  IO.createDirectory(dirToZip)
  IO.write(dirToZip / "test1", "test")
  IO.write(dirToZip / "test2", "test")
  val zip: File = base / "test.zip"
  IO.zip(allSubpaths(dirToZip), zip)
  val out: File = base / "out"
  IO.createDirectory(out)
  IO.unzip(zip, out) mustEqual Set(out / "test1", out / "test2")
  IO.delete((out ** "*").get)
  // Create a zip containing this lib directory, but under a different directory in the zip
  val finder: PathFinder = dirToZip ** "*" --- dirToZip // Remove dirToZip, as you can't rebase a directory to itself
  IO.zip(finder x rebase(dirToZip, "newlib"), base / "rebased.zip")
  IO.createDirectory(out)
  IO.unzip(base / "rebased.zip", out) mustEqual Set(out / "newlib" / "test1", out / "newlib" / "test2")
})
See the docs for tips on creating the Traversable object to pass to IO.zip:
http://www.scala-sbt.org/0.12.2/docs/Detailed-Topics/Mapping-Files.html
http://www.scala-sbt.org/0.12.3/docs/Detailed-Topics/Paths.html

Get package children from a package

In Eclipse, how can I get the package's children?
Consider this example:
+ org.stack
  + org.stack.test
    - StackTest.java
  - Stack.java
When we call IPackageFragment.getChildren() on org.stack, the Eclipse JDT only returns the compilation units (Java files)! But I want all children of a package: all ICompilationUnits and all packages.
In this example, when I apply IPackageFragment.getChildren() on org.stack, I want org.stack.test and the ICompilationUnit Stack.java...
How can I do this?
IPackageFragment is not the correct starting point. You have to ask a higher level for the packages:
IPackageFragment: A single package. It contains ICompilationUnits or IClassFiles, depending on whether the IPackageFragmentRoot is of type source or of type binary. Note that IPackageFragments are not organized as parent and children. E.g. net.sf.a is not the parent of net.sf.a.b; they are two independent children of the same IPackageFragmentRoot.
Have a look at this article about the AST
Here's some code that should be close to what you needed. (Since we're a bit past 2011, I doubt it will help you much, but maybe it will help somebody else.) Doubtless it can stand some improvement.
Since it doesn't seem possible to directly recurse downward from the IPackageFragment (as mentioned by Kai), the basic idea is to get the higher-level IPackageFragmentRoot and filter its children based on the original fragment's path.
IPackageFragment originFragment; // = org.stack's fragment
try {
String fragmentPath = originFragment.getPath().toString();
IJavaElement parent = originFragment.getParent();
ArrayList<IJavaElement> allChildren =
new ArrayList<IJavaElement>();
if (parent instanceof IPackageFragmentRoot) {
IPackageFragmentRoot root = (IPackageFragmentRoot)parent;
IJavaElement[] rootChildren = root.getChildren();
// originsFragments includes the origin and all package
// fragments beneath it
List<IJavaElement> originsFragments =
Arrays.asList(rootChildren).stream()
.filter(c -> c.getPath().toString().startsWith(fragmentPath))
.collect(Collectors.toList());
allChildren.addAll(originsFragments);
// Gather the children of the package fragments
for (IJavaElement o : originsFragments) {
if (o instanceof IPackageFragment ) {
IPackageFragment oFragment = (IPackageFragment)o;
IJavaElement[] fChildren = oFragment.getChildren();
allChildren.addAll(Arrays.asList(fChildren));
}
}
}
} catch (JavaModelException e) {
e.printStackTrace();
}
An alternative inelegant solution would be to start with the original fragment's path and then use Java's file and directory facilities to descend through the directory hierarchy. Then, you can use IJavaProject's findPackageFragment(IPath path) to connect to the proper IPackageFragments.
You need to do it recursively.
Here is some pseudocode:
findAllClasses(package, classesCollection) {
  for (Class c : package.getClasses())
    classesCollection.add(c.getResourcePath())
  if (package.hasChildPackages())
    for (Package p : package.getChildPackages())
      findAllClasses(p, classesCollection)
}