I have a multi-module Scala project in Gradle. Everything works great with it, except the ScalaDoc. I would like to generate a single 'uber-scaladoc' with all of the libraries cross-linked. I'm still very new to Groovy/Gradle, so this is probably a 'me' problem. Any assistance getting this set up would be greatly appreciated.
build.gradle (in the root directory)
// ...
task doScaladoc(type: ScalaDoc) {
    subprojects.each { p ->
        // do something here? include the project's src/main/scala/*?
        // It looks like I would want to call 'include' in here to include each project's
        // source directory, but I'm not familiar enough with the Project type to get at
        // that info.
        //
        // http://www.gradle.org/docs/current/dsl/org.gradle.api.tasks.scala.ScalaDoc.html
    }
}
The goal is to be able to just run 'gradle doScaladoc' at the command line and have the aggregate documentation show up.
Thanks in advance.
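For reference, here is one shape such an aggregate task could take, assuming every subproject applies the 'scala' plugin; the task, directory, and property names are illustrative, not a verified solution:
// Sketch only: pull every subproject's Scala sources and compile classpath
// into a single ScalaDoc run so the generated pages are cross-linked.
evaluationDependsOnChildren()

task doScaladoc(type: ScalaDoc) {
    destinationDir = file("${buildDir}/docs/uber-scaladoc")
    // each subproject's src/main/scala
    source subprojects.collect { it.sourceSets.main.scala }
    // each subproject's compile classpath, so cross-module references resolve
    classpath = files(subprojects.collect { it.sourceSets.main.compileClasspath })
}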
This may be a silly question but I really don't know where to look.
I'm creating a browser-testing environment for a pretty large-scale API written in TypeScript. The API uses esbuild to build the TypeScript files into a /dist/ folder with a single index.js entry point and its corresponding .d.ts file.
I've created a /tests/ folder to hold some browser files, including an index.html file with Mocha and Chai imported. It also imports /dist/index.js, which is assigned globally to a window.myAPI variable.
In /tests/index.html:
import * as myAPI from "./dist/index.js"
Alongside index.html in the tests folder, there are separate JS files included for different tests that run things on window.myAPI... to do assertion tests.
search.test.js
book.test.js
navigate.test.js
I then run a server to host at the root. These separate tests are then imported from /tests/index.html. The separate tests look like this inside:
const { chai, mocha } = window;
const { assert } = chai;

describe("Search", function() {
  describe("Setup", function() {
    it("Setting URL should work", function() {
      const call = myAPI.someCall()
      assert.ok(call);
    });
  });
});

mocha.run();
Everything works, but I have no code hinting for myAPI. I'd like to be able to see what functions are available when I type myAPI, what parameters they take, and what they should return, along with all my comments on each function.
In TypeScript you can do things like ambient declarations, but I don't want to make my tests TypeScript, because that would add an unnecessary build step to the tests. There it would be as easy as:
/// <reference path = "/dist/index.d.ts" />
How can I tell VSCode that window.myAPI is an import of /dist/index.js and should pull in the types as well, so I can see them?
I'm open to different solutions to this, but I feel like this should be pretty simple. I don't know if ESLint is capable of doing something like this, but I tagged it because I feel it's relevant.
Thanks!
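One build-free approach (not mentioned in the original post) is a JSDoc type import at the top of each test file; a sketch, assuming the path from the test file to the bundle is ../dist/index:
// Sketch only: the relative path is an assumption, adjust it to where the test file lives.
// A JSDoc type import lets VS Code read /dist/index.d.ts without adding a build step.
/** @type {typeof import("../dist/index")} */
const myAPI = window.myAPI;

// Hovering myAPI.someCall() now shows its signature, parameters, and doc comments.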
I am writing a plug-in that will generate unit tests for a Java class that is selected in Eclipse's Project Explorer. This plug-in uses a third-party program called Randoop to generate the tests, so I make this happen using ProcessBuilder:
ProcessBuilder builder = new ProcessBuilder(command);
where the command that is passed to the ProcessBuilder is a list of Strings, something like
["java", "-classpath", "path1;path2;etc", "randoop.main.Main", ...]
Within the plug-in I am trying to generate the classpath for Randoop based on the classpath that Eclipse knows about. Here is some of what I have so far:
IClasspathEntry[] resolvedClasspath = javaProject.getResolvedClasspath(true);
for (IClasspathEntry entry : resolvedClasspath) {
    if (entry.getEntryKind() == IClasspathEntry.CPE_SOURCE) {
        IPath outputLocation = entry.getOutputLocation();
        if (outputLocation != null) {
            buf.append(outputLocation.toString());
        } else {
            buf.append(entry.getPath().toString());
        }
    } else {
        buf.append(entry.getPath().toString());
    }
    buf.append(CLASSPATH_SEP);
}
It isn't quite right. It seems to specify the library JAR files okay, but it doesn't do so well at identifying the paths to the class files corresponding to CPE_SOURCE entries. For example, I see a classpath entry of /myPkgFragRoot/src/main/java instead of myPkgFragRoot/target/classes.
I seem to have a muddled picture of how Eclipse treats classpaths, so I'm looking for some help. Firstly, I'm wondering if my high-level approach is wrong. It seems like I am writing a large amount of code to generate an incorrect classpath. Is there some simpler way of getting a classpath from an IJavaProject than getting the results of getResolvedClasspath and iterating through them and manipulating the individual entries? Secondly, if there isn't a simpler way, how should I be locating the class files produced by building the project?
If the outputLocation is null, you have to use the default output location javaProject.getOutputLocation() instead of entry.getPath().
See Javadoc of IClasspathEntry.getOutputLocation():
Returns:
the full path [...], or null if using default output folder
If the check box Allow output folders for source folders is not checked on the Source tab of Project > Properties > Java Build Path, IClasspathEntry::getOutputLocation() will always return null.
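A sketch of the loop from the question with that fallback (buf and CLASSPATH_SEP as in the question; note these are still workspace-relative paths, so for ProcessBuilder they would additionally need to be resolved to file-system locations):
// Sketch: same loop as in the question, but source entries fall back to the
// project's default output folder when no per-entry output location is set.
IClasspathEntry[] resolvedClasspath = javaProject.getResolvedClasspath(true);
for (IClasspathEntry entry : resolvedClasspath) {
    IPath path;
    if (entry.getEntryKind() == IClasspathEntry.CPE_SOURCE) {
        IPath outputLocation = entry.getOutputLocation();
        // null means the entry uses the project's default output folder
        path = (outputLocation != null) ? outputLocation : javaProject.getOutputLocation();
    } else {
        path = entry.getPath();
    }
    buf.append(path.toString());
    buf.append(CLASSPATH_SEP);
}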
TL;DR: Is there a way to specify the order in which Babel plugins are run? How does Babel determine this order? Is there any spec for how this works, apart from diving into the Babel sources?
I'm developing my own Babel plugin. I noticed that when I run it, my plugin runs before other es2015 plugins. For example, given code such as:
const a = () => 1
and a visitor such as:
visitor: {
  ArrowFunctionExpression(path) {
    console.log('ArrowFunction')
  },
  FunctionExpression(path) {
    console.log('Function')
  },
}
my plugin observes ArrowFunction (and not Function). I played with the order in which the plugins are listed in the Babel configuration, but that didn't change anything:
plugins: ['path_to_myplugin', 'transform-es2015-arrow-functions'],
plugins: ['transform-es2015-arrow-functions', 'path_to_myplugin'],
OTOH, this looks like the order DOES somehow matter:
https://phabricator.babeljs.io/T6719
---- EDIT ----
I found out that if I write my visitor as follows:
ArrowFunctionExpression: {
  enter(path) {
    console.log('ArrowFunction')
  }
},
FunctionExpression: {
  exit(path) {
    console.log('Function')
  }
},
both functions are called. So it looks like the order of execution is: myplugin_enter -> other_plugin -> myplugin_exit. In other words, myplugin seems to sit before other_plugin in some internal pipeline. The main question, however, stays the same: the order of plugins in the pipeline must be determined, and hopefully configurable, somehow.
The order of plugins is based on the order of things in your .babelrc with plugins running before presets, and each group running later plugins/presets before earlier ones.
The key thing, though, is that the ordering is per AST node. Each plugin does not do a full traversal; Babel does a single traversal, running all plugins in parallel, with each node processed one at a time, running each handler for each plugin.
Basically, what @loganfsmyth wrote is correct; there is (probably) no more magic in plugin ordering itself.
As for my problem specifically, my confusion was caused by how the arrow-function transformation works. Even though the babel-plugin-transform-es2015-arrow-functions plugin rewrites the code before my plugin runs, it does not remove the original arrow-function node from the AST, so even the later plugin sees it.
Lesson learned: when dealing with Babel, don't underestimate the number of debug print statements needed to understand what's happening.
Given the following generator folder structure, I'm attempting to deep-copy all of the folders under the 'for_copy' folder.
generator root
  app
    templates
      for_copy
        data
        external
        media
All the folders are empty. I would like to have this structure created for me when I invoke the generator.
I have tried using fs.copy, bulkCopy, and bulkDirectory. None of them are doing the job.
Any clues as to how I might achieve this would be greatly appreciated.
See the code snippet below:
writing: function() {
  this.log('Writing templates...');

  // doesn't work
  this.fs.copy(
    this.templatePath('for_copy'),
    this.destinationRoot()
  );

  // doesn't work
  this.bulkCopy(
    this.templatePath('for_copy'),
    this.destinationRoot()
  );

  // doesn't work
  this.bulkDirectory(
    this.templatePath('for_copy'),
    this.destinationRoot()
  );

  // doesn't work
  this.bulkDirectory(
    this.templatePath('for_copy') + '**/*',
    this.destinationRoot()
  );
}
Yeoman only cares about files. When you use bulkDirectory, you actually copy files over, not directories.
You can use the mkdirp module to create the directories:
// at the top of the generator file:
// var mkdirp = require('mkdirp');
// var path = require('path');

mkdirp.sync(path.join(this.destinationPath(), 'data'));
mkdirp.sync(path.join(this.destinationPath(), 'external'));
// etc ...
I'd like to generate HTML output, as described in http://etorreborre.github.io/specs2/guide/org.specs2.guide.Runners.html, via the JUnit runner.
The documentation states that "you can use the -Dspecs2.commandline property and pass it the html or console values."
How do I do that in a Gradle task?
Yes, I know that Gradle can produce HTML test reports, but I need the ones specs2 can generate.
One problem still remains: I don't get a navigation menu on the left. Each page has a separate menu and its own results.
Just use systemProperty without the -D:
test {
    systemProperty "specs2.commandline", "html"
    systemProperty "specs2.outDir", "$buildDir/reports/specs2" // put the reports in the same dir as the other Gradle reports
}