Generating a single ScalaDoc for multiple sub-projects using Gradle

I'm using Gradle (v2.3) for a project that contains multiple Scala sub-projects. Generating the ScalaDoc for each sub-project individually works as expected (running gradle :project-a:scaladoc or gradle :project-b:scaladoc).
But how do I get a single ScalaDoc covering all Scala sub-projects? The minimal project below leads to a "Cannot invoke method withInputStream() on null object" error when executing
gradle scaladoc --info:
Starting Build
Compiling settings file '/tmp/gradle-scaladoc-test/settings.gradle' using StatementExtractingScriptTransformer.
Compiling settings file '/tmp/gradle-scaladoc-test/settings.gradle' using BuildScriptTransformer.
Settings evaluated using settings file '/tmp/gradle-scaladoc-test/settings.gradle'.
Projects loaded. Root project using build file '/tmp/gradle-scaladoc-test/build.gradle'.
Included projects: [root project 'some project', project ':project-a', project ':project-b']
Evaluating root project 'some project' using build file '/tmp/gradle-scaladoc-test/build.gradle'.
Compiling build file '/tmp/gradle-scaladoc-test/build.gradle' using StatementExtractingScriptTransformer.
Compiling build file '/tmp/gradle-scaladoc-test/build.gradle' using BuildScriptTransformer.
Evaluating project ':project-a' using empty build file.
Evaluating project ':project-b' using empty build file.
All projects evaluated.
Selected primary task 'scaladoc' from project :
Tasks to be executed: [task ':scaladoc', task ':project-a:compileJava', task ':project-a:compileScala', task ':project-a:processResources', task ':project-a:classes', task ':project-a:scaladoc', task ':project-b:compileJava', task ':project-b:compileScala', task ':project-b:processResources', task ':project-b:classes', task ':project-b:scaladoc']
:scaladoc (Thread[main,5,main]) started.
:scaladoc
Executing task ':scaladoc' (up-to-date check took 0.448 secs) due to:
No history is available.
:scaladoc FAILED
:scaladoc (Thread[main,5,main]) completed. Took 0.768 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':scaladoc'.
> Cannot invoke method withInputStream() on null object
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output.
BUILD FAILED
Total time: 5.198 secs
Stopped 0 compiler daemon(s).
Directory structure:
├─ project-a/
│  └─ src/
│     └─ main/
│        └─ scala/
│           └─ package_a/
│              └─ A.scala
├─ project-b/
│  └─ src/
│     └─ main/
│        └─ scala/
│           └─ package_b/
│              └─ B.scala
├─ build.gradle
└─ settings.gradle
build.gradle
subprojects {
    repositories {
        mavenCentral()
    }

    apply plugin: 'scala'

    dependencies {
        compile 'org.scala-lang:scala-library:2.11.6'
    }

    tasks.withType(ScalaCompile) {
        scalaCompileOptions.additionalParameters = ['-unchecked', '-deprecation', '-feature']
        scalaCompileOptions.useAnt = false
    }
}

task scaladoc(type: org.gradle.api.tasks.scala.ScalaDoc) {
    group = 'Documentation'
    description = 'Aggregated ScalaDoc documentation.'
    title = 'Title of documentation'
    destinationDir = new File(buildDir, "aggregated-api")
    source subprojects.collect { project ->
        project.sourceSets.main.allScala
    }
    scalaClasspath = files(subprojects.collect { project ->
        project.sourceSets.main.compileClasspath
    })
    classpath = files(subprojects.collect { project ->
        project.sourceSets.main.compileClasspath
    })
}
settings.gradle
rootProject.name = 'some project'
include 'project-a', 'project-b'
A.scala
package package_a
case class A(value: Int)
B.scala
package package_b
case class B(value: Int)
The only similar problem I could find is "Gradle Fails to Compile Basic Scala Project", but it doesn't help here (and is about Gradle 1.3).

I know it's been a while since this has been asked, but since I just had the exact same problem...
I solved it the following way:
task aggregatedScaladocs(type: ScalaDoc, description: 'Generate scaladocs from all child projects as if it were a single project', group: 'Documentation') {
    destinationDir = file("$buildDir/docs/scaladoc")
    title = "$project.name $version API"
    subprojects.each { proj ->
        proj.tasks.withType(ScalaDoc).each {
            // Pull each subproject's sources and classpath into the aggregated task.
            source += proj.sourceSets.main.allJava
            source += proj.sourceSets.main.allScala
            classpath += proj.sourceSets.main.compileClasspath
            // Inherit the include/exclude patterns of the subproject's scaladoc task.
            excludes += it.excludes
            includes += it.includes
        }
    }
}
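Running gradle aggregatedScaladocs from the root project should then write the combined documentation to build/docs/scaladoc.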
Hope it'll help somebody at some point.

For Gradle 6.x I needed to use a slightly modified version of the answer by @juwi:
task aggregatedScaladocs(type: ScalaDoc, description: 'Generate scaladocs from all child projects as if it were a single project', group: 'Documentation') {
    destinationDir = file("$buildDir/docs/scaladoc")
    title = "$project.name $version API"
    classpath = project.files([])
    scalaClasspath = project.files([])
    subprojects.each { proj ->
        proj.tasks.withType(ScalaDoc).each {
            source proj.sourceSets.main.allJava
            source proj.sourceSets.main.allScala
            classpath += proj.scaladoc.classpath
            scalaClasspath += proj.scaladoc.scalaClasspath
            exclude proj.scaladoc.excludes
            include proj.scaladoc.includes
        }
    }
}
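The empty project.files([]) assignments matter here: the root project never applies the Scala plugin, so the task's classpath and scalaClasspath start out unset, and the += calls would otherwise have nothing to append to.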

Related

Building a package with generated Python files

Problem statement
When building a Python package I want the build tool to automatically execute the steps to generate the necessary Python files and include them in the package.
Here are some details about the project:
the project repository contains only the hand-written Python and YAML files
to have a fully functional package the YAML files must be compiled into Python scripts
once the Python files are generated from YAMLs, the program needed to compile them is no longer necessary (build dependency).
the hand-written and generated Python files are then packaged together.
The package would then be uploaded to PyPI.
I want to achieve the following:
When the user installs the package from PyPI, all necessary files required for the package to function are included and it is not necessary to perform any compile steps
When the user checks-out the repository and builds the package with python -m build . --wheel, the YAML files are automatically compiled into Python and included in the package. Compiler is required.
When the user checks-out the repository and installs the package from source, the YAML files are automatically compiled into Python and installed. Compiler is required.
(nice to have) When the user checks-out the repository and installs in editable mode, the YAML files are compiled into Python. The user is free to make modifications to both generated and hand-written Python files. Compiler is required.
I have a repository with the following layout:
├── <project>
│   └── <project>
│       ├── __init__.py
│       ├── hand_written.py
│       └── specs
│           └── file.ksc (YAML file)
└── pyproject.toml
And the functional package should look something like this
├── <project>
│   └── <project>
│       ├── __init__.py
│       ├── hand_written.py
│       └── generated
│           └── file.py
├── pyproject.toml
└── <other package metadata>
How can I achieve those goals?
What I have so far
As I am very new to Python packaging, I have been struggling to understand the relations between pyproject.toml, setup.cfg and setup.py, and how I can use them to achieve the goals I have outlined above. So far I have a pyproject.toml with the following content:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "<package>"
version = "xyz"
description = "<description>"
authors = [ <authors> ]
dependencies = [
    "kaitaistruct",
]
From reading the setuptools documentation, I understand that there are build commands, such as:
build_py -- simply copies Python files into the package (no compiling; works differently in editable mode)
build_ext -- builds C/C++ modules (not relevant here?)
I suppose adding the compile steps for the YAML files will involve writing a setup.py file and overriding a command, but I don't know if this is the right approach, whether it will even work, or if there are better methods, such as using a different build backend. A rough sketch of what I mean is shown below.
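For illustration only, a minimal sketch of the command-override idea, assuming setuptools as the backend; the yaml-compiler invocation and the directory names are placeholders for the real compiler and layout:

# setup.py -- hypothetical sketch; paths and compiler command are placeholders
import subprocess
from pathlib import Path

from setuptools import setup
from setuptools.command.build_py import build_py


class BuildWithGenerated(build_py):
    """Compile the YAML specs into Python before the normal build_py copy step."""

    def run(self):
        out_dir = Path("project/generated")
        out_dir.mkdir(parents=True, exist_ok=True)
        for spec in Path("project/specs").glob("*.ksc"):
            # Placeholder: replace with the actual compiler invocation.
            subprocess.check_call(["yaml-compiler", str(spec), "-o", str(out_dir)])
        super().run()


setup(cmdclass={"build_py": BuildWithGenerated})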
Alternative approaches
A possible alternative approach would be to manually compile the YAML files prior to starting the installation or build of the package.

What is the simplest way to create an importable file in Scala?

TLDR: What is the simplest way to create an importable file in Scala?
This is a beginner's question.
I've been learning Scala for a few weeks and now have some code that I would like to share between a couple of files/different projects. I have tried a lot of import structures but none of them worked. The requirements are:
File to be imported should reside in a totally different directory.
File to be imported should be importable by independent projects.
File to be imported is a single .scala file.
(Optional) file to be imported should contain defs, objects and case classes.
Example:
File to be imported location: /some/path/to_be_imported.scala.
File using project (1) location: /abc/def/will_import01.scala.
File using project (2) location: /xyz/rst/will_import02.scala.
I'm not trying to create a package or distribute it.
For comparison, since I'm versed in Python, here is what the expected answer would look like if this question were about Python:
In that case you could:
Put your file in the same directory as your executed file, then just run python3 ./your_file.py. For instance:
➜ another_path|$ python3 ./main_module/main_file.py
1
self printing
➜ another_path|$ tree .
=======================================================================
.
└── main_module
    ├── main_file.py
    ├── __pycache__
    │   └── sample_file_to_be_imported.cpython-36.pyc
    └── sample_file_to_be_imported.py
Notice that they are in the exact same directory (this contradicts point 2 above; nevertheless it solves the problem).
Add the directory of your file to the PYTHONPATH environment variable then run your module (best answer):
➜ random_path|$ PYTHONPATH=$PYTHONPATH:./sample_module python3 ./main_module/main_file.py
1
self printing
=======================================================================
➜ random_path|$ tree .
.
├── main_module
│   └── main_file.py
└── sample_module
    ├── __pycache__
    │   └── sample_file_to_be_imported.cpython-36.pyc
    └── sample_file_to_be_imported.py
3 directories, 3 files
Content of the files:
➜ random_path|$ cat ./main_module/main_file.py
from sample_file_to_be_imported import func1, Class01
print(func1())
x = Class01()
x.cprint()
=======================================================================
➜ random_path|$ cat ./sample_module/sample_file_to_be_imported.py
def func1():
    return 1

class Class01():
    def cprint(self):
        print('self printing')
Edit 01: @felipe-rubin's answer does not work:
$ scala -cp /tmp/scala_stack_exchange/ myprogram.scala
/tmp/scala_stack_exchange/path01/myprogram.scala:3: error: not found: value Timer
val x = Timer(1)
^
one error found
=======================================================================
➜ path01 tree /tmp/scala_stack_exchange
/tmp/scala_stack_exchange
├── anotherpath
│   ├── Timer.class
│   └── timer.scala
└── path01
    └── myprogram.scala
2 directories, 3 files
=======================================================================
$ cat /tmp/scala_stack_exchange/anotherpath/timer.scala
class Timer(a: Int) {
  def time(): Unit = println("time this")
}
=======================================================================
$ cat /tmp/scala_stack_exchange/path01/myprogram.scala
import anotherpath.Timer
val x = Timer(1)
x.time()
The simplest way would be to compile a .scala file with scalac:
Linux/OSX: scalac mypackage/Example.scala
Windows: scalac mypackage\Example.scala
The above should generate a .class file (or more).
Assuming the file contains a class called Example you can import it somewhere else like this:
import mypackage.Example
When compiling another file which does the above import, you will need to have 'mypackage' in the classpath. You can add directories to the classpath when calling scalac by using the -cp flag like:
Linux/OSX: scalac -cp .:path/to/folder/where/mypackage/is/located AnotherExample.scala
Windows: scalac -cp .;path\to\folder\where\mypackage\is\located AnotherExample.scala
Doing this for bigger projects gets complicated, in which case you might resort to a build tool (e.g. sbt) or an IDE (e.g. IntelliJ IDEA) to do the complicated work for you.
Other notes:
If you don't have scalac, you can get it from the Scala website ('download binaries' option).
The -cp flag stands for "classpath"; there is also a -classpath flag which does the same thing.
Welcome to Scala :)
I finally got this working. Thanks for the valuable input from the other answers.
I have deliberately varied the names of every path, file and object to keep the example as general as possible. This probably does not follow the naming guidelines of the Scala community, but it is the most explicit, illustrative example I could come up with. Project layout:
File Layout
$ tree /tmp/scala_stack_exchange
/tmp/scala_stack_exchange
├── anotherpath
│   ├── file_defines_class.scala
│   └── some_package_name
│       ├── MyObj.class
│       └── MyObj$.class
└── path01
    └── myprogram.scala

3 directories, 4 files
I want to run myprogram.scala, which should import the classes defined in file_defines_class.scala.
Preparation
Compile the file you want to be imported by other modules:
cd /tmp/scala_stack_exchange/anotherpath && scalac ./file_defines_class.scala
Execution
cd /tmp/scala_stack_exchange/path01 && scala -cp /tmp/scala_stack_exchange/anotherpath/ ./myprogram.scala
Results
myobj time
Contents of the files
// /tmp/scala_stack_exchange/path01/myprogram.scala
import some_package_name.MyObj

val x = new MyObj(10)
x.time()

// /tmp/scala_stack_exchange/anotherpath/file_defines_class.scala
package some_package_name

object MyObj

case class MyObj(i: Int) {
  def time(): Unit = println("myobj time")
}
Feels like magic. However this whole process is rather cumbersome :/

native package refuses to place libraries in the sysroot folder

I have a package (openssl) that must be built for the host and the target. It creates some .so and .a libraries that some other packages need at runtime and at compile time, respectively.
When I compile this package for the target everything works fine and every file ends up in the place I tell it to go, but when I compile for the host (${PN}-native target) it just doesn't put the libraries in the host sysroot directory (./build/tmp/sysroot/x86_64-linux).
This is the recipe:
SUMMARY = "Secure Socket Layer"
SECTION = "libs/network"
LICENSE = "openssl"
LIC_FILES_CHKSUM = "file://LICENSE;md5=4004583eb8fb7f89"
branch = "yocto"
SRC_URI = "git://www.myserver.com/openssl.git;protocol=ssh;branch=${branch}"
SRCREV = "${AUTOREV}"
S = "${WORKDIR}/git"
BBCLASSEXTEND += "native nativesdk"
# This is because I am porting this package from another project and I can't modify it.
FILES_${PN} += "${libdir}/libssl.so ${base_libdir}/libcrypto.so"
FILES_SOLIBSDEV = ""
do_compile() {
    ${MAKE}
}

do_install() {
    DESTDIR=${D} ${MAKE} install
}
Could anyone let me know what I am doing wrong? Thanks in advance.
First, why are you writing your own recipe for openssl instead of using the one in oe-core?
Anyway, the problem is that at no point do you tell the recipe what prefix to use. In native builds the prefix is what relocates the package correctly into the native sysroot.
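A minimal sketch of that idea, assuming the Makefile honours the conventional prefix/libdir variables (which, as the asker notes below, theirs does not):

do_install() {
    # Pass the staged paths through to make so a native build is
    # relocated into the native sysroot rather than hardcoded locations.
    oe_runmake install DESTDIR=${D} prefix=${prefix} libdir=${libdir} includedir=${includedir}
}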
Ok, I know what the problem is:
It seems like for native recipes, you have to install it using the full path to your host sysroot inside the image folder. This means that when compiling for the target, the image folder looks like this:
$ tree -d
/openssl/1.0.0-r0/image
├── lib
└── usr
├── include
│   └── openssl
└── lib
but for the host, it looks like this in my case:
$ tree -d
openssl-native/1.0.0-r0/image
└── home
    └── xnor
        └── yocto
            └── build
                └── tmp
                    └── sysroots
                        └── x86_64-linux
                            ├── lib
                            └── usr
                                ├── include
                                │   └── openssl
                                └── lib
EDIT
The right solution is to modify the Makefile to take ${prefix}, ${bindir}, ${libdir}, etc. from the environment instead of hardcoding those paths in the Makefile. In my case this is not possible because of the project requirements, so I have to do this:
SUMMARY = "Secure Socket Layer"
SECTION = "libs/network"
LICENSE = "openssl"
LIC_FILES_CHKSUM = "file://LICENSE;md5=4004583eb8fb7f89"
branch = "yocto"
SRC_URI = "git://www.myserver.com/openssl.git;protocol=ssh;branch=${branch}"
SRCREV = "${AUTOREV}"
S = "${WORKDIR}/git"
BBCLASSEXTEND += "native nativesdk"
# This is because I am porting this package from another project and I can't modify it.
FILES_${PN} += "${libdir}/libssl.so ${base_libdir}/libcrypto.so"
FILES_SOLIBSDEV = ""
do_compile() {
    ${MAKE}
}

do_install() {
    # The change is here!
    DESTDIR=${D}${base_prefix} ${MAKE} install
}
and as you can imagine, ${base_prefix} expands to "/home/xnor/yocto/build/tmp/sysroots/x86_64-linux/" for the host (openssl-native) recipe and to "" for the target (openssl).

Defining sbt task that invokes method from project code?

I'm using sbt to build a Scala project. I want to define a very simple task, so that when I input generate in sbt:
sbt> generate
It will invoke my my.App.main(..) method to generate something.
There is an App.scala file in myproject/src/main/scala/my, and the simplified code is like this:
object App {
  def main(args: Array[String]) {
    val source = readContentOfFile("mysource.txt")
    val result = convert(source)
    writeToFile(result, "mytarget.txt")
  }
  // ignore some methods here
}
I tried to add following code into myproject/build.sbt:
lazy val generate = taskKey[Unit]("Generate my file")

generate := {
  my.App.main(Array())
}
But this doesn't compile, since it can't find my.App.
Then I tried to add it to myproject/project/build.scala:
import sbt._
import my._

object HelloBuild extends Build {
  lazy val generate = taskKey[Unit]("Generate my file")

  generate := {
    App.main(Array())
  }
}
But it still doesn't compile; it can't find package my.
How to define such a task in SBT?
In .sbt format, do:
lazy val generate = taskKey[Unit]("Generate my file")
fullRunTask(generate, Compile, "my.App")
This is documented at http://www.scala-sbt.org/0.13.2/docs/faq.html, “How can I create a custom run task, in addition to run?”
Another approach would be:
lazy val generate = taskKey[Unit]("Generate my file")
generate := (runMain in Compile).toTask(" my.App").value
which works fine in simple cases but isn't as customizable.
Update: Jacek's advice to use resourceGenerators or sourceGenerators instead is good, if it fits your use case — can't tell from your description whether it does.
The other answers fit the question very well, but I think the OP might benefit from mine, too :)
The OP asked for "a very simple task" such that typing generate in sbt invokes my.App.main(..) to generate something, which might ultimately complicate the build.
sbt already offers a way to generate files at build time - sourceGenerators and resourceGenerators - and from reading the question I can't see a need to define a separate task for this.
In Generating files (see the future version of the document in the commit) you can read:
sbt provides standard hooks for adding source or resource generation
tasks.
With that knowledge, one could think of the following solution:
sourceGenerators in Compile += Def.task {
  my.App.main(Array()) // it's not going to work without one change, though
  Seq[File]()          // a workaround before the above change is in effect
}.taskValue
To make that work, you should return a Seq[File] that contains the files generated (not the empty Seq[File]()).
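A hypothetical corrected variant, assuming my.App gains a generate() method returning the files it wrote (see the refactoring proposed under "Other changes aka goodies" below):

sourceGenerators in Compile += Def.task {
  my.App.generate() // returns Seq[File], as sourceGenerators expects
}.taskValue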
The main change for the code to work is to move the my.App class to the project folder. It then becomes part of the build definition. This also reflects what the class does, as it's really part of the build, not of the artifact that's the product of it. When the same code is part of both the build and the artifact itself, you don't keep the different concerns separate. If the my.App class participates in the build, it should belong to it - hence the move to the project folder.
The project's layout would then be as follows:
$ tree
.
├── build.sbt
└── project
    ├── App.scala
    └── build.properties
Separation of concerns (aka @joescii in da haus)
There's a point in @joescii's answer (which I extend in this answer) - "to make it a separate project that other projects can use. To do this, you will need to put your App object into a separate project and include it as a dependency in project/project", i.e.
Let's assume you've got a separate project build-utils with App.scala under src/main/scala. It's a regular sbt configuration with just the Scala code.
jacek:~/sandbox/so/generate-project-code
$ tree build-utils/
build-utils/
└── src
    └── main
        └── scala
            └── App.scala
You can test it out as a regular Scala application without messing with sbt. No additional setup is required (and it frees your mind from sbt, which might be beneficial at times - less setup is always of help).
In another project - project-code - that uses App.scala as the base for its build, build.sbt is as follows:
project-code/build.sbt
lazy val generate = taskKey[Unit]("Generate my file")

generate := {
  my.App.main(Array())
}
Now the most important part - the wiring between projects so the App code is visible for the build of project-code:
project-code/project/build.sbt
lazy val buildUtils = RootProject(
  uri("file:/Users/jacek/sandbox/so/generate-project-code/build-utils")
)

lazy val plugins = project in file(".") dependsOn buildUtils
With the build definition(s), executing generate gives you the following:
jacek:~/sandbox/so/generate-project-code/project-code
$ sbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/sandbox/so/generate-project-code/project-code/project
[info] Updating {file:/Users/jacek/sandbox/so/generate-project-code/build-utils/}build-utils...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Updating {file:/Users/jacek/sandbox/so/generate-project-code/project-code/project/}plugins...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/jacek/sandbox/so/generate-project-code/build-utils/target/scala-2.10/classes...
[info] Set current project to project-code (in build file:/Users/jacek/sandbox/so/generate-project-code/project-code/)
> generate
Hello from App.main
[success] Total time: 0 s, completed May 2, 2014 2:54:29 PM
I've changed the code of App to be:
> eval "cat ../build-utils/src/main/scala/App.scala"!
package my
object App {
def main(args: Array[String]) {
println("Hello from App.main")
}
}
The project structure is as follows:
jacek:~/sandbox/so/generate-project-code/project-code
$ tree
.
├── build.sbt
└── project
    ├── build.properties
    └── build.sbt
Other changes aka goodies
I'd also propose some other changes to the code of the source generator:
Move the code out of the main method into a separate method that returns the files generated, and have main call it. This makes reusing the code in sourceGenerators easier (no unnecessary Array() argument, and the generated files are returned explicitly); see the sketch after this list.
Use filter or map functions for convert (to add a more functional flavour).
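A sketch of that refactoring; the convert body here is a stand-in, since the question elides it:

package my

import java.io.File
import java.nio.file.{Files, Paths}
import scala.io.Source

object App {
  // Hypothetical stand-in for the real conversion logic.
  def convert(source: String): String = source.toUpperCase

  // Returns the generated files so sourceGenerators can use the result directly.
  def generate(): Seq[File] = {
    val source = Source.fromFile("mysource.txt").mkString
    Files.write(Paths.get("mytarget.txt"), convert(source).getBytes("UTF-8"))
    Seq(new File("mytarget.txt"))
  }

  def main(args: Array[String]): Unit = generate()
}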
The solution that @SethTisue proposes will work. Another approach is to make it a separate project that other projects can use. To do this, you will need to put your App object into a separate project and include it as a dependency in project/project, or package it as an sbt plugin, ideally with this task definition included.
For an example of how to create a lib that is packaged as a plugin, take a look at snmp4s. The gen directory contains the code that does some code generation (analogous to your App code) and the sbt directory contains the sbt plugin wrapper for gen.

How to compile with all .jars in .m2?

I'm trying to debug the mvn compile of a file with many dependencies by compiling it directly with javac.
This is how I'm trying to do this:
CLASSPATH=`find ~/.m2 -name *.jar -print0`; javac -verbose BogusFile.java
My problem is that I'm not sure how to tell find to separate each jar found with the Unix path separator (:).
Maybe -printf holds the solution?
Sorry, I cannot answer your question directly, but here is a possible alternative approach.
If you need to build a classpath for your Maven project, you can run the copy-dependencies goal of the Maven Dependency Plugin on your project:
mvn dependency:copy-dependencies
Maven will copy all dependencies of your project (including transitive ones) to the target/dependency directory, and the classpath can then be set to target/dependency/*; (you still have to include your own artifact jar).
Example:
Code:
import org.apache.commons.lang.WordUtils;
import org.apache.log4j.Logger;

public class Bogus {
    private static final Logger LOG = Logger.getLogger(Bogus.class);

    public static void main(final String[] args) {
        LOG.debug(WordUtils.capitalize("hello world"));
    }
}
Directory:
C:.
│
├───src
│   └───main
│       └───java
│               Bogus.java
│
└───target
    └───dependency
            commons-lang-2.6.jar
            log4j-1.2.16.jar
Compile Command:
.....\bogus>javac -cp target\dependency\*; src\main\java\Bogus.java
Result:
C:.
│
├───src
│   └───main
│       └───java
│               Bogus.class
│               Bogus.java
│
└───target
    └───dependency
            commons-lang-2.6.jar
            log4j-1.2.16.jar
Join all the jars with the : separator to use as the classpath for the compiler (note the quoted glob, so the shell doesn't expand *.jar itself):
export TEST_CLASSPATH=$(find ~/.m2 -name '*.jar' | sed -z 's/\n/:/g')
javac -classpath $TEST_CLASSPATH:./ BogusFile.java
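An equivalent one-liner using tr instead of GNU sed, assuming none of the jar paths contain newlines:

javac -classpath "$(find ~/.m2 -name '*.jar' | tr '\n' ':')." BogusFile.java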