How to set up an annotation processor in maven-compiler-plugin with no derived marker on generated Java files - Eclipse

Currently all generated Java files and the gen folder are marked as derived, so I'm not able to check in these files. Is there any compiler argument or configuration I can use to disable this marking in maven-compiler-plugin?
If that is not possible with maven-compiler-plugin, is there any other plugin which could generate the code without marking the files as derived?
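For reference, a minimal sketch of the maven-compiler-plugin wiring in question (the processor coordinates are hypothetical, and the alternative output directory is only an idea, since m2e typically marks content under target/ as derived; whether redirecting the generated sources elsewhere is acceptable depends on your setup):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <annotationProcessorPaths>
      <path>
        <!-- hypothetical annotation processor artifact -->
        <groupId>com.example</groupId>
        <artifactId>my-processor</artifactId>
        <version>1.0</version>
      </path>
    </annotationProcessorPaths>
    <!-- default is ${project.build.directory}/generated-sources/annotations;
         pointing it outside target/ is one way the derived marking might be avoided -->
    <generatedSourcesDirectory>${project.basedir}/src/main/generated</generatedSourcesDirectory>
  </configuration>
</plugin>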

Related

Eclipse: compile errors when processing maven-apt-plugin-generated source files

I have a project that uses JPA QueryDSL. For that we use the com.mysema.maven:maven-apt-plugin to generate the so-called Q-classes (i.e. Java code) in the Maven phase generate-sources. The generated files are placed into the directory <project_root>/target/generated-sources.
Subsequently, in the phase process-sources, we execute the org.codehaus.mojo:build-helper-maven-plugin to add that directory to the source path.
Finally, that Java code gets compiled in the compile phase.
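For illustration, a rough sketch of such a setup (plugin coordinates as named above; the goal names, processor class, and directories are assumptions to be adapted to your project and QueryDSL version):
<plugin>
  <groupId>com.mysema.maven</groupId>
  <artifactId>maven-apt-plugin</artifactId>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals><goal>process</goal></goals>
      <configuration>
        <outputDirectory>${project.build.directory}/generated-sources</outputDirectory>
        <!-- assumed QueryDSL JPA processor; use the class matching your QueryDSL version -->
        <processor>com.mysema.query.apt.jpa.JPAAnnotationProcessor</processor>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>process-sources</phase>
      <goals><goal>add-source</goal></goals>
      <configuration>
        <sources>
          <source>${project.build.directory}/generated-sources</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>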
This setup works fine when executing the build on the Maven command line.
When I try to run the build inside Eclipse, I always get compile errors in some of the generated classes stating that it cannot find one of the other generated classes they refer to (some of the generated classes refer to other generated classes).
The class flagged as missing does exist, however; it seems as if the Eclipse compiler starts compiling the generated sources right away, before the generation of all those files is complete, in this case before a class A, to which the generated class B refers, has been generated.
This is using the latest Eclipse version (2022-12), and I have defined M2E lifecycle mappings to execute these two plugins on configuration but not on incremental builds:
...
<execute>
<runOnConfiguration>true</runOnConfiguration>
<runOnIncremental>false</runOnIncremental>
</execute>
...
(I have also given it a try to run them on incremental builds but that didn't change anything.)
Any idea how to teach the Eclipse compiler to hold off compiling until ALL generated files have been fully generated? Or any other idea why the Eclipse compiler apparently doesn't "see" files that have been generated?
Hope I could make myself clear...
With Eclipse >= v2022-09 (according to @howlger; I myself only tested it with v2022-12) the issue is fixed by specifying <m2e.apt.activation>jdt_apt</m2e.apt.activation> in the pom.xml's properties section.
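For reference, a minimal sketch of that properties section:
<properties>
  <!-- let m2e activate JDT annotation processing for this project -->
  <m2e.apt.activation>jdt_apt</m2e.apt.activation>
</properties>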
Caveats:
For our multi-module project I had to select Maven --> Disable Workspace resolution for several modules that still showed errors despite that setting.
With these settings the build now succeeds and shows no more errors, albeit only after a few days of "burn-in" (cf. my misc. comments). Initially those "class not found" errors resurfaced for a few days, but then for some reason disappeared. Maybe that disappearance has to do with my removal of the old lifecycle mappings for said plugin from the pom.xml.

Generating XML Resources into Classpath using Annotation Processors

I am currently working on a Gradle 3.3 project in IntelliJ 15.0.6.
I am using the Gradle APT plugin to add annotation processors to my classpath.
It works fine when generating Java class files; however, I need to be able to generate XML resources into the equivalent of the resources directory under the build directory's generated directory.
Here is my build directory structure currently:
[Screenshot: project build directory structure]
As you can see, it does not include a resources directory, which I suspect is what may be causing this problem.
The current exception I receive from running my annotation processor via ./gradlew assemble is: java.lang.IllegalArgumentException: Resource creation not supported in location CLASS_PATH
The code I am using within my annotation processor to generate the XML file:
FileObject source = processingEnv.getFiler()
.createResource(StandardLocation.CLASS_PATH, "", "ap-test-2.html");
Note: I used an HTML extension just as a test; XML should produce the same results.
javax.tools.StandardLocation has other output locations as well:
The SOURCE_OUTPUT location worked to place the XML within the same package as the generated Java classes, within src/apt/main. This is not my desired behaviour, however; I need them to reside within the classpath.
I have not found this exception discussed anywhere else after extensive research.
Any help is appreciated. Thank you for reading this question.
StandardLocation.CLASS_PATH is only for input, not output. The only output locations are SOURCE_OUTPUT (the build/generated/source/apt/… folder), CLASS_OUTPUT (the standard Gradle build/classes/…), and NATIVE_HEADER_OUTPUT. See https://docs.oracle.com/javase/8/docs/api/javax/tools/StandardLocation.html
javac has no notion of classes vs. resources outputs, but if you run your annotation processor during your compilation then CLASS_OUTPUT should work (Gradle should then copy everything into the final directory/JAR). See https://docs.oracle.com/javase/8/docs/technotes/tools/unix/javac.html
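A minimal sketch of a processor writing to CLASS_OUTPUT instead (the processor class name and file contents are made up for illustration):
import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.lang.model.element.TypeElement;
import javax.tools.FileObject;
import javax.tools.StandardLocation;

public class ResourceWritingProcessor extends AbstractProcessor {
    private boolean written;

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        if (written) {
            return false; // the Filer only allows creating a given resource once
        }
        try {
            // CLASS_OUTPUT places the file next to the compiled classes,
            // so the build copies it into the final directory/JAR like other class-dir content
            FileObject resource = processingEnv.getFiler()
                    .createResource(StandardLocation.CLASS_OUTPUT, "", "ap-test-2.html");
            try (Writer writer = resource.openWriter()) {
                writer.write("<test/>");
            }
            written = true;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return false;
    }
}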

Don't load/scan class files from a specific jar

I'd like to know how to configure the maven-bundle-plugin (backed by bnd) to completely ignore the classes contained within an embedded jar.
Background
I'm working in a controlled environment where the platform my code runs on is defined by a single company (including all the tools). The code is Java and uses OSGi to define module dependencies.
Some of the provided modules contain what look like invalid class files; I can only assume that the system will 'correct' these class files before it tries to load them into any type of JVM. In any case, these class files work when deployed onto the target system.
I'm trying to create a build system based on Maven that can produce packages the system understands, and have hit a problem where these invalid class files are being read by bnd (via Apache Felix), which causes errors.
I'd like a way to have the jars that contain these class files on the classpath of the bundle, but where the contained .class files aren't read/processed by bnd. I could settle for simply ignoring the errors and continuing, but can't find a way to do that either without Felix aborting the entire build phase.
I just found the -failok directive; don't know why I didn't find it before. Adding <_failok>true</_failok> to the instructions allows me to continue working.
See instructions-ref
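For context, a sketch of where that goes in a maven-bundle-plugin configuration (version and other instructions omitted):
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- tell bnd to report problems but not fail the build on them -->
      <_failok>true</_failok>
    </instructions>
  </configuration>
</plugin>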

@MappedSuperclass static weaving with EclipseLink and multiple jars

My entity classes are scattered across multiple jars.
In jar A I have a base class named MyBase which is annotated with @MappedSuperclass.
In jar B there is an entity class which derives from MyBase.
The problem is that because the weaving is done in the context of the jar file (I'm using the Maven plugin), the base class (MyBase) isn't instrumented (although it should be).
If I move the derived class from jar B to A then the weaving process will handle the base as well.
Since I'm working on a large project it is critical for me to develop in a modular way.
Doesn't EclipseLink support such a modular approach?
The only way I found to work around this limitation is to add a temporary entity class to the jar where the @MappedSuperclass base class is defined and remove it after the weaving procedure.
Sad, but true ;-)
I'm not sure about the Maven plugin, but you should be able to use the static weaver on both jars: you will need to call it twice to weave both, and will need both jars on the weaver's classpath for both calls.
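As a rough sketch of what those two invocations could look like with the command-line static weaver (jar names and paths are examples; check the StaticWeave usage output for the exact options in your EclipseLink version):
java -cp eclipselink.jar:jarA.jar:jarB.jar \
     org.eclipse.persistence.tools.weaving.jpa.StaticWeave \
     -classpath jarA.jar:jarB.jar \
     jarA.jar jarA-woven.jar
java -cp eclipselink.jar:jarA.jar:jarB.jar \
     org.eclipse.persistence.tools.weaving.jpa.StaticWeave \
     -classpath jarA.jar:jarB.jar \
     jarB.jar jarB-woven.jar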
Alternatively you can specify the jar containing your superclass on the inpath, as explained here and here:
Managing multiple projects
Building AspectJ source code requires two distinct phases; compiling the source in .java and .aj files to generate .class files, and then applying the aspects to the generated .class files. This second phase, known as weaving, is the key difference between AspectJ and Java compilers. The Java compilation process is controlled by the classpath setting, which makes types available for resolution by the compiler. The same classpath setting is used by the AspectJ compilation process and it is configured in exactly the same way in Eclipse. However, this setting is not sufficient to control both the compilation and weaving steps in all situations. This is why there are two extra settings available for AspectJ projects.
First, there is the inpath setting. Anything specified here will be made available to the weaver and so any aspects that apply will be woven in. Entries can be added to a project's inpath by right-clicking on the project, selecting Properties, then going to the AspectJ InPath section. Entries can be either JAR files or directories (class folders), such as the bin directory of another project. Anything on the inpath is sent to the project's output, after potentially being woven with aspects.
The second additional setting is the aspectpath. Whereas the inpath controls the list of things that get woven, the aspectpath controls what is woven into that list. In other words, any aspects specified on the aspectpath are made available to the weaving process, just as if they were present in source form in the project. This setting is controlled from the AspectJ Aspect Path property page and can contain either JAR files or directories.
An output JAR setting is also present in the AspectJ section of each project's property page. This setting causes the compiler to output class files directly to a JAR file, instead of to the project's output folder.
Drove me crazy just like you - hope this helps. ;)

Eclipse: script compiler as part of a project

This question is not limited to lex and yacc, but how can I add a custom script compiler as part of a project? For example, I have the following files in the project:
grammar.y
grammar.l
test.script
The binary 'script_compiler' will be generated using grammar.y and grammar.l, compiled by lex, yacc and g++. I then want to use that generated script_compiler to compile test.script and generate CompiledScript.java. This file should be compiled along with the rest of the Java files in the project. This setup is possible with Xcode or make, but is it also possible with Eclipse alone? If not, how about together with a Maven plugin?
(I might set up the script compiler as a separate project, but it would be nice if they could be put in the same project so that changes to the grammar files can be applied immediately.)
Thanks in advance for your help!
You can add a custom "Builder" from the project properties dialog. This can be an Ant script (with an optional target) or any other script or executable.
There are also Maven plugins for Ant and other scripting languages.
If you just want to run an external program in Maven, this is what you want: http://mojo.codehaus.org/exec-maven-plugin/ -- you can then run Maven goals from your IDE or the command line and it should do the right thing either way.
To integrate with the normal compilation, bind the plugin to the "generate-sources" phase and add the location where the Java files are generated to the "sourceRoot" option of the exec plugin. That way the compiler will pick them up.
Ideally you generate the code into a folder "target/generated-sources/MY_SCRIPT_NAME". That is the standard location for generated sources in the Maven world and e.g. IntelliJ IDEA will pick up source files inside of that location. Note that this doesn't work if the files are directly in "target/generated-sources".
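A rough sketch of what that binding could look like (the executable path, arguments, and output folder are examples, not taken from the question):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>compile-script</id>
      <phase>generate-sources</phase>
      <goals><goal>exec</goal></goals>
      <configuration>
        <!-- hypothetical compiler binary and arguments -->
        <executable>${project.basedir}/tools/script_compiler</executable>
        <arguments>
          <argument>${project.basedir}/src/main/script/test.script</argument>
          <argument>${project.build.directory}/generated-sources/script</argument>
        </arguments>
        <!-- add the generated folder to the compile source roots -->
        <sourceRoot>${project.build.directory}/generated-sources/script</sourceRoot>
      </configuration>
    </execution>
  </executions>
</plugin>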
The other option is to write your own Maven plugin, which is actually quite easy as well. See e.g. https://github.com/peterbecker/maven-code-generator