I am having trouble understanding AspectJ's compile-time and load-time weaving, and figuring out what to use (and how to use ajc) to compile and build my project.
Here's my project structure:
TestProject: a Java service library. This is being used by a few other projects. This project does not contain any aspects.
TestProject-Aspects: contains just aspects which advise a few classes in TestProject. I am not using the AspectJ 5 annotation style, and all my join points are currently just at method execution.
My questions:
ajc vs. iajc: how are they different?
Is there any need for weaving?
Will something like this work?
Compile TestProject-Aspects:
<iajc sourceroots="${sources.dir}"
      destdir="${classes.dir}"
      classpath="${standard.compile.classpath}"/>
Compile TestProject:
<iajc sourceroots="${sources.dir}"
      destdir="${classes.dir}"
      classpath="${standard.compile.classpath}"
      inpath="${[TestProject-Aspects]pkg.classpath}"/>
Don't I have to use javac at all? I was initially using it to compile TestProject.
ajc and iajc are extensions of the JDT compiler that ships with Eclipse, so ajc and iajc will produce exactly the same byte code for pure Java as Eclipse would (which differs in some minor ways from the output of Oracle's javac).
ajc and iajc are basically the same, except that iajc is incremental (that's the i in iajc): the compiler checks time stamps and, where possible, does a smarter incremental build instead of a full one (just like when using AJDT inside Eclipse). See here for more information:
http://www.eclipse.org/aspectj/doc/released/devguide/antTasks-iajc.html
If a project contains no aspects, using the ajc compiler is optional; such projects can be put on the inpath of a project that does contain aspects. To compile projects that contain code-style aspects, you need to use ajc.
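For example, a code-style aspect like the following (all names here are illustrative, not taken from your projects) is not valid Java syntax, so only ajc can compile it:
public aspect TraceAspect {
    // advise the execution of every public method in the service package
    before() : execution(public * com.example.service.*.*(..)) {
        System.out.println("Entering " + thisJoinPoint.getSignature());
    }
}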
Annotation-style aspects are a little different. If you are using annotation style for LTW only, then you can use javac to compile them, as long as the correct aop.xml is created and the weaver is available at runtime.
However, annotation style with compile-time weaving (CTW) does require ajc.
In your particular case above, you can compile TestProject using javac as long as it is on the inpath of your aspect project. This means the class files of your TestProject would be rewritten and combined with the class files from your aspect project.
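A sketch of that binary-weaving variant (all paths are illustrative): compile the pure Java project with javac first, then weave its class files by putting them on the inpath while compiling the aspects:
<javac srcdir="${home.dir}/TestProject/src"
       destdir="${home.dir}/TestProject/bin"/>
<iajc sourceroots="${home.dir}/TestProject-Aspects/src"
      inpath="${home.dir}/TestProject/bin"
      classpath="${home.dir}/tools/aspectj/lib/aspectjrt.jar"
      destdir="${home.dir}/woven-bin"/>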
Or, if you are using LTW, then you don't need to add your TestProject to any inpath and you can use javac. But, you must set up your application for LTW at runtime.
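A minimal LTW setup sketch (aspect and package names are again illustrative): place an aop.xml in META-INF on the classpath and start the JVM with the AspectJ weaving agent.
<!-- META-INF/aop.xml -->
<aspectj>
    <aspects>
        <aspect name="com.example.aspects.TraceAspect"/>
    </aspects>
    <weaver options="-verbose">
        <include within="com.example..*"/>
    </weaver>
</aspectj>
Then run the application with java -javaagent:/path/to/aspectjweaver.jar (the weaver jar path as appropriate for your installation).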
EDIT
To answer your comment below:
Yes. You can compile your aspects project first using ajc or the iajc task. Then you can compile your second, pure Java project, also using the iajc task, and additionally put the results of your first project on the aspect path. You cannot use javac for this at all.
The ant build.xml snippet will look something like this:
<project name="simple-example" default="compile">
    <taskdef
        resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties">
        <classpath>
            <pathelement location="${home.dir}/tools/aspectj/lib/aspectjtools.jar"/>
        </classpath>
    </taskdef>
    <target name="compile">
        <iajc sourceroots="${home.dir}/TestProject-Aspects/src"
              classpath="${home.dir}/tools/aspectj/lib/aspectjrt.jar"
              destDir="${home.dir}/TestProject-Aspects/bin"/>
        <iajc sourceroots="${home.dir}/TestProject/src"
              classpath="${home.dir}/tools/aspectj/lib/aspectjrt.jar"
              destDir="${home.dir}/TestProject/bin"
              aspectPath="${home.dir}/TestProject-Aspects/bin"/>
    </target>
</project>
See here for more details on iajc:
http://www.eclipse.org/aspectj/doc/released/devguide/antTasks-iajc.html
Related
I'm struggling to make Maven compile my classes and put them into the jar. Searching the internet, I found that Maven expects a standard structure (like src/main/java), but that this can be overridden by adding
<packaging>jar</packaging>
<build>
    <sourceDirectory>correct_path_to_source_files</sourceDirectory>
</build>
to pom.xml. However, this doesn't help; Maven keeps ignoring the source files without any error: running compile says [INFO] Nothing to compile - all classes are up to date. No classes appear in target/classes or in the jar. Other than the packaging and build elements above, the pom file contains only dependencies.
I'm not an expert in Java/Scala, so I have no idea what Maven expects from me. A manual build in IDEA solves the problem, but shouldn't this be handled by the build system?
How do I make Maven compile the source files?
I've managed to get JUnit 4.12 + Hamcrest 1.3 + Mockito 2.8.47 to work in Eclipse so that when I add them as dependencies, my tests will run.
(The way I've done this is to use the p2-maven-plugin to bundle the following artifacts from Maven Central into plug-ins and a feature and provide them via p2:
junit 4.12
org.mockito.mockito-core 2.8.47
org.hamcrest.all 1.3.0
Adding these plug-ins to my test fragment as dependencies makes the tests run in Eclipse.)
However, the Tycho build of the same fragment will fail with the
following messages:
java.lang.LinkageError: loader constraint violation: loader (instance of org/eclipse/osgi/internal/loader/EquinoxClassLoader) previously initiated loading for a different type with name "org/hamcrest/Matcher"
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.defineClass(ModuleClassLoader.java:273)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.defineClass(ClasspathManager.java:632)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.findClassImpl(ClasspathManager.java:586)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.findLocalClassImpl(ClasspathManager.java:538)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.findLocalClass(ClasspathManager.java:525)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.findLocalClass(ModuleClassLoader.java:325)
at org.eclipse.osgi.internal.loader.BundleLoader.findLocalClass(BundleLoader.java:345)
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:423)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:372)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:364)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:161)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:12)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
So it seems that some other plug-in is loading the class org.hamcrest.Matcher before my fragment does. This is probably due to the import/export/partial-import/partial-export chaos surrounding the JUnit/Hamcrest/Mockito setup.
Does anyone have an idea -- or even better: a working example -- of how to get the three components to work together both within the IDE (for quick checks on whether tests run) and in Tycho (for checks during the build)?
It seems that the loader wants the dependencies in a bundle, but I guess you haven't put your test libraries in one. You could try adding them to the dependencies of your product to see how it reacts.
Background
The root of the problem is that org.junit already has a dependency on org.hamcrest.core. So when your test plug-in has a dependency on org.hamcrest.all (which contains everything from hamcrest-core plus all the other hamcrest artifacts), all classes specified in hamcrest-core exist twice: once in the org.hamcrest.core bundle and once in hamcrest-all. That is why you get the linkage error.
If you open the manifest of org.junit in the Manifest-Editor of Eclipse and go to the 'Dependencies' tab, it should show org.hamcrest.core in the "Required Plug-ins" section, and org.hamcrest.core should be re-exported. In the raw manifest it looks like this:
Require-Bundle: org.hamcrest.core;bundle-version="1.3.0";visibility:=reexport
Solution 1 - add hamcrest sub-modules
Instead of adding the all-in-one hamcrest module org.hamcrest.all as a dependency to my Eclipse test bundle/project (via 'Require-Bundle'), I add only the hamcrest sub-modules that I require, except for hamcrest-core (because it is already re-exported by org.junit in my case). For me, hamcrest-library was sufficient.
The available hamcrest sub-modules are (according to the org.hamcrest:hamcrest-parent pom, which can be found here: https://repo1.maven.org/maven2/org/hamcrest/hamcrest-parent/1.3/hamcrest-parent-1.3.pom):
hamcrest-core
hamcrest-library
hamcrest-generator
hamcrest-integration
Creating the p2-Repo containing the required bundles
When using Maven and the org.reficio:p2-maven-plugin to build the p2 repo that contains the mentioned test bundles, the conversion of the Maven artifacts to OSGi bundles does not produce fully working results by default.
Converting a Maven module to a full OSGi bundle consists mainly of writing proper entries into its MANIFEST.MF. For this, the p2-maven-plugin uses the bnd tool.
By default, the Java packages provided by all Maven dependencies of a module are added as optional Import-Package entries when that module is converted into an OSGi bundle.
In my case, this had the consequence that the org.hamcrest.library bundle referred to the packages from hamcrest-core only via Import-Package in its MANIFEST.MF.
Unfortunately, with only this specified, the Equinox class loader did not find the classes from hamcrest-core in the test runtime and threw a corresponding exception. Maybe this is also caused by the fact that both hamcrest-core and hamcrest-library have a package org.hamcrest, and bnd adds the exported packages of a bundle to its imported packages again.
The solution in my case was to instruct the org.reficio:p2-maven-plugin, or rather bnd, to add org.hamcrest.core as a "Require-Bundle" entry to the manifest of hamcrest-library. For this, the instructions element shown below needs to be added to the artifact element of org.hamcrest:hamcrest-library in the execution configuration of the p2-maven-plugin in the pom.xml used to build the p2 repo:
<artifact>
    <id>org.hamcrest:hamcrest-library:1.3</id>
    <instructions>
        <Require-Bundle>org.hamcrest.core</Require-Bundle>
    </instructions>
</artifact>
If hamcrest sub-modules other than hamcrest-library are used, the instructions need to be analogous, corresponding to the dependencies listed in their poms.
Edit
Eclipse Orbit provides org.hamcrest.library, org.hamcrest.integration, and org.hamcrest.generator bundles that have org.hamcrest.core as a required bundle (where necessary):
https://download.eclipse.org/tools/orbit/downloads/
Appendix
In the end, the first solution caused a SecurityException:
java.lang.SecurityException: class "org.hamcrest.Matchers"'s signer information does not match signer information of other classes in the same package
This is a known issue. The following two solutions avoid it and work properly both during Tycho builds and from within Eclipse.
Solution 2 - bundle hamcrest sub-module jars with a plug-in
Another approach is to download the jar of the required hamcrest sub-module and bundle it with an Eclipse plug-in directly, as described here:
https://www.vogella.com/tutorials/Hamcrest/article.html#hamcrest_eclipse
To bundle the jar with a plug-in, include it in the project and add it to the plug-in's classpath: go to the Runtime tab of the Manifest-Editor, click Add... in the Classpath section, and select the jar. This should properly add the jar to the .classpath, MANIFEST.MF, and build.properties files.
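Assuming the jar was placed in a lib folder inside the plug-in project (the path is illustrative), the resulting MANIFEST.MF entry would look roughly like this:
Bundle-ClassPath: lib/hamcrest-library-1.3.jar,
 .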
Make sure the jar is included before the other plug-in dependencies (which include hamcrest-core), as stated in the mentioned tutorial.
If hamcrest is to be used in multiple test projects/fragments, add the jar to a test plug-in that all the other test projects depend on.
Solution 3 - use org.hamcrest 2.x
Since hamcrest 2, there is only one org.hamcrest jar/artifact, which includes everything from hamcrest. Using hamcrest 2 avoids all of these issues and is my preferred solution. Except for the changed packaging of hamcrest, the API did not break, so it should be sufficient to just include org.hamcrest:
https://github.com/hamcrest/JavaHamcrest/releases/tag/v2.1
In order to create a p2 repo that includes org.hamcrest 2.2, the following snippet has to be included in the configuration/artifacts element of the p2-maven-plugin execution in the pom.xml:
<artifact>
    <id>org.hamcrest:hamcrest-core:2.2</id>
    <instructions>
        <Require-Bundle>org.hamcrest;bundle-version="2.2.0";visibility:=reexport</Require-Bundle>
    </instructions>
</artifact>
<artifact>
    <id>org.hamcrest:hamcrest:2.2</id>
</artifact>
The IUs org.hamcrest.core 2.2 and org.hamcrest have to be included in the target platform to make them available to plug-ins in Eclipse and during the Tycho build. All plug-ins that depend on org.junit now also have org.hamcrest available.
This approach works because org.hamcrest.core still exists in the version 2 stream, even though it is deprecated and empty; its only purpose is to redirect build systems to the new org.hamcrest 2.x jar/artifact. To that end, org.hamcrest.core 2.2 specifies a compile dependency on org.hamcrest 2.2 in its pom.xml. Unfortunately, the p2-maven-plugin doesn't translate this directly into a bundle requirement on org.hamcrest in the manifest, but the snippet above enforces it.
Because org.junit requires the bundle org.hamcrest.core with a minimum version of 1.3 (but no upper bound), it resolves to the present org.hamcrest.core 2.2, which in turn requires org.hamcrest 2.2 and re-exports it. This makes org.junit use org.hamcrest 2.2 in the end, and because org.junit re-exports hamcrest-core, it also immediately provides org.hamcrest 2.2 to all dependent plug-ins.
Note
If you want to play around with different variants of a jar, don't forget to delete (on disk) the bundle pools of Maven (in <your-home>/.m2/repository/p2/osgi/bundle/) and Eclipse PDE (in <your-workspace>/.metadata/.plugins/org.eclipse.pde.core/.bundle_pool/) in between. Otherwise you will always get the first variant, because jars with the same version are not updated.
javah is used to generate C headers from native methods. It operates on compiled class files, so it requires a classpath as an argument. For this reason, it seems sensible to make a javah task depend on fullClasspath in Compile.
The issue I am facing is that the generated headers are needed in order to build a native library, and the native library needs to be a resource. But, because it is a resource, it will be included in fullClasspath in Compile, which leads to a circular dependency.
Does SBT have a classpath key that includes all .class files but excludes resources?
I just discovered sbt-jni, a very interesting new SBT plugin that simplifies working with JNI from SBT. While reading some of its source code, I stumbled over this line, which seems to address the problem you are facing. If I understand it correctly, the workaround in sbt-jni is to combine dependencyClasspath in Compile, compile in Compile, and classDirectory in Compile instead of using fullClasspath in Compile. I'm not sure whether this results in exactly a classpath that includes all .class files, but maybe something like that works for you as well.
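A minimal build.sbt sketch of that workaround (the javahClasspath key is an illustrative name, not part of sbt or sbt-jni):
// custom task: a classpath for javah that leaves out our own resources
val javahClasspath = taskKey[Seq[File]]("classpath for javah, without resources")

javahClasspath := {
  // compile first so the .class files actually exist
  val analysis = (compile in Compile).value
  // dependency jars plus our class output directory, but no resource directories
  (dependencyClasspath in Compile).value.map(_.data) :+
    (classDirectory in Compile).value
}
The resulting Seq[File] can then be joined with File.pathSeparator and handed to javah's -classpath option.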
Suppose I have a Scala compile-time macro that I find useful and would like to share it (I do). How do I create a JAR file that when loaded into another project would execute the macro when compiling the new project?
Specifically, I've made a StaticAnnotation that rewrites the AST of the class it wraps at compile time. This works in my Maven build (the macro is defined in the main directory and runs on test cases in the test directory) because I have
<compilerPlugins>
    <compilerPlugin>
        <groupId>org.scalamacros</groupId>
        <artifactId>paradise_2.10.5</artifactId>
        <version>2.1.0-M5</version>
    </compilerPlugin>
</compilerPlugins>
in my scala-maven-plugin. (I'm starting with a Scala 2.10 project and if it works, will provide both 2.10 and 2.11.)
But if I put the resulting JAR on a Scala console classpath, in a Scala script, or into another Maven project (without special compiler plugins), it simply ignores the macro: the AST does not get rewritten and my compile-time println statements don't execute. If I use the @compileTimeOnly annotation on my macro (new in Scala 2.11), then it complains with the @compileTimeOnly error message.
Do I really need to tell my users to add compiler plugins in their pom.xml files, as well as alternate instructions for SBT and other build tools? Other packages containing macros (MacWire, Log4s) don't come with complicated build instructions: they just say, "point to this dependency in Maven Central." I couldn't find the magic in their build process that makes this work. What am I missing?
If you're relying on a macro-paradise-only feature, then yes, you do need to tell your users to add compiler plugins. See http://docs.scala-lang.org/overviews/macros/annotations.html . The projects you mention are only using the Scala compiler's built-in (non-paradise) macro features, not macro annotations.
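For SBT users, the instruction amounts to a single line in their build definition (the version shown is illustrative; it should match the user's Scala release):
// build.sbt of a project that uses the macro annotation
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
Maven users need a compilerPlugin block like the one shown above in their own scala-maven-plugin configuration.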
I can't launch a Scala jar; when I launch it, I get the error "Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/immutable/List", which seems to mean the Scala library is not loaded.
[screenshot showing details of the artifact window omitted]
here is the manifest:
Manifest-Version: 1.0
Class-Path: libs/scala-library-2.10.0.jar libs/commons-logging-1.1.1.j
ar libs/jcip-annotations-1.0.jar libs/jwnl-1.4_rc3.jar libs/laf-plugi
n-7.2.1.jar libs/laf-widget-7.2.1.jar libs/miglayout-core-4.2.jar lib
s/miglayout-swing-4.2.jar libs/scala-actors.jar libs/scala-library.ja
r libs/scala-swing.jar libs/slf4j-api-1.6.4.jar libs/slick_2.10-1.0.0
.jar libs/sqlite-jdbc-3.7.2.jar libs/substance-7.2.1.jar libs/trident
-7.2.1-swing.jar
Main-Class: Fenetre
and when I run "jar xf myJar.jar", the extracted files in the directory are:
- .class files
- a libs folder containing the libraries, INCLUDING scala-library.jar and scala-library-2.10.0.jar (I specified only one of these two files in the manifest to avoid conflicts)
Can you help me?
I'm new to Scala and don't know what the problem is; however, I've been building "fat jars" which include all the required libs.
I've been using https://github.com/sbt/sbt-assembly to do this successfully.
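For reference, a minimal setup looks like this (the plugin version is illustrative; check the sbt-assembly README for the current one):
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
Then running sbt assembly produces a single jar that already contains the Scala library, so no Class-Path manifest juggling is needed.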
Despite what your manifest is telling you, when you run the application either the scala-library jar is not actually on your classpath, or there is some confusion when you attempt to import List. Scala should automatically make the immutable collection classes available in your project through the root Predef implementation.
Predef provides type aliases for types which are commonly used, such as the immutable collection types scala.collection.immutable.Map, scala.collection.immutable.Set, and the scala.collection.immutable.List constructors (scala.collection.immutable.:: and scala.collection.immutable.Nil). The types Pair (a scala.Tuple2) and Triple (a scala.Tuple3), with simple constructors, are also provided.
Predef in core Scaladocs
Try printing the classpath from within your app to confirm.
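A minimal sketch of such a check; if even this fails with the same NoClassDefFoundError, the Scala library is definitely missing from the runtime classpath:
// print the effective classpath and try to resolve List explicitly
object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    println(System.getProperty("java.class.path"))
    // note: jars pulled in via the manifest Class-Path entry do not show up
    // in java.class.path, so also try loading the class itself
    println(Class.forName("scala.collection.immutable.List"))
  }
}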
Although not pertinent to your question, I would recommend using SBT for dependency management now that IntelliJ IDEA 13 has full SBT integration support. Your collaborators not using IntelliJ will also be happier because SBT gives them more options for build technologies, editors, and other tooling when working on the project.