Can GWT analyze dependencies like Maven does?

How can I get GWT to provide the same dependency insights as mvn dependency:analyze?
Maven can report on dependencies ("Used undeclared dependencies" and "Unused declared dependencies"). I'd like GWT to do the same, because determining missing inherits in my gwt.xml proves difficult.
Is there a good way for the system to analyze dependency state?
Thanks
Peter

I'm not aware of any such tool, and while I think a utility to analyze and report on GWT dependencies could be interesting, I also think it would be difficult to define well.
Used, undeclared dependencies
Before trying to solve this, what is the problem? In Maven, this category means that a class is loaded from a dependency that isn't directly depended on, but is instead pulled in transitively. This gets into the whole issue of transitive dependencies (which exist in GWT) and scopes (which don't). If A uses a class in C, but only depends on B, which depends on C, this will be listed in the 'used, undeclared dependencies' list.
In GWT, however, we very rarely list every single dependency we use directly. Instead, we assume that transitive dependencies will stay transitive - we don't bother to inherit com.google.gwt.user.RemoteService for RPC as long as we already have com.google.gwt.user.User listed.
So how could we tell if we are using an undeclared dependency, the kind that produces a warning when we run gwt:compile? Perhaps such a tool could find every .gwt.xml file on the classpath and read through its <source> and <super-source> rules to look for somewhere that a class we are using is declared. Or, in the case of invoking GWT.create on something and getting back a non-concrete type, it could look for <replace-with> and <generate-with> rules. As long as your code already compiles in Java, the classes are on the classpath, but you still run the risk that while the class is there, the .java or .gwt.xml files might not be.
Unused, declared dependencies
This seems like an easier problem - analyze the modules we are inheriting, and look through them for any module that could be pruned out. Unfortunately, as the discussion above notes, we can't just look at the classes, which packages they are in, and which <source> and <super-source> elements go unused - we would also need to look at <replace-with> and <generate-with> rules. Consider something like com.google.gwt.user.RemoteService, which only adds a rule and some configuration details, or even com.google.gwt.user.RemoteServiceObfuscateTypeNames, which modifies only a single setting of the RemoteService module. If RemoteServiceObfuscateTypeNames were removed, everything would still compile, but now there might be information about your RPC classes compiled into your app that you don't expect to be there.
With these in mind, perhaps such a tool could watch all possible rebind rules in the current build, along with all configuration settings, properties, etc., and see whether any of them went unused during a gwt:compile. It could then indicate which modules had unused parts, and if a module (and all of the modules it inherits) went entirely unused, it could be shown to the user as safe to remove.
One more important piece: order matters when defining <inherits> statements. If I add an inherits for com.google.gwt.logging.Logging, then follow it with com.google.gwt.logging.LoggingDisabled, the logging classes will be on the source path and will compile, but will have no effect. But if those are ordered the other way around, they will not only be on the source path, they will all be functional. So any analysis of used and unused modules would also need to take transitive inherits statements, and their order, into account.

How to get rid of this useless GWT compiler warning?

There is an option in GWT to obfuscate enum names:
<set-configuration-property name="compiler.enum.obfuscate.names" value="true" />
When I use this option, my compiles produce a warning:
[WARN] Call to Enum method name when enum obfuscation is enabled:
com/google/gwt/dom/client/DataTransfer.java:127
Looking at the DataTransfer source code, I can see that it has an enum DropEffect and the enum names are used in one of the methods (setDropEffect).
I'm using modern JsInterop-oriented GWT, so I'm not using this DataTransfer class (or anything in gwt-user.jar). It's annoying to have a useless warning. Is there an easy way to get rid of it?
I know how to exclude files in my own source folders.
Is it possible to exclude source folders from gwt-user? (Other than by physically deleting files from the jar!)
That looks like a bug - and it's unfortunate, too, that the compiler doesn't remove this class. This should be easy to fix with a patch to GWT, adding a field to DropEffect that holds the name, since that name must never be removed even when this compiler flag is enabled.
Is it possible to exclude source folders from gwt-user? (Other than by physically deleting files from the jar!)
What you must do is exclude the .gwt.xml file itself, by not inheriting it at all, even transitively. So, for example, if you avoid all JSNI, including the Widget types in the User module, your .gwt.xml shouldn't reference the Dom module, or User, or anything else that references Dom. But once a module has been inherited, there is no way to "uninherit" it; once a module is inherited, its sources are added and cannot be un-added.

Yocto: how to remove/blacklist some dependency from RDEPENDS of a package?

I have a custom machine layer based on https://github.com/jumpnow/meta-wandboard.
I've upgraded the kernel to 4.8.6 and want to add X11 to the image.
I'm modifying the image recipe (console-image.bb).
Since wandboard is based on i.MX6, I want to include the xf86-video-imxfb-vivante package from meta-fsl-arm.
However, it fails complaining about inability to build kernel-module-imx-gpu-viv. I believe that happens because xf86-video-imxfb-vivante DEPENDS on imx-gpu-viv which in turn RDEPENDS on kernel-module-imx-gpu-viv.
I realize that those dependencies were created for the meta-fsl-arm BSP and the vanilla Poky distribution. But those things are way outdated for the wandboard, hence I am using the custom machine layer with a modern kernel.
The kernel is configured to include the Vivante DRM module and I really don't want the kernel-module-imx-gpu-viv package to be built.
Is there a way to exclude it from RDEPENDS? Can I somehow swear my health to the build system that I will take care of this specific run-time dependency myself?
I have tried blacklisting kernel-module-imx-gpu-viv by setting PNBLACKLIST[kernel-module-imx-gpu-viv] in my local.conf, but that's only part of a solution. It helps avoid the build failure, but the packaging process still fails.
IIUC, your problem comes from these lines in the imx-gpu-viv recipe:
FILES_libgal-mx6 = "${libdir}/libGAL${SOLIBS} ${libdir}/libGAL_egl${SOLIBS}"
FILES_libgal-mx6-dev = "${libdir}/libGAL${SOLIBSDEV} ${includedir}/HAL"
RDEPENDS_libgal-mx6 += "kernel-module-imx-gpu-viv"
INSANE_SKIP_libgal-mx6 += "build-deps"
I would actually qualify this RDEPENDS as a bug: kernel module dependencies are usually specified as RRECOMMENDS, because most modules can be compiled into the kernel, producing no separate package at all while still providing the functionality. But that's another issue.
There are several ways to fix this problem, the first general route is to tweak RDEPENDS for the package. It's just a bitbake variable, so you can either assign it some other value or remove some portion of it. In the first case it's going to look somewhat like this:
RDEPENDS_libgal-mx6 = ""
In the second one:
RDEPENDS_libgal-mx6_remove = "kernel-module-imx-gpu-viv"
Obviously, these two options have different implications for your present and future work. In general I would opt for the softer one, the second, because it has less potential for breakage when you update the meta-fsl-arm layer, which may change the imx-gpu-viv recipe in any number of ways. But when you're overriding a more complex recipe with big lists in its variables and you're modifying it heavily (not just removing a thing or two), it might be easier to maintain with a full hard assignment of the variables.
Now there is also the question of where to do this variable mangling. The main option is a .bbappend in your layer; that's what appends are made for. But you can also do it from your distro configuration (if you're maintaining your own distro it might be easier to keep all these tweaks in one place, rather than scattered across numerous appends) or from your local.conf (which is a nice place to quickly try it out, but probably not something to use in the longer term). I usually use a .bbappend.
But there is also a completely different approach to this problem: rather than fixing the package's dependencies, you can fix what some other package provides. If, for example, your kernel is configured with the imx-gpu-viv module built right into the main zImage, you can do
RPROVIDES_kernel-image += "kernel-module-imx-gpu-viv"
(also in .bbappend, distro configuration or local.conf) and that's it.
In any case, your approach to fixing this problem should reflect the difference between your setup and the recipe's assumptions. If you do have the module, but in a different package, then go for RPROVIDES; if you have some other module providing the same functionality to the libgal-mx6 package, then fix the libgal-mx6 dependencies (and it's better to really fix them - not only drop what you don't need, but also add what is relevant for your setup).

sbt-assembly: prefix extracted files from some jars

In JOGL, there are lots of native jars for different OS/arch combinations. JOGL has several of its own mechanisms to load the right ones if you aren't using java.library.path, and it supports a kind of "fat jar" layout.
In a fat jar layout, any native libraries need to be in a subdirectory ./natives/os.and.arch/. However, since the native jars themselves don't have any internal layout, similarly named so/dylib/dll files collide in the flat hierarchy of the final jar.
From what I can tell, I don't think I want to de-duplicate with any of the given MergeStrategy because it's only invoked if there is a collision. The layout is mandatory per JOGL's native library loaders - I want to invoke it every time. Is there a mechanism that can allow me to map certain jar -> prefix/with/path in sbt-assembly?
Example
jogl-all-2.1.3-natives-android-armv6.jar is pulled in through a dependency.
$ jar -tf jogl-all-2.1.3-natives-linux-amd64.jar
META-INF/MANIFEST.MF
libjogl_mobile.so
libnewt.so
I'd like this to go here in the final jar:
./natives/
./natives/linux.and.amd64/
./natives/linux.and.amd64/libnewt.so
./natives/linux.and.amd64/libjogl_mobile.so
From what I can tell, I don't think I want to de-duplicate with any of the given MergeStrategy because it's only invoked if there is a collision. The layout is mandatory per JOGL's native library loaders - I want to invoke it every time.
All merge strategies are invoked every time. MergeStrategy.deduplicate, which is the default strategy for most files, just happens to take effect only if there's a collision.
MergeStrategy.rename, applied for README and license files by default for example, will rename the file every time by appending the jar name.
Is there a mechanism that can allow me to map certain jar -> prefix/with/path in sbt-assembly?
There's no strategy out of the box that does exactly that, but you can define a custom strategy similar to MergeStrategy.rename.
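For illustration, here is a rough sketch of what such a strategy could look like. It assumes the 0.x sbt-assembly API (an abstract sbtassembly.MergeStrategy with a name and an apply(tempDir, path, files) method, wired in via assemblyMergeStrategy in assembly) and that sbtassembly.AssemblyUtils.sourceOfFileForMerge is available in your version to recover which jar a candidate file came from; the jar-name regex and the natives/os.and.arch layout simply follow the question, so adjust them to whatever your JOGL loader actually expects (see the next answer regarding the exact directory names).
// build.sbt sketch - adjust to the API of your sbt-assembly version
lazy val relocateNatives: sbtassembly.MergeStrategy = new sbtassembly.MergeStrategy {
  val name = "relocateNatives"

  // e.g. "jogl-all-2.1.3-natives-linux-amd64.jar" -> os = "linux", arch = "amd64" (illustrative)
  private val NativesJar = """.*-natives-([^-]+)-([^.]+)\.jar""".r

  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    Right(files map { f =>
      // Recover the jar (or directory) this candidate file was extracted from.
      // NB: check the exact return type of sourceOfFileForMerge in your sbt-assembly version.
      val (source, _, _, _) = sbtassembly.AssemblyUtils.sourceOfFileForMerge(tempDir, f)
      source.getName match {
        case NativesJar(os, arch) => f -> s"natives/${os}.and.${arch}/$path" // prefix per originating jar
        case _                    => f -> path                               // not a natives jar: keep as-is
      }
    })
}

// Route shared libraries through the custom strategy; keep the defaults for everything else.
assemblyMergeStrategy in assembly := {
  case path if path.endsWith(".so") || path.endsWith(".dll") || path.endsWith(".dylib") =>
    relocateNatives
  case other =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(other)
}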
Just follow this rule as Xerxes explained here. There is then no longer any risk of collision. The official JogAmp forum is a better place to ask questions about all JogAmp APIs. If you don't follow my advice, GlueGen will be unable to extract and load the correct native libraries. In your case, natives/linux-amd64 is correct whereas natives/linux.and.amd64 isn't.

Strange behaviour when importing types in Scala 2.10

Today I cleared my .ivy cache and cleaned my project output targets. Since then I have been getting really strange behaviour when running tests with SBT or editing in the Scala IDE.
Given the following:
package com.abc.rest
import com.abc.utility.IdTLabel
I will get the following error:
object utility is not a member of package com.abc.rest.com.abc
Notice that com.abc appears twice, so it seems the compiler uses the context of the current package when resolving the import (maybe it's supposed to do this, but I never noticed it before).
Also, if I try to access classes in package com.abc from anywhere inside com.abc.rest (even using the full path) the compiler will complain that the type can not be found.
It appears that the errors only occur when I try to include files from parent packages. What I do find strange is that my code used to work. It only started happening after I cleaned up my project and my ivy cache, so maybe a later version of the compiler is more strict than the previous one.
I would love some ideas on what I can be doing wrong, or how I can go about troubleshooting this.
Update:
If I first import the parent classes and then define the current package, the problem goes away:
import com.abc.utility.IdTLabel
import com.abc._
package com.abc.rest {
// Define classes belonging to com.abc.rest here
}
So this works, but I would still love to know why on earth the other way around worked, and then stopped working, and how on earth I can fix it. I had a good look, and could find no packages, objects or traits by the name of com anywhere inside the parent package.
Update relating to Worksheets:
Scala worksheets belonging to the same package share the same scope, which sounds obvious, but wasn't. Worksheets are not sand-boxed - they can see the project, and the project can see them. So all the 'test' object, traits, and classes you create inside the worksheet files, also becomes visible in the rest of the project.
I have so many worksheets that I did not even try to see where the problem came in. I simply moved them all to their own package, and like magic, the problem went away.
So, lesson learned for the day: If you create stuff inside worksheets, it's visible from outside the worksheet.
Anyway, this new-found knowledge will come in handy, meaning anything 'interesting' can be built, monitored and tweaked inside the worksheet, while the rest of the project can actually use it. Quite cool, actually.
It's still interesting to think how an sbt clean and a cleaned-up ivy cache managed to reveal a problem that was hidden before, but hey, that's another story...
(At the request of JacobusR, I'm making a proper answer out of my earlier comments).
This can happen if you have defined some class/trait/object inside a package com.abc.rest.com. As soon as package com.abc.rest.com exists, and given that you are in package com.abc.rest, com designates com.abc.rest.com rather than _root_.com. The fastest (but non-conclusive) way to check, without even scanning the source files, is to look for any .class files in a "com/abc/rest/com" sub-folder.
In particular, you would get this behaviour if any of your files has a duplicate package definition (as in package com.abc.rest; package com.abc.rest; ...). If the duplicate package clause is somewhere in the same file where you get the error, you wouldn't even see anything fishy in the .class files, because the failure to compile the file would prevent the generation of .class files for any class defined in it.
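To make that duplicate-clause case concrete, here is a purely illustrative single-file sketch (package and class names are taken from the question; IdTLabel is assumed to live in a real com.abc.utility package elsewhere in the project, and SomeResource is just a hypothetical class):
// A stray duplicate package clause nests everything below inside
// com.abc.rest.com.abc.rest, and creates the package com.abc.rest.com as a side effect.
package com.abc.rest
package com.abc.rest

// From here on the simple name `com` resolves to com.abc.rest.com, so this import fails with
// "object utility is not a member of package com.abc.rest.com.abc":
import com.abc.utility.IdTLabel

// Anchoring the import at the root package side-steps the shadowing:
// import _root_.com.abc.utility.IdTLabel

class SomeResource // hypothetical class defined in this file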
The final bit of useful information is that, as you found out, Scala worksheets are not sandboxed: what you define in a worksheet affects your project's code (rather than only the project's code affecting the worksheet). So a duplicate package clause in a worksheet could very well cause the error you got.
If package names conflict, there may be a specific error message for that. See if specifying the full path, starting from _root_, resolves the issue, e.g. import _root_.com.foo.bar._

Error with Groovy AST transformations when cleaning project in Eclipse

I'm trying to work through Groovy's Implementing Local AST Transformations tutorial, but whenever I clean my project I get this error in each file that has the @WithLogging annotation in it:
Groovy:Could not find class for Transformation Processor AC.LoggingASTTransformation declared by AC.WithLogging
So you have a package named "AC" that contains both "WithLogging.groovy" and "LoggingASTTransformation.groovy" classes? Does it also contain any classes that implement the "WithLogging" interface?
If so, I'd suggest you move the class(es) that use your annotation to a location outside of the annotation defining package (the default will suffice, for diagnostic purposes) - Order of compilation matters with transformations. See this post on the groovy users mailing list for more on that.
Also try changing the annotation from @WithLogging to @AC.WithLogging.
As far as cleaning with Eclipse is concerned, I had a similar issue and found that after a clean I had to make a trivial modification to any file that contained my annotation - i.e., add a space somewhere - and then save the file. This should rebuild everything properly.