The Java compiler supports incremental builds, and so does the javac Ant task. But most other build steps don't.
A build process transforms one set of files (sources) into another set of files (targets).
I can distinguish two cases here:
The transformer cannot take a subset of the source files, only the whole set. Here we can only do a lazy build: if no source file was modified, we skip the processing entirely.
The transformer can take a subset of the source files and produce a partial result: an incremental build.
What Ant built-ins, third-party extensions, or other tools are there to implement lazy and incremental builds?
Can you provide some common buildfile examples?
I am particularly interested in making this work with the GWT compiler.
The uptodate task is Ant's generic solution to this problem. It's flexible enough to work in most situations where lazy or incremental compilation is desirable.
I had the same problem as you: I have a GWT module as part of my code, and I don't want to pay the (hefty!) cost of recompiling it when I don't need to. The solution in my case looked something like this:
<uptodate property="gwtCompile.mymodule.notRequired"
          targetfile="www/com.example.MyGwtModule/com.example.MyGwtModule.nocache.js">
    <srcfiles dir="src" includes="**"/>
</uptodate>

<target name="compile-mymodule-gwt" unless="gwtCompile.mymodule.notRequired">
    <compile-gwt-module module="com.example.MyGwtModule"/>
</target>
Regarding GWT: incremental builds aren't possible because the GWT compiler looks at all the source code at once and optimizes and inlines it. This means code that wasn't changed can still be compiled differently. For example, if you start using a method from a class that wasn't changed, that method was left out of the previous compilation but now needs to be compiled in.
I have files that are shared between several projects, and I need to keep certain lines and functions from compiling in project A while they still compile in project B.
I know that I can use the preprocessor, but it's not convenient for me. Is there any way to keep lines of code from compiling with a condition like the one below?
#if PhotosModuleSettings.type == .documents
... do not compile
#endif
What's not convenient about using the preprocessor? You can specify the preprocessor macros in the build settings of each target, or you can use .xcconfig files to specify them.
There's another simple way to do it, however: separate the lines and functions that you want to conditionally compile into their own files, whether via Swift extensions, subclassing, or standalone global functions. Then just choose which target(s) and/or project(s) those files have membership in.
Depending on how much you're willing to refactor your code to make such a file separation, the preprocessor macros may still be the better way to go, though.
You will need to make use of preprocessor macros.
Add a configuration for your project, and use that in the preprocessor macros.
You can set the value of these in the preprocessor macros section of each target's build settings, based on the build configuration.
Here is a detailed blog post covering the same concept.
It often comes up while testing and debugging a Scala project built with sbt that I need to pass some extra compiler flags for a particular file, for example -Xlog-implicits to debug implicit resolution problems. However, changing scalacOptions either in build.sbt or in the console invalidates the cache and causes the whole project / test suite to be recompiled. Besides the annoyance of waiting so long, this also means that a lot of noise from irrelevant files is printed. It would be better if I could compile a specific file with some extra flags from the sbt console, but I did not find a way to do this.
Problem
The reason why changing the scalac options triggers recompilation is that Zinc, Scala's incremental compiler, cannot possibly know which compiler flags affect the semantics of incremental compilation, so it is pessimistic about it. I believe this can be improved, and some whitelisted flags could be supported, so that next time people like you don't have to ask about it.
Nevertheless, there's a solution to this problem, and it's more generic than it looks at first sight.
Solution
You can create a subproject in your sbt build which is a copy of the project you want to "log implicits" in, but with -Xlog-implicits enabled by default.
// Let's say foo is your project definition
lazy val foo = project.settings(???)
// You define the copy of your project like this
lazy val fooImplicits = foo
  .copy(id = "foo-implicits")
  .settings(
    target := baseDirectory.value / "another-target",
    scalacOptions += "-Xlog-implicits"
  )
Note the following properties of this code snippet:
We redefine the project ID because when we reuse a project definition, the ID stays the same as the original one (foo in this case), and sbt fails when two projects share the same ID.
We redefine the target directory because we want to avoid recompilation. If we kept the same one as before, recompiling foo-implicits would delete the compilation products of the previous compilation (and vice versa). That's exactly what we want to avoid.
We add -Xlog-implicits to the scalac options, as you request in this question. In a generic solution, this hard-coded flag would go away; see the sketch below.
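If you need this for more than one module, the copy-the-project trick can be factored into a small helper. The following is only a rough sketch under the same assumptions as the snippet above (an sbt version where Project.copy(id = ...) is available); the helper name withExtraScalacOptions is made up for illustration:

// Hypothetical helper: builds a twin of any project with its own ID,
// its own target directory, and extra scalac flags appended.
def withExtraScalacOptions(p: Project, suffix: String, extra: String*): Project =
  p.copy(id = s"${p.id}-$suffix")
    .settings(
      // Separate target dir so the twin never clobbers the original's output.
      target := baseDirectory.value / s"target-$suffix",
      scalacOptions ++= extra
    )

// An implicit-logging twin of foo.
lazy val fooImplicits = withExtraScalacOptions(foo, "implicits", "-Xlog-implicits")

From the sbt shell you can then run foo-implicits/compile (or foo-implicits/test) to get the extra diagnostics without touching foo's own incremental compilation state.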
When is this useful?
This is useful not only for your use case, but also when you want to have modules of the same project on different Scala versions. This approach has two benefits:
You don't use sbt's ++ cross-building, which is known to have some memory issues.
You can add new library dependencies that only exist for a concrete Scala version.
It has more applications, but I hope this addresses your question.
In JOGL, there are lots of native jars for different OS/arch combinations. JOGL has several of its own mechanisms to load the right ones if you aren't using java.library.path, and it supports a kind of "fat jar" layout.
In the fat jar layout, native libraries need to live in a subdirectory ./natives/os.and.arch/. However, since the native jars themselves don't have any internal layout, similarly named so/dylib/dll files collide in the flat hierarchy of the final jar.
From what I can tell, I don't think I want to de-duplicate with any of the given MergeStrategy because it's only invoked if there is a collision. The layout is mandatory per JOGL's native library loaders - I want to invoke it every time. Is there a mechanism that can allow me to map certain jar -> prefix/with/path in sbt-assembly?
Example
Native jars such as jogl-all-2.1.3-natives-android-armv6.jar and jogl-all-2.1.3-natives-linux-amd64.jar are pulled in through dependencies.
$ jar -tf jogl-all-2.1.3-natives-linux-amd64.jar
META-INF/MANIFEST.MF
libjogl_mobile.so
libnewt.so
I'd like this to go here in the final jar:
./natives/
./natives/linux.and.amd64/
./natives/linux.and.amd64/libnewt.so
./natives/linux.and.amd64/libjogl_mobile.so
From what I can tell, I don't think I want to de-duplicate with any of the given MergeStrategy because it's only invoked if there is a collision. The layout is mandatory per JOGL's native library loaders - I want to invoke it every time.
All merge strategies are invoked every time. MergeStrategy.deduplicate, which is the default strategy for most files, just happens to take effect only if there's a collision.
MergeStrategy.rename, applied for README and license files by default for example, will rename the file every time by appending the jar name.
Is there a mechanism that can allow me to map certain jar -> prefix/with/path in sbt-assembly?
There's no strategy out of the box that does exactly that, but you can define a custom strategy similar to MergeStrategy.rename.
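To give an idea of the shape, here is a rough, untested sketch against the older sbt-assembly 0.x API, where a strategy maps (tempDir, path, files) to file -> path-in-jar pairs. The strategy name joglNatives and the hard-coded linux.and.amd64 segment are placeholders; a real implementation would derive the os.and.arch segment from the jar each file came from, much like MergeStrategy.rename resolves the originating jar in order to append its name.

// Sketch only: assumes sbt-assembly's 0.x MergeStrategy API and its usual
// auto-imports in build.sbt. Re-homes native libraries under the layout the
// JOGL loaders expect, with the os.and.arch segment hard-coded for brevity.
lazy val joglNatives = new sbtassembly.MergeStrategy {
  val name = "joglNatives"
  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] =
    Right(files map { f => f -> s"natives/linux.and.amd64/$path" })
}

assemblyMergeStrategy in assembly := {
  case path if path.endsWith(".so") || path.endsWith(".dll") || path.endsWith(".dylib") =>
    joglNatives
  case other =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(other)
}

As written, colliding files from different jars would still land on the same path, so the per-jar lookup is the part that actually solves the collision; a custom strategy modeled on rename should be able to get at the originating jar the same way rename does.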
Just follow this rule as Xerxes explained here. There is then no longer any risk of collision. The official JogAmp forum is a better place to ask questions about all JogAmp APIs. If you don't follow my advice, GlueGen will be unable to extract and load the correct native libraries. In your case, natives/linux-amd64 is correct whereas natives/linux.and.amd64 isn't.
How can I get GWT to provide the same dependency insights as mvn dependency:analyze?
Maven can report on dependencies ("Used undeclared dependencies" and "Unused declared dependencies"). I'd like GWT to do the same, because determining the missing <inherits> in my gwt.xml is proving difficult.
Is there a good way for the system to analyze dependency state?
Thanks
Peter
I'm not aware of any such tool, and while I think a utility to analyze and report on GWT dependencies could be interesting, I also think it would be difficult to define well.
Used, undeclared dependencies
Before trying to solve this, what is the problem? In Maven, this category means that a class is used from a dependency that isn't directly depended on, but is instead loaded transitively. This starts to get into the whole issue of transitive dependencies (which exist in GWT) and scopes (which don't). If A uses a class in C, but only depends on B, which depends on C, that will be listed in the 'used, undeclared dependencies' list.
In GWT however, we very rarely list every single dependency we use directly. Instead, we assume that transitive dependencies will stay transitive - we don't bother to inherit com.google.gwt.user.RemoteService for RPC as long as we already have com.google.gwt.user.User listed.
So how could we tell if we are using an undeclared dependency, the kind that warns when we do a gwt:compile? Perhaps such a tool could find every .gwt.xml file on the classpath and read through its <source> and <super-source> rules to look for somewhere that a class we are using is declared. Or, in the case of invoking GWT.create on something and getting back a non-concrete type, it could look for <replace-with> and <generate-with> rules. As long as your code already compiles in Java, the classes are on the classpath, but you still run the risk that while the class is there, the .java or .gwt.xml files might not be.
Unused, declared dependencies
This seems like an easier problem: analyze the modules we are inheriting and look for any module that could be pruned out. Unfortunately, as the discussion above notes, we can't just look at the classes, which packages they are in, and which <source> and <super-source> elements are unused - we would also need to look at <replace-with> and <generate-with> rules. Consider something like com.google.gwt.user.RemoteService, which only adds a rule and some configuration details, or even com.google.gwt.user.RemoteServiceObfuscateTypeNames, which modifies only a single setting of the RemoteService module. If RemoteServiceObfuscateTypeNames were removed, everything would still compile, but there might now be information about your RPC classes compiled into your app that you don't expect to be there.
With these in mind, perhaps such a tool could watch all possible rebind rules in the current build, plus all configuration settings, properties, etc., and see whether any of those rules were not used during a gwt:compile. Then it could indicate which modules had unused parts, and if any module (and all of the modules it inherits) were unused, it could be shown to the user as safe to remove.
One more important piece: order matters when defining <inherits> statements. If I add an inherits for com.google.gwt.logging.Logging and then follow it with com.google.gwt.logging.LoggingDisabled, the logging classes will be on the source path and will compile, but will have no effect. If those are ordered the other way around, they will not only be on the source path but will also be fully functional. So any analysis of used and unused modules would also need to take transitive inherits statements, and their order, into account.
GWT compiles the Java source into JavaScript and names the output files according to a hash of their contents. I'm getting a new set of files every compile, because the JavaScript contents are changing even when I don't change the source at all.
The files are different for OBF and PRETTY output, but if I set it to DETAILED, they're no longer different on every compile. In PRETTY, I can see that all or most of the differences between compiles are in the values passed for the typeId parameter. For example, a function called initValues() is called with different values for its typeId parameter.
In PRETTY mode, the differences you see are in the allocation of Java classes to typeIds, which is how GWT manages runtime type checking. You'll notice a table at the bottom of each script essentially mapping each typeId to all of its compatible superclasses. This is how GWT can still throw ClassCastException in JavaScript (though you should run into this very rarely!).
In OBF mode, the differences are due to the allocation of minified function names.
In both cases, it comes down to the order in which the compiler processes the code. Some internal symbol tables might be using a non-ordered collection to store symbols for processing. It can happen for lots of reasons.
As far as I know, GWT will compile a new version every time you compile it; this is a feature ;)
You can use ant to control it though, so that it only builds the GWT section of your application if it's actually changed:
http://wiki.shiftyjelly.com/index.php/GWT#Use_The_Power_of_Ant_to_Build_Changes_Only