FXML in obfuscated JavaFX JAR can't find Controller class - deployment

I am obfuscating my JavaFX application JAR with ProGuard 4.8, and the obfuscated JAR is missing some classes: all of the FXML controller classes. What is wrong with this step in build.xml, which obfuscates the application JAR together with two additional utility JARs?
<target name="Ofuscating" depends="CreatingJars">
    <taskdef resource="proguard/ant/task.properties" classpath="${WorkingFolder}/proguard.jar" />
    <proguard>
        -dontusemixedcaseclassnames
        -printmapping proguard.map
        -dontshrink
        -dontoptimize
        -dontskipnonpubliclibraryclasses
        -dontskipnonpubliclibraryclassmembers
        <!--flattenpackagehierarchy ''-->
        -libraryjars "${java.home}/lib/rt.jar"
        -libraryjars "${java.home}/lib/javaws.jar"
        -libraryjars "${env.JAVA_HOME}/lib/ant-javafx.jar"
        -libraryjars "${env.JREFX_HOME}/lib/jfxrt.jar"
        -libraryjars ${WorkingFolder}/libs/BareBonesBrowserLaunch.jar
        :
        :
        -libraryjars ${WorkingFolder}/CustomJars/Lib.jar
        -injars ${WorkingFolder}/${app.name}.jar
        -injars ${WorkingFolder}/CustomJars/Verifier.jar(!META-INF/MANIFEST.MF)
        -injars ${WorkingFolder}/CustomJars/Utility.jar(!META-INF/MANIFEST.MF)
        -outjars ${WorkingFolder}/Obfuscated.jar
        -ignorewarnings
        -keepattributes Exceptions,InnerClasses,Signature,Deprecated,SourceFile,LineNumberTable,LocalVariable*Table,*Annotation*,Synthetic,EnclosingMethod
        -adaptresourcefilecontents **.fxml,**.properties,META-INF/MANIFEST.MF,images/*.jar,publicCerts.store,production.version
        -keepclassmembernames class * {
            @javafx.fxml.FXML *;
        }
        -keepclasseswithmembers public class com.javafx.main.Main, com.product.main.EntryFX, net.license.LicenseEntryPoint {
            public *; public static *;
        }
        -keep class * extends org.xml.sax.helpers.DefaultHandler
        -keepclassmembers class * extends org.xml.sax.helpers.DefaultHandler {
            private *;
            public *;
        }
    </proguard>
</target>

The .fxml files reference each controller class by its exact fully qualified name, as a string:
<Scene width="550" height="550"
fx:controller="fxmltableview.FXMLTableViewController"
xmlns:fx="http://javafx.com/fxml">
You need to either exclude the controllers from obfuscation or find out their new names and update the .fxml files inside the obfuscated JAR.
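For the first option, a keep rule along these lines is a minimal sketch (fxmltableview is just the example package from the snippet above; substitute your real controller package):
# keep the names of the controller classes referenced from .fxml files
-keep public class fxmltableview.**
# keep @FXML-annotated members so the FXMLLoader can still inject them
-keepclassmembers class fxmltableview.** {
    @javafx.fxml.FXML *;
}
For the second option, the -printmapping proguard.map line already in the build above records each class's new name, so the mapping file can be used to rewrite the fx:controller attributes in the obfuscated JAR.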

Related

Use Proguard for Scala AWS Lambda

I have a question regarding the use of ProGuard together with a Scala AWS Lambda function. I have created a very simple AWS Lambda function like this:
package example

import scala.collection.JavaConverters._
import com.amazonaws.services.lambda.runtime.events.S3Event
import com.amazonaws.services.lambda.runtime.Context

object Main extends App {
  def kinesisEventHandler(event: S3Event, context: Context): Unit = {
    val result = event.getRecords.asScala.map(m => m.getS3.getObject.getKey)
    println(result)
  }
}
I have imported the following dependencies:
"com.amazonaws" % "aws-lambda-java-core" % "1.1.0"
"com.amazonaws" % "aws-lambda-java-events" % "1.3.0"
When I create a fat JAR it is 13 MB in size and works as expected as an AWS Lambda function (it only produces test output).
13 MB is very big, so I tried ProGuard to shrink the JAR, but it isn't working: I keep getting problems, and after two days I have no more ideas how to solve them.
Here is my proguard configuration:
-injars "/Users/x/x/x/AWS_Lambda/target/scala-2.12/lambda-demo-assembly-1.0.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/org.scala-lang/scala-library/scala-library-2.12.1.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/com.amazonaws/aws-lambda-java-core/aws-lambda-java-core-1.1.0.jar"
-libraryjars "/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre/lib/rt.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/com.amazonaws/aws-java-sdk-s3/aws-java-sdk-s3-1.11.0.jar"
-libraryjars "/Users/x/x/x/AWS_Lambda/lib_managed/jars/com.amazonaws/aws-lambda-java-events/aws-lambda-java-events-1.3.0.jar"
-outjars "/Users/x/x/x/AWS_Lambda/target/scala-2.12/proguard/lambda-demo_2.12-1.0.jar"
-dontoptimize
-dontobfuscate
-dontnote
-dontwarn
-keepattributes SourceFile,LineNumberTable
# Preserve all annotations.
-keepattributes *Annotation*
# Preserve all public applications.
-keepclasseswithmembers public class * {
    public static void main(java.lang.String[]);
}
# Preserve some classes and class members that are accessed by means of
# introspection.
-keep class * implements org.xml.sax.EntityResolver
-keepclassmembers class * {
    ** MODULE$;
}
-keepclassmembernames class scala.concurrent.forkjoin.ForkJoinPool {
    long eventCount;
    int workerCounts;
    int runControl;
    scala.concurrent.forkjoin.ForkJoinPool$WaitQueueNode syncStack;
    scala.concurrent.forkjoin.ForkJoinPool$WaitQueueNode spareStack;
}
-keepclassmembernames class scala.concurrent.forkjoin.ForkJoinWorkerThread {
    int base;
    int sp;
    int runState;
}
-keepclassmembernames class scala.concurrent.forkjoin.ForkJoinTask {
    int status;
}
-keepclassmembernames class scala.concurrent.forkjoin.LinkedTransferQueue {
    scala.concurrent.forkjoin.LinkedTransferQueue$PaddedAtomicReference head;
    scala.concurrent.forkjoin.LinkedTransferQueue$PaddedAtomicReference tail;
    scala.concurrent.forkjoin.LinkedTransferQueue$PaddedAtomicReference cleanMe;
}
# Preserve some classes and class members that are accessed by means of
# introspection in the Scala compiler library, if it is processed as well.
#-keep class * implements jline.Completor
#-keep class * implements jline.Terminal
#-keep class scala.tools.nsc.Global
#-keepclasseswithmembers class * {
# <init>(scala.tools.nsc.Global);
#}
#-keepclassmembers class * {
# *** scala_repl_value();
# *** scala_repl_result();
#}
# Preserve all native method names and the names of their classes.
-keepclasseswithmembernames,includedescriptorclasses class * {
    native <methods>;
}
# Preserve the special static methods that are required in all enumeration
# classes.
-keepclassmembers,allowoptimization enum * {
    public static **[] values();
    public static ** valueOf(java.lang.String);
}
# Explicitly preserve all serialization members. The Serializable interface
# is only a marker interface, so it wouldn't save them.
# You can comment this out if your application doesn't use serialization.
# If your code contains serializable classes that have to be backward
# compatible, please refer to the manual.
-keepclassmembers class * implements java.io.Serializable {
    static final long serialVersionUID;
    static final java.io.ObjectStreamField[] serialPersistentFields;
    private void writeObject(java.io.ObjectOutputStream);
    private void readObject(java.io.ObjectInputStream);
    java.lang.Object writeReplace();
    java.lang.Object readResolve();
}
# Your application may contain more items that need to be preserved;
# typically classes that are dynamically created using Class.forName:
# -keep public class mypackage.MyClass
# -keep public interface mypackage.MyInterface
# -keep public class * implements mypackage.MyInterface
-keep,includedescriptorclasses class example.** { *; }
-keepclassmembers class * {
    <init>(...);
}
When I run this, my JAR is very small (around 5 MB), but when I launch the Lambda I get the following error:
"errorMessage": "java.lang.NoSuchMethodException: com.amazonaws.services.s3.event.S3EventNotification.parseJson(java.lang.String)",
"errorType": "lambdainternal.util.ReflectUtil$ReflectException"
I had a look at the class, and ProGuard had deleted this method. When I changed the config to also keep it, I got another problem in another class.
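The extra rule I added to keep it looked roughly like this (the class name comes straight from the error message above):
# keep the method the Lambda runtime looks up via reflection
-keep class com.amazonaws.services.s3.event.S3EventNotification { *; }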
Has somebody already used ProGuard with a Scala AWS Lambda function and found a good configuration, or does somebody know about this problem? Is there any other good way to shrink the JAR?
Best,
Lothium
Honestly, 13 MB isn't that big. But, as much as I'm sure this is going to be considered heresy by a Scala developer, I created an equivalent method in Java and it's a bit over 7 MB. I didn't try to use ProGuard on it; it may shrink further.
That was with the S3Event package you're using. If you look at what gets included because of that package, it brings in tons of extra stuff: SQS, SNS, DynamoDB and so on. Ultimately that is the biggest part. I did a little test to eliminate all libraries except aws-lambda-java-core and used JsonPath instead. That got my JAR file down to 458 KB.
My code is below. I know it's not Scala, but perhaps you can get some ideas from it. The key was eliminating as many AWS libraries as possible. Of course, if you want to do anything more than print keys in your Lambda, you'll need to bring in more AWS libraries, which, again, brings the size back to about 7 MB.
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.List;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestStreamHandler;
import com.jayway.jsonpath.JsonPath;

public class S3EventLambdaHandler implements RequestStreamHandler {
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) {
        try {
            List<String> keys = JsonPath.read(inputStream, "$.Records[*].s3.object.key");
            for (String nextKey : keys)
                System.out.println(nextKey);
        }
        catch (IOException ioe) {
            context.getLogger().log("caught IOException reading input stream");
        }
    }
}
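If you want to try the same approach, the only extra dependency should be JsonPath itself; in sbt terms something like this (the version is an assumption; check for the current release):
// JsonPath pulls in no AWS SDK code, which is what keeps the JAR small
"com.jayway.jsonpath" % "json-path" % "2.2.0"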

Dagger 2 module dependency graph: bound multiple times

I'm new to Dagger 2, trying to port a (quite) complex application to it.
We have several dependencies on 'common' libraries (shared with other projects). Those 'common' libraries sometimes depend on other 'common' libraries. Each library exposes a module.
An example:
@Module
public class JsonModule {
    @Provides
    public Mapper provideMapper(ObjectMapper objectMapper) {
        return new DefaultMapper(objectMapper);
    }

    @Provides
    public ObjectMapper provideObjectMapper() {
        return ObjectMapperFactory.build();
    }
}
Our HttpModule depends on the JsonModule:
@Module(includes = {JsonModule.class})
public class HttpModule {
    @Provides
    public HttpHelper provideHttpHelper(ObjectMapper objectMapper) {
        return new DefaultHttpHelper(objectMapper);
    }
}
Finally in my application, I depend on both these modules:
@Module(includes = {JsonModule.class, HttpModule.class})
public class MyAppModule {
    @Provides
    public Service1 provideService1(ObjectMapper objectMapper) {
        return new DefaultService1(objectMapper);
    }

    @Provides
    public Service2 provideService2(Mapper mapper) {
        return new DefaultService2(mapper);
    }
}
I then have 1 component that depends on my MyAppModule:
@Component(modules = MyAppModule.class)
@Singleton
public interface MyAppComponent {
    public Service2 service2();
}
Unfortunately, when I compile the project, I get a Dagger compiler error:
[ERROR] com.company.json.Mapper is bound multiple times:
[ERROR] #Provides com.company.json.Mapper com.company.json.JsonModule.provideMapper(com.company.json.ObjectMapper)
[ERROR] #Provides com.company.json.Mapper com.company.json.JsonModule.provideMapper(com.company.json.ObjectMapper)
What am I doing wrong? Is it wrong to depend on a module twice in the same dependency graph?
In your MyAppModule you shouldn't include JsonModule; it is included by Dagger implicitly. When you include HttpModule, Dagger will automatically include all modules that HttpModule includes (in your case, that is JsonModule).
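Applied to the example above, only the includes list of MyAppModule changes (a sketch):
// HttpModule already pulls in JsonModule, so it isn't listed again
@Module(includes = {HttpModule.class})
public class MyAppModule {
    @Provides
    public Service1 provideService1(ObjectMapper objectMapper) {
        return new DefaultService1(objectMapper);
    }

    @Provides
    public Service2 provideService2(Mapper mapper) {
        return new DefaultService2(mapper);
    }
}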
It seems the problem was related to our project's setup:
the common projects combine Groovy and Java
the common projects are built using Gradle
the application project combines Groovy and Java
the application project was built using Maven and the groovy-eclipse-compiler
Basically, I blame the groovy-eclipse-compiler for now. I converted the project to Gradle, and everything works now.

Powermockito mock operation on hierarchy level class

My app has classes at different levels of a hierarchy, in different packages, as follows.
package com.sample.folder1;
public class ParentClass { }

package com.sample.folder2;
public class Childclass1 extends ParentClass { }
The unit test, under the test package:
@Test
public void testMocking() {
    Childclass1 obj = PowerMockito.mock(Childclass1.class);
}
When I execute the above JUnit test in Eclipse, it throws:
"VerifyError: Inconsistent stackmap frames....."
Please suggest how to mock classes in a hierarchy, both in the same package and across different packages.
Try adding -noverify (Java 8) or -XX:-UseSplitVerifier (Java 7) as a VM parameter.
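If the tests run through Maven Surefire, for example, the flag can be passed along these lines (a sketch; in Eclipse you would instead add it to the VM arguments of the JUnit run configuration):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <!-- disable bytecode verification in the forked test JVM -->
        <argLine>-noverify</argLine>
    </configuration>
</plugin>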

Can I inject spring bean from JAR file?

I use Spring 3 in my project. I am facing a problem when I inject a Spring bean from a JAR file. In the JAR file there is a class like this:
package test;
@Service("CommonService")
public class CommonService {
}
And I already use it like this:
package com.java.test.app;
@Service(value = "OtherService")
public class OtherService {
    @Resource(name = "CommonService")
    private CommonService service;
}
In my spring-beans.xml:
<context:component-scan base-package="com.java.test.app, test">
    <context:exclude-filter type="annotation" expression="org.springframework.stereotype.Repository"/>
</context:component-scan>
But the @Resource annotation doesn't work. Can I inject a Spring bean from a JAR file?
If at runtime your CommonService class is on the classpath and is within one of the base packages you specify with component-scan, then you should be good to go. Try using @Autowired instead of @Resource.
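Applied to your example, only the injection annotation changes (a sketch; the imports show where the annotations come from):
package com.java.test.app;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import test.CommonService;

@Service(value = "OtherService")
public class OtherService {
    // injected by type, so no bean name is required
    @Autowired
    private CommonService service;
}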

open a specific eclipse project from command line

I work with many small but unrelated Java projects. I made an Ant script that creates the .project and .classpath automatically whenever I create a new project, with the needed libraries and project name. I would like to be able to open Eclipse with that project from the command line. Right now I do it manually: I close the old open project in the workspace, then do an Import and find the new project. I could not find a way to do this from either Ant or batch. I can open Eclipse, but it comes up with the last workspace/project. I don't mind if I have to create an individual workspace per project, but I don't know how to do that from a script either. Thank you for any suggestions.
I would recommend against doing this, as it is not really that much effort to import the project using the standard wizards. I'd focus on closing the inactive projects (see more below).
Edit: If you are dead set on using ant to bring the projects into the workspace, you can implement a plugin doing something like the code below.
Do you close the old projects or delete them? I don't see any reason to actually delete them. If you close all the projects you aren't working on (right-click on them and select Close Project, or select the project you do want and right-click -> Close Unrelated Projects), they are ignored by the platform, so they won't impact development of the open project.
To hide the closed projects from the view, click the downward-pointing triangle in the top-right corner of the Package Explorer view, select Filters..., and in the Select the elements to exclude from the view: list, check the Closed projects option.
This is a plugin that will read a set of names from a file in the workspace root, delete all existing projects (without removing their contents) and create the new projects in the workspace. Use it at your own risk, no liability blah blah.
Take the contents and put them in the relevant files and you can package an Eclipse plugin. I'd recommend using a separate Eclipse install (actually, I recommend against using this at all), as it will run every time it finds newprojects.txt in the workspace root.
The declaration in the plugin.xml implements an Eclipse extension point that is called after the workbench initializes. The earlyStartup() method of the StartupHelper is called. It creates a new Runnable that is executed asynchronously (this means the workspace loading won't block if this plugin has issues). The Runnable reads lines from the magic newprojects.txt file it expects to see in the workspace root. If it finds any contents it will delete/create the projects.
Update:
The helper has been modified to allow projects to be created outside the workspace: if you define a value in newprojects.txt, it is assumed to be the absolute location of the project. Note that it doesn't escape the string, so if you are on a Windows platform, use double backslashes in the path.
Example contents:
#will be created in the workspace
project1
#will be created at c:\test\project2
project2=c:\\test\\project2
Good luck!
/META-INF/MANIFEST.MF:
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Project fettling Plug-in
Bundle-SymbolicName: name.seller.rich;singleton:=true
Bundle-Version: 1.0.0
Bundle-Activator: name.seller.rich.Activator
Require-Bundle: org.eclipse.core.runtime,
org.eclipse.ui.workbench;bundle-version="3.4.1",
org.eclipse.swt;bundle-version="3.4.1",
org.eclipse.core.resources;bundle-version="3.4.1"
Bundle-ActivationPolicy: lazy
/plugin.xml:
<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.0"?>
<plugin>
    <extension point="org.eclipse.ui.startup">
        <startup class="name.seller.rich.projectloader.StartupHelper"/>
    </extension>
</plugin>
/.project:
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
    <name>name.seller.rich.projectloader</name>
    <comment></comment>
    <projects>
    </projects>
    <buildSpec>
        <buildCommand>
            <name>org.eclipse.jdt.core.javabuilder</name>
            <arguments>
            </arguments>
        </buildCommand>
        <buildCommand>
            <name>org.eclipse.pde.ManifestBuilder</name>
            <arguments>
            </arguments>
        </buildCommand>
        <buildCommand>
            <name>org.eclipse.pde.SchemaBuilder</name>
            <arguments>
            </arguments>
        </buildCommand>
    </buildSpec>
    <natures>
        <nature>org.eclipse.pde.PluginNature</nature>
        <nature>org.eclipse.jdt.core.javanature</nature>
    </natures>
</projectDescription>
/.classpath:
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
    <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
    <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>
    <classpathentry kind="src" path="src/main/java"/>
    <classpathentry kind="output" path="target/classes"/>
</classpath>
/src/main/java/name/seller/rich/Activator.java:
package name.seller.rich;

import org.eclipse.core.runtime.Plugin;
import org.osgi.framework.BundleContext;

/**
 * The activator class controls the plug-in life cycle
 */
public class Activator extends Plugin {
    // The plug-in ID
    public static final String PLUGIN_ID = "name.seller.rich";

    // The shared instance
    private static Activator plugin;

    /**
     * Returns the shared instance
     *
     * @return the shared instance
     */
    public static Activator getDefault() {
        return plugin;
    }

    /**
     * The constructor
     */
    public Activator() {
    }

    @Override
    public void start(final BundleContext context) throws Exception {
        super.start(context);
        plugin = this;
    }

    @Override
    public void stop(final BundleContext context) throws Exception {
        plugin = null;
        super.stop(context);
    }
}
/src/main/java/name/seller/rich/projectloader/StartupHelper.java:
package name.seller.rich.projectloader;

import java.io.File;
import java.io.FileInputStream;
import java.util.Map;
import java.util.Properties;

import name.seller.rich.Activator;

import org.eclipse.core.internal.resources.ProjectDescription;
import org.eclipse.core.resources.IProject;
import org.eclipse.core.resources.IWorkspaceRoot;
import org.eclipse.core.resources.ResourcesPlugin;
import org.eclipse.core.runtime.IPath;
import org.eclipse.core.runtime.IProgressMonitor;
import org.eclipse.core.runtime.IStatus;
import org.eclipse.core.runtime.NullProgressMonitor;
import org.eclipse.core.runtime.Path;
import org.eclipse.core.runtime.Status;
import org.eclipse.ui.IStartup;
import org.eclipse.ui.IWorkbench;
import org.eclipse.ui.PlatformUI;

public class StartupHelper implements IStartup {
    private static final class DirtyHookRunnable implements Runnable {
        private IWorkspaceRoot workspaceRoot;

        private DirtyHookRunnable(final IWorkspaceRoot workspaceRoot) {
            this.workspaceRoot = workspaceRoot;
        }

        public void run() {
            try {
                IPath workspaceLocation = this.workspaceRoot.getLocation();
                File startupFile = new File(workspaceLocation.toOSString(),
                        "newprojects.txt");
                IProgressMonitor monitor = new NullProgressMonitor();
                Properties properties = new Properties();
                if (startupFile.exists()) {
                    properties.load(new FileInputStream(startupFile));
                }
                if (properties.size() > 0) {
                    // delete existing projects
                    IProject[] projects = this.workspaceRoot.getProjects();
                    for (IProject project : projects) {
                        // don't delete the content
                        project.delete(false, true, monitor);
                    }
                    // create new projects
                    for (Map.Entry entry : properties.entrySet()) {
                        IProject project = this.workspaceRoot
                                .getProject((String) entry.getKey());
                        ProjectDescription projectDescription = new ProjectDescription();
                        projectDescription.setName((String) entry.getKey());
                        String location = (String) entry.getValue();
                        // value will be an empty String if there is no "=" on the line;
                        // in that case the project is created in the workspace.
                        // WARNING: currently Windows paths must be escaped,
                        // e.g. c:\\test\\myproject
                        if (location.length() > 0) {
                            IPath locationPath = new Path(location);
                            projectDescription.setLocation(locationPath);
                        }
                        project.create(projectDescription, monitor);
                        // project.create(monitor);
                        project.open(monitor);
                    }
                }
            } catch (Exception e) {
                IStatus status = new Status(IStatus.INFO, Activator.PLUGIN_ID,
                        0, "unable to load new projects", null);
                Activator.getDefault().getLog().log(status);
            }
        }
    }

    public StartupHelper() {
        super();
    }

    public final void earlyStartup() {
        IWorkbench workbench = PlatformUI.getWorkbench();
        IWorkspaceRoot workspaceRoot = ResourcesPlugin.getWorkspace().getRoot();
        workbench.getDisplay().asyncExec(new DirtyHookRunnable(workspaceRoot));
    }
}
Another possible option is given on this question. The essence of the answer is, if you have CDT installed, you can do:
eclipse -nosplash
    -application org.eclipse.cdt.managedbuilder.core.headlessbuild
    -import {[uri:/]/path/to/project}
    -importAll {[uri:/]/path/to/projectTreeURI} Import all projects under URI
    -build {project_name | all}
    -cleanBuild {project_name | all}
The trick here is that it can import any project, not only C projects.
Partial solution: Open eclipse in a specified workspace:
eclipse.exe -data c:\code\workspace-name
The full list of options is helpful, as @Paul Wagland indicated, but I couldn't get it to work until I used this command, which includes the -data option:
user#devmachine:~/dev$ eclipse -nosplash -data /home/$USER/dev -application org.eclipse.cdt.managedbuilder.core.headlessbuild -import /home/$USER/dev/src_dir