A parameterized AutoBean type containing a typed member - gwt

Question
Is there any way to deserialize JSON using the AutoBean framework such that the resulting bean has a type parameter that affects the type of one or more of its members?
Background
RPC with JSON results
I'm using GWT (RequestBuilder) to perform RPC requests. The JSON payload returned is of the following form:
{
"resultSet": [{...}, {...}, ...], // items requested; say, items 150-160
"totalCount": 15330 // total matching items in DB
}
The objects in resultSet vary in type depending on the specific RPC I'm calling.
AutoBean interface
I'd like to deserialize this JSON using AutoBean. I'm trying to represent this object as follows:
interface RpcResults<T> {
List<T> getResultSet();
void setResultSet(List<T> resultSet);
int getTotalCount();
void setTotalCount(int totalCount);
}
I've also created appropriate interfaces representing each type of object that could exist within resultSet. Finally, I set up the appropriate call to AutoBeanCodex.decode.
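For context, the decode call mentioned above is not shown in the question; it presumably looks roughly like the following hypothetical reconstruction, where Foo, MyRpcFactory, and the helper method are invented placeholders rather than the asker's actual code:
// Hypothetical reconstruction of the setup described above. Foo, MyRpcFactory,
// and RpcResultsDecoder are invented placeholders, not code from the question.
interface MyRpcFactory extends AutoBeanFactory {
    AutoBean<RpcResults<Foo>> rpcResults();
}

class RpcResultsDecoder {
    @SuppressWarnings("unchecked")
    RpcResults<Foo> decode(MyRpcFactory factory, String jsonText) {
        // After erasure the codex only sees List<Object> in getResultSet(),
        // which is what leads to the IllegalArgumentException shown below.
        return (RpcResults<Foo>) AutoBeanCodex.decode(factory, RpcResults.class, jsonText).as();
    }
}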
Running the code
Attempting to run this code in development mode causes the following stack trace to appear in the console:
19:44:23.791 [ERROR] [xcbackend] Uncaught exception escaped
java.lang.IllegalArgumentException: The AutoBeanFactory cannot create a java.lang.Object
at com.google.gwt.autobean.shared.AutoBeanCodex$Decoder.push(AutoBeanCodex.java:240)
at com.google.gwt.autobean.shared.AutoBeanCodex$Decoder.decode(AutoBeanCodex.java:50)
at com.google.gwt.autobean.shared.AutoBeanCodex$Decoder.visitCollectionProperty(AutoBeanCodex.java:83)
at com.citrix.xenclient.backend.client.json.RpcResultsAutoBean.traverseProperties(RpcResultsAutoBean.java:100)
at com.google.gwt.autobean.shared.impl.AbstractAutoBean.traverse(AbstractAutoBean.java:153)
at com.google.gwt.autobean.shared.impl.AbstractAutoBean.accept(AbstractAutoBean.java:112)
at com.google.gwt.autobean.shared.AutoBeanCodex$Decoder.decode(AutoBeanCodex.java:51)
at com.google.gwt.autobean.shared.AutoBeanCodex.decode(AutoBeanCodex.java:505)
at com.google.gwt.autobean.shared.AutoBeanCodex.decode(AutoBeanCodex.java:521)
at com.citrix.xenclient.backend.client.services.JSONResponseResultSetHandler.onResponseReceived(JSONResponseResultSetHandler.java:51)
at com.google.gwt.http.client.Request.fireOnResponseReceived(Request.java:287)
at com.google.gwt.http.client.RequestBuilder$1.onReadyStateChange(RequestBuilder.java:395)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103)
at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71)
at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:157)
at com.google.gwt.dev.shell.BrowserChannelServer.reactToMessagesWhileWaitingForReturn(BrowserChannelServer.java:326)
at com.google.gwt.dev.shell.BrowserChannelServer.invokeJavascript(BrowserChannelServer.java:207)
at com.google.gwt.dev.shell.ModuleSpaceOOPHM.doInvoke(ModuleSpaceOOPHM.java:126)
at com.google.gwt.dev.shell.ModuleSpace.invokeNative(ModuleSpace.java:561)
at com.google.gwt.dev.shell.ModuleSpace.invokeNativeObject(ModuleSpace.java:269)
at com.google.gwt.dev.shell.JavaScriptHost.invokeNativeObject(JavaScriptHost.java:91)
at com.google.gwt.core.client.impl.Impl.apply(Impl.java)
at com.google.gwt.core.client.impl.Impl.entry0(Impl.java:214)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103)
at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71)
at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:157)
at com.google.gwt.dev.shell.BrowserChannelServer.reactToMessages(BrowserChannelServer.java:281)
at com.google.gwt.dev.shell.BrowserChannelServer.processConnection(BrowserChannelServer.java:531)
at com.google.gwt.dev.shell.BrowserChannelServer.run(BrowserChannelServer.java:352)
at java.lang.Thread.run(Thread.java:636)
Based on this stack trace, my hunch is the following:
1. Type erasure makes it seem that RpcResults.getResultSet() is returning a raw List.
2. The AutoBean deserializer attempts to create Object instances for each item in resultSet.
3. Failure.
Question again
Am I missing something in the AutoBean API that will allow me to do this easily? If not, is there an obvious point of attack I should look into? Is there a more sensible alternative for what I'm doing (other than JSONParser and JavaScriptObject, which I'm already using)?

This is not simple, due to Java type erasure. The type T does not exist at runtime, having been erased to Object in lieu of any other upper bound. The AutoBeanCodex requires type information in order to reify the elements of the incoming JSON payload. That type information is usually provided by the AutoBean implementation, but because of the erasure of T, all it knows is that it contains a List<Object>.
If you can provide a class literal at runtime, the getter could be declared as Splittable getResultSet() and the individual elements of the list reified by calling AutoBeanCodex.decode(autoBeanFactory, SomeInterfaceType.class, getResultSet().get(index)). By using a Category, you could add a <T> T getResultAs(Class<T> clazz, int index) method to the AutoBean interface. This would look something like:
@Category(MyCategory.class)
interface MyFactory extends AutoBeanFactory {
AutoBean<ResultContainer> resultContainer();
}
interface ResultContainer<T> {
Splittable getResultSet();
// It's the class literal that makes it work
T getResultAs(Class<T> clazz, int index);
}
class MyCategory {
public static <T> T getResultAs(AutoBean<ResultContainer> bean,
Class<T> clazz, int index) {
return AutoBeanCodex.decode(bean.getFactory(), clazz,
bean.as().getResultSet().get(index)).as();
}
}
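As a usage illustration (not part of the answer above), the pieces might be wired together roughly like this; ItemProxy and the method below are assumed placeholders:
// Hedged usage sketch for the category-backed factory above. In client code the
// factory comes from GWT.create(MyFactory.class); on the JVM,
// AutoBeanFactorySource.create(MyFactory.class) can be used instead.
ItemProxy decodeFirstItem(MyFactory factory, String jsonPayload) {
    // Decode the envelope; resultSet remains an opaque Splittable at this point.
    AutoBean<ResultContainer> bean =
            AutoBeanCodex.decode(factory, ResultContainer.class, jsonPayload);

    // Reify one element on demand by supplying its class literal; the call is
    // routed through MyCategory.getResultAs(). The cast is needed because the
    // container is used as a raw type here.
    return (ItemProxy) bean.as().getResultAs(ItemProxy.class, 0);
}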

Try overriding the .getResultSet() and .setResultSet() methods in your object-specific interfaces:
interface FooRpcResults extends RpcResults<Foo> {
@Override
List<Foo> getResultSet();
@Override
void setResultSet(List<Foo> value);
}
The following test works for me (GWT 2.3.0):
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull;
import java.util.ArrayList;
import java.util.List;
import org.junit.Test;
import com.google.web.bindery.autobean.shared.AutoBean;
import com.google.web.bindery.autobean.shared.AutoBeanCodex;
import com.google.web.bindery.autobean.shared.AutoBeanFactory;
import com.google.web.bindery.autobean.shared.AutoBeanUtils;
import com.google.web.bindery.autobean.vm.AutoBeanFactorySource;
public class AutoBeanTest {
public static interface Page<T> {
int getDataSize();
List<T> getPage();
int getStartIndex();
void setDataSize(int value);
void setPage(List<T> value);
void setStartIndex(int value);
}
public static interface Thing {
String getName();
void setName(String value);
}
public static interface ThingFactory extends AutoBeanFactory {
AutoBean<Thing> createThing();
AutoBean<ThingPage> createThingPage();
}
public static interface ThingPage extends Page<Thing> {
@Override
List<Thing> getPage();
@Override
void setPage(List<Thing> value);
}
@Test
public void testAutoBean() {
final ThingFactory factory = AutoBeanFactorySource
.create(ThingFactory.class);
final Thing thing1 = factory.createThing().as();
thing1.setName("One");
final Thing thing2 = factory.createThing().as();
thing2.setName("Two");
final List<Thing> things = new ArrayList<Thing>();
things.add(thing1);
things.add(thing2);
final Page<Thing> page = factory.createThingPage().as();
page.setStartIndex(50);
page.setDataSize(1000);
page.setPage(things);
final String json = AutoBeanCodex.encode(
AutoBeanUtils.getAutoBean(page)).getPayload();
final Page<Thing> receivedPage = AutoBeanCodex.decode(factory,
ThingPage.class, json).as();
assertEquals(receivedPage.getStartIndex(), page.getStartIndex());
assertEquals(receivedPage.getDataSize(), page.getDataSize());
assertNotNull(receivedPage.getPage());
assertEquals(receivedPage.getPage().size(), page.getPage().size());
for (int i = 0; i < receivedPage.getPage().size(); i++) {
assertNotNull(receivedPage.getPage().get(i));
assertEquals(receivedPage.getPage().get(i).getName(), page
.getPage().get(i).getName());
}
}
}
Removing the overrides in the ThingPage interface will break it.

Related

ClassCastException whlile accessing custom object in Geode's Function.execute() method

I am adding a custom object (Account) into the cache and then trying to access it in the Function.execute() method.
But it throws org.apache.geode.pdx.internal.PdxInstanceImpl cannot be cast to com.sas.cpm.model.Account.
Custom object Account.java
public class Account implements PdxSerializable, Declarable{
public Account() {
super();
// TODO Auto-generated constructor
}
@Override
public void fromData(PdxReader pr) {…..}
@Override
public void toData(PdxWriter pw) {… }
}
Client code:
ClientCache cache = new ClientCacheFactory()
.addPoolLocator("localhost", 10334).set("log-level", "INFO").create();
// create a local region that matches the server region
// Account is the domain object
Region<String, Account> region =
cache.<String, Account>createClientRegionFactory(ClientRegionShortcut.CACHING_PROXY)
.create("testRegion");
feedData(region); //add Account object to region
Execution execution = FunctionService.onRegion(region);
ResultCollector<Integer, List> rc = execution.execute("UpdateCost");//.ID);//com.sas.cpm.geode
Function class : UpdateCost.java
public class UpdateCost implements Function{
@Override
public void execute(FunctionContext context) {
RegionFunctionContext regionContext = (RegionFunctionContext) context;
Region<String, Account> region = regionContext.getDataSet();
for ( Map.Entry<String, Account> entry : region.entrySet() ) {
Account account = entry.getValue(); /// THIS LINE GIVES THE ERROR
}
}
}
Error:
Exception in thread "main" org.apache.geode.cache.execute.FunctionException: org.apache.geode.cache.client.ServerOperationException: remote server on dsinsbb01ina4(46560:loner):59343:2f7e3885: While performing a remote executeRegionFunction
at org.apache.geode.internal.cache.execute.ServerRegionFunctionExecutor.executeOnServer(ServerRegionFunctionExecutor.java:229)
at org.apache.geode.internal.cache.execute.ServerRegionFunctionExecutor.executeFunction(ServerRegionFunctionExecutor.java:178)
at org.apache.geode.internal.cache.execute.ServerRegionFunctionExecutor.execute(ServerRegionFunctionExecutor.java:379)
at geodeproject1.Example.funcUpdateExec(Example.java:186)
at geodeproject1.Example.main(Example.java:68)
Caused by: org.apache.geode.cache.client.ServerOperationException: remote server on dsinsbb01ina4(46560:loner):59343:2f7e3885: While performing a remote executeRegionFunction
at org.apache.geode.cache.client.internal.ExecuteRegionFunctionOp$ExecuteRegionFunctionOpImpl.processResponse(ExecuteRegionFunctionOp.java:606)
at org.apache.geode.cache.client.internal.AbstractOp.processResponse(AbstractOp.java:225)
at org.apache.geode.cache.client.internal.AbstractOp.attemptReadResponse(AbstractOp.java:198)
at org.apache.geode.cache.client.internal.AbstractOp.attempt(AbstractOp.java:386)
at org.apache.geode.cache.client.internal.ConnectionImpl.execute(ConnectionImpl.java:269)
at org.apache.geode.cache.client.internal.pooling.PooledConnection.execute(PooledConnection.java:325)
at org.apache.geode.cache.client.internal.OpExecutorImpl.executeWithPossibleReAuthentication(OpExecutorImpl.java:892)
at org.apache.geode.cache.client.internal.OpExecutorImpl.execute(OpExecutorImpl.java:171)
at org.apache.geode.cache.client.internal.PoolImpl.execute(PoolImpl.java:772)
at org.apache.geode.cache.client.internal.ExecuteRegionFunctionOp.execute(ExecuteRegionFunctionOp.java:162)
at org.apache.geode.cache.client.internal.ServerRegionProxy.executeFunction(ServerRegionProxy.java:732)
at org.apache.geode.internal.cache.execute.ServerRegionFunctionExecutor.executeOnServer(ServerRegionFunctionExecutor.java:220)
... 4 more
Caused by: org.apache.geode.cache.execute.FunctionException: java.lang.ClassCastException: org.apache.geode.pdx.internal.PdxInstanceImpl cannot be cast to com.sas.cpm.model.Account
at org.apache.geode.cache.client.internal.ExecuteRegionFunctionOp$ExecuteRegionFunctionOpImpl.processResponse(ExecuteRegionFunctionOp.java:583)
... 15 more
Caused by: java.lang.ClassCastException: org.apache.geode.pdx.internal.PdxInstanceImpl cannot be cast to com.sas.cpm.model.Account
at com.sas.cpm.geode.UpdateCost.execute(UpdateCost.java:49)
at org.apache.geode.internal.cache.execute.AbstractExecution.executeFunctionLocally(AbstractExecution.java:331)
at org.apache.geode.internal.cache.execute.AbstractExecution$2.run(AbstractExecution.java:300)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at org.apache.geode.distributed.internal.ClusterDistributionManager.runUntilShutdown(ClusterDistributionManager.java:949)
at org.apache.geode.distributed.internal.ClusterDistributionManager.doFunctionExecutionThread(ClusterDistributionManager.java:803)
at org.apache.geode.internal.logging.LoggingThreadFactory.lambda$newThread$0(LoggingThreadFactory.java:121)
at java.lang.Thread.run(Unknown Source)
The ClassCastException is thrown because you're receiving a PdxInstance from the cache instead of an Account. This happens when your domain object implements the PdxSerializable interface and PDX serialization is configured with read-serialized=true.
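For illustration only, a sketch of how the function could tolerate PDX-serialized values (assuming the Account class is deployed on the server's classpath) might look like this:
import java.util.Map;

import org.apache.geode.cache.Region;
import org.apache.geode.cache.execute.Function;
import org.apache.geode.cache.execute.FunctionContext;
import org.apache.geode.cache.execute.RegionFunctionContext;
import org.apache.geode.pdx.PdxInstance;

// Hedged sketch of an execute() that handles values arriving as PdxInstance
// when read-serialized=true; it mirrors the structure of UpdateCost above.
public class UpdateCostPdxAware implements Function {
    @Override
    public void execute(FunctionContext context) {
        RegionFunctionContext regionContext = (RegionFunctionContext) context;
        Region<String, Object> region = regionContext.getDataSet();
        for (Map.Entry<String, Object> entry : region.entrySet()) {
            Object raw = entry.getValue();
            Account account = raw instanceof PdxInstance
                    ? (Account) ((PdxInstance) raw).getObject() // deserialize the PDX form
                    : (Account) raw;
            // ... update the account's cost and send results as needed ...
        }
    }
}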
Please have a look at Implementing PdxSerializable in Your Domain Object and Programming Your Application to Use PdxInstances for further details.
Cheers.

PowerMock with Jersey, InternalServerErrorException error

We have a Gradle Jersey project and we're trying to use PowerMock and EasyMock (and JUnit) for the unit tests because we have several static methods we need to mock. We finally got PowerMock working with Jersey (see this question), but PowerMock will only allow our REST calls to return Strings. All other return types (Lists, for example) result in an InternalServerErrorException. Full stack trace:
javax.ws.rs.InternalServerErrorException: HTTP 500 Internal Server Error
at org.glassfish.jersey.client.JerseyInvocation.convertToException(JerseyInvocation.java:1020)
at org.glassfish.jersey.client.JerseyInvocation.translate(JerseyInvocation.java:877)
at org.glassfish.jersey.client.JerseyInvocation.access$800(JerseyInvocation.java:92)
at org.glassfish.jersey.client.JerseyInvocation$3.call(JerseyInvocation.java:722)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:228)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:444)
at org.glassfish.jersey.client.JerseyInvocation.invoke(JerseyInvocation.java:718)
at org.glassfish.jersey.client.JerseyInvocation$Builder.method(JerseyInvocation.java:430)
at org.glassfish.jersey.client.JerseyInvocation$Builder.get(JerseyInvocation.java:321)
at com.company.project.sm.rest.PowerMockTest.aListTest(PowerMockTest.java:104)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.internal.runners.TestMethod.invoke(TestMethod.java:68)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$PowerMockJUnit44MethodRunner.runTestMethod(PowerMockJUnit44RunnerDelegateImpl.java:326)
at org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:89)
at org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:97)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$PowerMockJUnit44MethodRunner.executeTest(PowerMockJUnit44RunnerDelegateImpl.java:310)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner.executeTestInSuper(PowerMockJUnit47RunnerDelegateImpl.java:131)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner.access$100(PowerMockJUnit47RunnerDelegateImpl.java:59)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner$TestExecutorStatement.evaluate(PowerMockJUnit47RunnerDelegateImpl.java:147)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner.evaluateStatement(PowerMockJUnit47RunnerDelegateImpl.java:107)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner.executeTest(PowerMockJUnit47RunnerDelegateImpl.java:82)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$PowerMockJUnit44MethodRunner.runBeforesThenTestThenAfters(PowerMockJUnit44RunnerDelegateImpl.java:298)
at org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:87)
at org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:50)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.invokeTestMethod(PowerMockJUnit44RunnerDelegateImpl.java:218)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.runMethods(PowerMockJUnit44RunnerDelegateImpl.java:160)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$1.run(PowerMockJUnit44RunnerDelegateImpl.java:134)
at org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:34)
at org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:44)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.run(PowerMockJUnit44RunnerDelegateImpl.java:136)
at org.powermock.modules.junit4.common.internal.impl.JUnit4TestSuiteChunkerImpl.run(JUnit4TestSuiteChunkerImpl.java:121)
at org.powermock.modules.junit4.common.internal.impl.AbstractCommonPowerMockRunner.run(AbstractCommonPowerMockRunner.java:57)
at org.powermock.modules.junit4.PowerMockRunner.run(PowerMockRunner.java:59)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:538)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:760)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:206)
The REST calls in the unit tests work without PowerMock; they fail when run with PowerMock. We can't spot what we're doing wrong.
Here are examples of our unit tests (entire class):
// comment-out these two lines to turn off PowerMock
@org.powermock.core.classloader.annotations.PowerMockIgnore({"javax.ws.*", "org.glassfish.*"})
@RunWith(PowerMockRunner.class)
public class PowerMockTest {
private JerseyTest jerseyTest;
@Before
public void setUp() throws Exception {
this.jerseyTest = new JerseyTest() {
@Override
protected Application configure() {
return PowerMockTest.this.configure();
}
};
this.jerseyTest.setUp();
}
@After
public void tearDown() throws Exception {
this.jerseyTest.tearDown();
}
public ResourceConfig configure() {
return new ResourceConfig(PowerMockTestResource.class);
}
@Test
public void thisTestWorks() {
String fooString = this.jerseyTest.target("test/string")
.request(MediaType.TEXT_PLAIN)
.get( String.class );
assertNotNull( fooString );
}
@Test
public void thisTestFails() {
List<PowerMockStringWrapper> response = this.jerseyTest.target("test/list")
.request( MediaType.APPLICATION_JSON )
.get( new GenericType<List<PowerMockStringWrapper>>() {} );
assertNotNull( response );
}
}
The .get() call in thisTestFails() is the line the stack trace reports as failing.
As you can see, we're not even doing anything PowerMock-ey in our tests. The PowerMockStringWrapper is just a small class that wraps String, because Jersey (for some reason) can't return Lists of base Strings. Once again, both these unit tests work without PowerMock, but the second one fails as soon as we turn PowerMock on.
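For reference, the resource class referenced from configure() is not included in the question; it is presumably something along these lines (hypothetical sketch, with paths and return values guessed from the tests above):
import java.util.Arrays;
import java.util.List;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// Hypothetical stand-in for PowerMockTestResource; the real class is not shown
// in the question, so names and return values here are guesses.
@Path("test")
public class PowerMockTestResource {

    @GET
    @Path("string")
    @Produces(MediaType.TEXT_PLAIN)
    public String getString() {
        return "foo";
    }

    @GET
    @Path("list")
    @Produces(MediaType.APPLICATION_JSON)
    public List<PowerMockStringWrapper> getList() {
        // PowerMockStringWrapper is the small String wrapper described above.
        return Arrays.asList(new PowerMockStringWrapper("a"), new PowerMockStringWrapper("b"));
    }
}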
Can anyone suggest anything?

AspectJ and Java8 - bad type on operand stack

Looking at this Eclipse bug, it seems the Java verifier (since 1.6) has had issues with AspectJ.
The bug says AspectJ 1.8.1 will fix the problem, but using that with Java 8u11 I still get the verify error.
I'm running JUnit4 under STS 3.6.0 (Eclipse 4.4). I believe this configuration is the very latest available of all packages.
I've completely replaced the remainder of the text with the requested example. This seems to be limited to @Around advice; @Before works fine.
JUnit:
package com.test.aspectjdemo.junit;
import static org.junit.Assert.*;
import org.junit.Test;
import com.test.aspectjdemo.domain.AspectTarget;
public class AspectTargetTest {
@Test
public void testFirstMethod() throws Throwable {
AspectTarget aspectTarget = new AspectTarget();
aspectTarget.firstMethod();
}
}
VM arg: -javaagent:C:....m2\repository\org\aspectj\aspectjweaver\1.8.1\aspectjweaver-1.8.1.jar
Class under test (I had some problems because proceed apparently declares that it throws Throwable, which makes sense, but this simple test didn't throw anything, so I added a faux exception to make it compile):
package com.test.aspectjdemo.domain;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
public class AspectTarget {
final Logger logger = LogManager.getLogger();
int x = 1;
public void firstMethod() throws Throwable {
logger.info("Start First Method");
x = secondMethod(x);
logger.info("Exit X is {}", x);
}
private int secondMethod(int x) throws Throwable {
logger.info("input is {}", x++);
if (x==100)
throw new RuntimeException();
return new Integer(x);
}
}
The Aspect:
package com.test.aspectjdemo.aspects;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.annotation.Pointcut;
public aspect LoggingAspect {
static final Logger logger = LogManager.getLogger();
/**
* Exclude JUnit methods
*/
@Pointcut("!within(com.test.aspectjdemo.junit..*Test)")
public void noJunit() {}
@Pointcut("execution(* com.test.aspectjdemo.domain.*.*(..)) && noJunit()")
public void allMethods() { }
@Around("allMethods()")
public Object allmethods(ProceedingJoinPoint joinPoint) throws Throwable {
return joinPoint.proceed();
}
}
Finally, once again the error, which is actually thrown when JUnit attempts to instantiate the AspectTarget (the first line of the testFirstMethod method).
java.lang.VerifyError: Bad type on operand stack
Exception Details:
Location:
com/test/aspectjdemo/domain/AspectTarget.secondMethod(I)I #23: invokestatic
Reason:
Type 'org/aspectj/lang/JoinPoint' (current frame, stack[2]) is not assignable to integer
Current Frame:
bci: #23
flags: { }
locals: { 'com/test/aspectjdemo/domain/AspectTarget', integer, integer, 'org/aspectj/lang/JoinPoint' }
stack: { 'com/test/aspectjdemo/domain/AspectTarget', integer, 'org/aspectj/lang/JoinPoint', 'com/test/aspectjdemo/aspects/LoggingAspect', null, 'org/aspectj/lang/JoinPoint' }
Bytecode:
0000000: 1b3d b200 4b2a 2a1c b800 51b8 0057 4e2a
0000010: 1c2d b800 6601 2db8 006a b800 6dac
at com.test.aspectjdemo.junit.AspectTargetTest.testFirstMethod(AspectTargetTest.java:13)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
The problem is that you are mixing native AspectJ syntax (public aspect LoggingAspect) with annotation-style @AspectJ syntax. I can reproduce the problem this way.
Correct @AspectJ syntax:
package com.test.aspectjdemo.aspects;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
@Aspect
public class LoggingAspect {
@Pointcut("!within(com.test.aspectjdemo.junit..*Test)")
public void excludeJUnit() {}
@Pointcut("execution(* com.test.aspectjdemo.domain.*.*(..)) && excludeJUnit()")
public void allMethods() {}
@Around("allMethods()")
public Object allmethods(ProceedingJoinPoint thisJoinPoint) throws Throwable {
System.out.println(thisJoinPoint);
return thisJoinPoint.proceed();
}
}
Correct native AspectJ syntax:
package com.test.aspectjdemo.aspects;
public aspect LoggingAspect {
pointcut excludeJUnit() :
!within(com.test.aspectjdemo.junit..*Test);
pointcut allMethods() :
execution(* com.test.aspectjdemo.domain.*.*(..)) && excludeJUnit();
Object around() : allMethods() {
System.out.println(thisJoinPoint);
return proceed();
}
}

Using GWT Editors with a complex usecase

I'm trying to create a page which is very similar to the Google Form creation page.
This is how I am attempting to model it using the GWT MVP framework (Places and Activities), and Editors.
CreateFormActivity (Activity and presenter)
CreateFormView (interface for view, with nested Presenter interface)
CreateFormViewImpl (implements CreateFormView and Editor<FormProxy>)
CreateFormViewImpl has the following sub-editors:
TextBox title
TextBox description
QuestionListEditor questionList
QuestionListEditor implements IsEditor<ListEditor<QuestionProxy, QuestionEditor>>
QuestionEditor implements Editor<QuestionProxy>
QuestionEditor has the following sub-editors:
TextBox questionTitle
TextBox helpText
ValueListBox questionType
An optional subeditor for each question type below.
An editor for each question type:
TextQuestionEditor
ParagraphTextQuestionEditor
MultipleChoiceQuestionEditor
CheckboxesQuestionEditor
ListQuestionEditor
ScaleQuestionEditor
GridQuestionEditor
Specific Questions:
What is the correct way to add/remove questions from the form? (see follow up question)
How should I go about creating the Editor for each question type? I attempted to listen to the questionType value changes, but I'm not sure what to do after that. (answered by BobV)
Should each question-type-specific editor be wrapped with an OptionalFieldEditor, since only one can be used at a time? (answered by BobV)
How do I best manage creating/removing objects deep in the object hierarchy, e.g. specifying answers for question number 3, which is a multiple-choice question? (see follow up question)
Can OptionalFieldEditor editor be used to wrap a ListEditor? (answered by BobV)
Implementation based on Answer
The Question Editor
public class QuestionDataEditor extends Composite implements
CompositeEditor<QuestionDataProxy, QuestionDataProxy, Editor<QuestionDataProxy>>,
LeafValueEditor<QuestionDataProxy>, HasRequestContext<QuestionDataProxy> {
interface Binder extends UiBinder<Widget, QuestionDataEditor> {}
private CompositeEditor.EditorChain<QuestionDataProxy, Editor<QuestionDataProxy>> chain;
private QuestionBaseDataEditor subEditor = null;
private QuestionDataProxy currentValue = null;
@UiField
SimplePanel container;
@UiField(provided = true)
@Path("dataType")
ValueListBox<QuestionType> dataType = new ValueListBox<QuestionType>(new Renderer<QuestionType>() {
@Override
public String render(final QuestionType object) {
return object == null ? "" : object.toString();
}
@Override
public void render(final QuestionType object, final Appendable appendable) throws IOException {
if (object != null) {
appendable.append(object.toString());
}
}
});
private RequestContext ctx;
public QuestionDataEditor() {
initWidget(GWT.<Binder> create(Binder.class).createAndBindUi(this));
dataType.setValue(QuestionType.BooleanQuestionType, true);
dataType.setAcceptableValues(Arrays.asList(QuestionType.values()));
/*
* The type drop-down UI element is an implementation detail of the
* CompositeEditor. When a question type is selected, the editor will
* call EditorChain.attach() with an instance of a QuestionData subtype
* and the type-specific sub-Editor.
*/
dataType.addValueChangeHandler(new ValueChangeHandler<QuestionType>() {
@Override
public void onValueChange(final ValueChangeEvent<QuestionType> event) {
QuestionDataProxy value;
switch (event.getValue()) {
case MultiChoiceQuestionData:
value = ctx.create(QuestionMultiChoiceDataProxy.class);
setValue(value);
break;
case BooleanQuestionData:
default:
final QuestionNumberDataProxy value2 = ctx.create(BooleanQuestionDataProxy.class);
value2.setPrompt("this value doesn't show up");
setValue(value2);
break;
}
}
});
}
/*
* The only thing that calls createEditorForTraversal() is the PathCollector
* which is used by RequestFactoryEditorDriver.getPaths().
*
* My recommendation is to always return a trivial instance of your question
* type editor and know that you may have to amend the value returned by
* getPaths()
*/
@Override
public Editor<QuestionDataProxy> createEditorForTraversal() {
return new QuestionNumberDataEditor();
}
@Override
public void flush() {
//XXX this doesn't work, no data is returned
currentValue = chain.getValue(subEditor);
}
/**
* Returns an empty string because there is only ever one sub-editor used.
*/
@Override
public String getPathElement(final Editor<QuestionDataProxy> subEditor) {
return "";
}
@Override
public QuestionDataProxy getValue() {
return currentValue;
}
@Override
public void onPropertyChange(final String... paths) {
}
@Override
public void setDelegate(final EditorDelegate<QuestionDataProxy> delegate) {
}
@Override
public void setEditorChain(final EditorChain<QuestionDataProxy, Editor<QuestionDataProxy>> chain) {
this.chain = chain;
}
@Override
public void setRequestContext(final RequestContext ctx) {
this.ctx = ctx;
}
/*
* The implementation of CompositeEditor.setValue() just creates the
* type-specific sub-Editor and calls EditorChain.attach().
*/
@Override
public void setValue(final QuestionDataProxy value) {
// if (currentValue != null && value == null) {
chain.detach(subEditor);
// }
QuestionType type = null;
if (value instanceof QuestionMultiChoiceDataProxy) {
if (((QuestionMultiChoiceDataProxy) value).getCustomList() == null) {
((QuestionMultiChoiceDataProxy) value).setCustomList(new ArrayList<CustomListItemProxy>());
}
type = QuestionType.CustomList;
subEditor = new QuestionMultipleChoiceDataEditor();
} else {
type = QuestionType.BooleanQuestionType;
subEditor = new BooleanQuestionDataEditor();
}
subEditor.setRequestContext(ctx);
currentValue = value;
container.clear();
if (value != null) {
dataType.setValue(type, false);
container.add(subEditor);
chain.attach(value, subEditor);
}
}
}
Question Base Data Editor
public interface QuestionBaseDataEditor extends HasRequestContext<QuestionDataProxy>, IsWidget {
}
Example Subtype
public class BooleanQuestionDataEditor extends Composite implements QuestionBaseDataEditor {
interface Binder extends UiBinder<Widget, BooleanQuestionDataEditor> {}
#Path("prompt")
#UiField
TextBox prompt = new TextBox();
public BooleanQuestionDataEditor() {
initWidget(GWT.<Binder> create(Binder.class).createAndBindUi(this));
}
@Override
public void setRequestContext(final RequestContext ctx) {
}
}
The only issue left is that QuestionData subtype-specific data isn't being displayed or flushed. I think it has to do with the Editor setup I'm using.
For example, the value for prompt in the BooleanQuestionDataEditor is neither set nor flushed, and is null in the RPC payload.
My guess is that since the QuestionDataEditor implements LeafValueEditor, the driver will not visit the sub-editor, even though it has been attached.
Big thanks to anyone who can help!!!
Fundamentally, you want a CompositeEditor to handle cases where objects are dynamically added or removed from the Editor hierarchy. The ListEditor and OptionalFieldEditor adaptors implement CompositeEditor.
If the information required for the different types of questions is fundamentally orthogonal, then multiple OptionalFieldEditor could be used with different fields, one for each question type. This will work when you have only a few question types, but won't really scale well in the future.
A different approach that would scale better is to use a custom implementation of CompositeEditor + LeafValueEditor that handles a polymorphic QuestionData type hierarchy. The type drop-down UI element would become an implementation detail of the CompositeEditor. When a question type is selected, the editor will call EditorChain.attach() with an instance of a QuestionData subtype and the type-specific sub-Editor. The newly-created QuestionData instance should be retained to implement LeafValueEditor.getValue(). The implementation of CompositeEditor.setValue() just creates the type-specific sub-Editor and calls EditorChain.attach().
FWIW, OptionalFieldEditor can be used with ListEditor or any other editor type.
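To illustrate that last point, here is a hedged sketch of an OptionalFieldEditor wrapping a ListEditor; ItemProxy and ItemEditor are invented placeholders rather than types from the question:
import java.util.List;

import com.google.gwt.editor.client.Editor;
import com.google.gwt.editor.client.adapters.EditorSource;
import com.google.gwt.editor.client.adapters.ListEditor;
import com.google.gwt.editor.client.adapters.OptionalFieldEditor;

// Hedged sketch of an OptionalFieldEditor wrapped around a ListEditor.
// ItemProxy and ItemEditor are placeholders invented for illustration.
public class OptionalListSketch {

    interface ItemProxy {
        String getLabel();
    }

    static class ItemEditor implements Editor<ItemProxy> {
        // sub-editors for ItemProxy's properties would normally live here
    }

    // Supplies a fresh ItemEditor for each list element the driver needs.
    private final EditorSource<ItemEditor> source = new EditorSource<ItemEditor>() {
        @Override
        public ItemEditor create(int index) {
            return new ItemEditor();
        }
    };

    private final ListEditor<ItemProxy, ItemEditor> listEditor = ListEditor.of(source);

    // Editable only while the underlying List<ItemProxy> is non-null.
    final OptionalFieldEditor<List<ItemProxy>, ListEditor<ItemProxy, ItemEditor>> items =
            OptionalFieldEditor.of(listEditor);
}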
We implemented a similar approach (see the accepted answer) and it works for us like this.
Since the driver is initially unaware of the simple editor paths that might be used by sub-editors, every sub-editor has its own driver:
public interface CreatesEditorDriver<T> {
RequestFactoryEditorDriver<T, ? extends Editor<T>> createDriver();
}
public interface RequestFactoryEditor<T> extends CreatesEditorDriver<T>, Editor<T> {
}
Then we use the following editor adapter, which allows any sub-editor that implements RequestFactoryEditor to be used. This is our workaround to support polymorphism in editors:
public static class DynamicEditor<T>
implements LeafValueEditor<T>, CompositeEditor<T, T, RequestFactoryEditor<T>>, HasRequestContext<T> {
private RequestFactoryEditorDriver<T, ? extends Editor<T>> subdriver;
private RequestFactoryEditor<T> subeditor;
private T value;
private EditorDelegate<T> delegate;
private RequestContext ctx;
public static <T> DynamicEditor<T> of(RequestFactoryEditor<T> subeditor) {
return new DynamicEditor<T>(subeditor);
}
protected DynamicEditor(RequestFactoryEditor<T> subeditor) {
this.subeditor = subeditor;
}
@Override
public void setValue(T value) {
this.value = value;
subdriver = null;
if (null != value) {
RequestFactoryEditorDriver<T, ? extends Editor<T>> newSubdriver = subeditor.createDriver();
if (null != ctx) {
newSubdriver.edit(value, ctx);
} else {
newSubdriver.display(value);
}
subdriver = newSubdriver;
}
}
@Override
public T getValue() {
return value;
}
@Override
public void flush() {
if (null != subdriver) {
subdriver.flush();
}
}
@Override
public void onPropertyChange(String... paths) {
}
@Override
public void setDelegate(EditorDelegate<T> delegate) {
this.delegate = delegate;
}
@Override
public RequestFactoryEditor<T> createEditorForTraversal() {
return subeditor;
}
@Override
public String getPathElement(RequestFactoryEditor<T> subEditor) {
return delegate.getPath();
}
@Override
public void setEditorChain(EditorChain<T, RequestFactoryEditor<T>> chain) {
}
@Override
public void setRequestContext(RequestContext ctx) {
this.ctx = ctx;
}
}
Our example sub-editor:
public static class VirtualProductEditor implements RequestFactoryEditor<ProductProxy> {
interface Driver extends RequestFactoryEditorDriver<ProductProxy, VirtualProductEditor> {}
private static final Driver driver = GWT.create(Driver.class);
public Driver createDriver() {
driver.initialize(this);
return driver;
}
...
}
Our usage example:
#Path("")
DynamicEditor<ProductProxy> productDetailsEditor;
...
public void setProductType(ProductType type){
if (ProductType.VIRTUAL==type){
productDetailsEditor = DynamicEditor.of(new VirtualProductEditor());
} else if (ProductType.PHYSICAL==type){
productDetailsEditor = DynamicEditor.of(new PhysicalProductEditor());
}
}
Would be great to hear your comments.
Regarding your question about why subtype-specific data isn't displayed or flushed:
My scenario is a little bit different but I made the following observation:
GWT Editor data binding does not work as one would expect with abstract editors in the editor hierarchy. The subEditor declared in your QuestionDataEditor is of type QuestionBaseDataEditor, and this is a fully abstract type (an interface). When looking for fields/sub-editors to populate with data or flush, GWT takes all the fields declared in this type. Since QuestionBaseDataEditor has no sub-editors declared, nothing is displayed or flushed. From debugging I found out that this happens because GWT uses a generated EditorDelegate for the abstract type rather than the EditorDelegate for the concrete subtype present at that moment.
In my case, all the concrete sub-editors had the same types of leaf value editors (I had two different concrete editors, one to display and one to edit the same bean type), so I could do something like this to work around the limitation:
interface MyAbstractEditor1 extends Editor<MyBean>
{
LeafValueEditor<String> description();
}
// or as an alternative
abstract class MyAbstractEditor2 implements Editor<MyBean>
{
@UiField protected LeafValueEditor<String> name;
}
class MyConcreteEditor extends MyAbstractEditor2 implements MyAbstractEditor1
{
@UiField TextBox description;
public LeafValueEditor<String> description()
{
return description;
}
// super.name is bound to a TextBox using UiBinder :)
}
Now GWT finds the sub-editors in the abstract base class, and in both cases I get the corresponding fields name and description populated and flushed.
Unfortunately this approach is not suitable when the concrete sub-editors have different values to edit in your bean structure :(
I think this is a bug in the Editor framework's GWT code generation that can only be solved by the GWT development team.
Isn't the fundamental problem that the binding happens at compile time, so it will only bind to QuestionDataProxy and won't have sub-type-specific bindings? The CompositeEditor javadoc says "An interface that indicates that a given Editor is composed of an unknown number of sub-Editors all of the same type", so doesn't that rule this usage out?
At my current job I'm pushing to avoid polymorphism altogether, as the RDBMS doesn't support it either. Sadly we do have some at the moment, so I'm experimenting with a dummy wrapper class that exposes all the sub-types with specific getters so the compiler has something to work on (a rough sketch follows below). Not pretty though.
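A rough sketch of the kind of dummy wrapper being described (all names invented for illustration):
// Hypothetical wrapper that exposes each concrete QuestionData subtype through
// its own getter, so the Editor framework can bind against concrete types at
// compile time. Only one getter returns non-null for a given wrapped value.
interface QuestionDataWrapperProxy {
    BooleanQuestionDataProxy getAsBoolean();
    QuestionMultiChoiceDataProxy getAsMultiChoice();
    // ... one getter (and matching sub-editor field in the wrapper's editor)
    // per concrete subtype ...
}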
Have you seen this post: http://markmail.org/message/u2cff3mfbiboeejr? It seems along the right lines.
I'm a bit worried about code bloat though.
Hope that makes some sort of sense!

How to conditionally serialize a field (attribute) using XStream

I am using XStream for serializing and de-serializing an object. For example, a class named Rating is defined as follows:
public class Rating {
String id;
int score;
int confidence;
// constructors here...
}
However, in this class, the variable confidence is optional.
So, when the confidence value is known (not 0), an XML representation of a Rating object should look like:
<rating>
<id>0123</id>
<score>5</score>
<confidence>10</confidence>
</rating>
However, when the confidence is unknown (the default value will be 0), the confidence
attribute should be omitted from the XML representation:
<rating>
<id>0123</id>
<score>5</score>
</rating>
Could anyone tell me how to conditionally serialize a field using XStream?
One option is to write a converter.
Here's one that I quickly wrote for you:
import com.thoughtworks.xstream.converters.Converter;
import com.thoughtworks.xstream.converters.MarshallingContext;
import com.thoughtworks.xstream.converters.UnmarshallingContext;
import com.thoughtworks.xstream.io.HierarchicalStreamReader;
import com.thoughtworks.xstream.io.HierarchicalStreamWriter;
public class RatingConverter implements Converter
{
@Override
public boolean canConvert(Class clazz) {
return clazz.equals(Rating.class);
}
@Override
public void marshal(Object value, HierarchicalStreamWriter writer,
MarshallingContext context)
{
Rating rating = (Rating) value;
// Write id
writer.startNode("id");
writer.setValue(rating.getId());
writer.endNode();
// Write score
writer.startNode("score");
writer.setValue(Integer.toString(rating.getScore()));
writer.endNode();
// Write confidence
if(rating.getConfidence() != 0)
{
writer.startNode("confidence");
writer.setValue(Integer.toString(rating.getConfidence()));
writer.endNode();
}
}
@Override
public Object unmarshal(HierarchicalStreamReader arg0,
UnmarshallingContext arg1)
{
return null;
}
}
All that's left for you to do is to register the converter, and provide accessor methods (i.e. getId, getScore, getConfidence) in your Rating class.
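For example, registration and use might look like the following sketch (the alias and the Rating constructor are assumptions, since the constructors are elided above):
import com.thoughtworks.xstream.XStream;

// Hedged sketch: wiring up the converter above. Assumes Rating exposes the
// accessors mentioned (getId, getScore, getConfidence) and a convenience
// constructor; both are assumptions, not shown in the question.
public class RatingSerializationExample {
    public static void main(String[] args) {
        XStream xstream = new XStream();
        xstream.alias("rating", Rating.class);            // emit <rating> instead of the full class name
        xstream.registerConverter(new RatingConverter());

        Rating withoutConfidence = new Rating("0123", 5, 0);
        System.out.println(xstream.toXML(withoutConfidence)); // no <confidence> element
    }
}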
Note: your other option would be to omit the field appropriately.
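If that second option refers to XStream's omitField API, a minimal illustration would be the sketch below; note that, unlike the converter, this drops the field from every instance rather than only when it is 0:
import com.thoughtworks.xstream.XStream;

// Hedged sketch of unconditionally omitting the field, for comparison only.
public class OmitFieldExample {
    public static void main(String[] args) {
        XStream xstream = new XStream();
        xstream.alias("rating", Rating.class);
        xstream.omitField(Rating.class, "confidence");     // confidence never appears in the XML
        System.out.println(xstream.toXML(new Rating("0123", 5, 10))); // assumed constructor
    }
}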