I'm trying to migrate an app from JBoss 5.1 to 7.1 and I get an error like this: Error message. I'm not sure why I get it. If anyone has any idea, please help me.
Update: 1
@Stateless
@Remote(PackageService.class)
@Interceptors(CrossContextSpringBeanAutowiringInterceptor.class)
@WebContext(contextRoot="/appname_web_services", urlPattern="/MaintenanceService", authMethod="", secureWSDLAccess=false)
@WebService(
    name="MaintenanceService",
    targetNamespace = "http://appname.com/web/services",
    serviceName = "MaintenanceService")
@SOAPBinding(parameterStyle = SOAPBinding.ParameterStyle.WRAPPED)
@HandlerChain(file = "WebServiceHandlerChains.xml")
@TransactionTimeout(10800)
public class MaintenanceServiceBean implements MaintenanceService {

    private static final Logger logger = Logger.getLogger(MaintenanceServiceBean.class);

    @Resource(mappedName="/ConnectionFactory")
    ConnectionFactory connectionFactory;

    @Resource(mappedName="topic/manager_system_topic")
    javax.jms.Destination systemTopic;

    @Autowired
    MaintenanceService MigrationService;

    @WebMethod
    public List<Long> getSoftDeletedPackageIds(Long performedBy) throws Exception {
        return MigrationService.getSoftDeletedPackageIds(null);
    }

    // ...
}
This is the class where I believe it fails.
You are using an interface in your JAXB mappings for which you have not provided enough information to the runtime for it to be able to bind an actual implementation. Without more code included in your question it's hard to recommend a specific solution, but typically you would annotate the property holding the included interface with @XmlAnyElement.
You can read through this useful tutorial to determine the best solution for your particular case.
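As a rough illustration (this wrapper class is made up, not taken from your code), the usual pattern is to expose the interface-typed value as Object and annotate it with @XmlAnyElement(lax = true), so the runtime can substitute a known concrete @XmlRootElement class when (un)marshalling:

import javax.xml.bind.annotation.XmlAnyElement;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement
public class ResponseWrapper {

    // The actual payload implements an interface JAXB cannot bind on its own.
    // With lax = true, JAXB resolves the element to a known @XmlRootElement
    // class at runtime instead of failing on the interface type.
    @XmlAnyElement(lax = true)
    private Object payload;

    public Object getPayload() {
        return payload;
    }

    public void setPayload(Object payload) {
        this.payload = payload;
    }
}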
I'm trying to connect to two different MongoDBs with Spring (1.5.2 -- we included Spring in an internal framework, therefore it is not the latest version yet), and this already works partially but not fully. More precisely, I found a strange behavior which I will describe below, after showing my setup.
So this is what I have done so far:
Project structure
backend
  config
  domain
    customer
    internal
  repository
    customer
    internal
  service
In config I have my Mongo configurations.
I created one base class which extends AbstractMongoConfiguration. This class holds fields for database, host, etc., which are filled with the properties from an application.yml. It also holds a couple of methods for creating a MongoClient and a SimpleMongoDbFactory.
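For context, a minimal sketch of such a base class might look like the following; the property names (host, port, database, username, password) and the helper signatures are assumptions inferred from how they are used in the two configuration classes below:

import java.util.Collections;
import java.util.List;

import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;

import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public abstract class BaseMongoConfig extends AbstractMongoConfiguration {

    // Filled from application.yml via @ConfigurationProperties on the subclasses
    private String host;
    private int port;
    private String database;
    private String username;
    private String password;

    @Override
    protected String getDatabaseName() {
        return database;
    }

    protected ServerAddress getAddress() {
        return new ServerAddress(host, port);
    }

    protected List<MongoCredential> getCredentials() {
        return Collections.singletonList(
                MongoCredential.createCredential(username, database, password.toCharArray()));
    }

    protected MongoClient getMongoClient(ServerAddress address, List<MongoCredential> credentials) {
        return new MongoClient(address, credentials);
    }

    protected SimpleMongoDbFactory getSimpleMongoDbFactory(MongoClient client, String databaseName) {
        return new SimpleMongoDbFactory(client, databaseName);
    }

    // Getters and setters for the bound properties omitted for brevity.
}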
Furthermore, there are two custom configuration classes, one for each MongoDB. Both extend the base class.
Here is how they are coded:
Primary Connection
@Primary
@EntityScan(basePackages = "backend.domain.customer")
@Configuration
@EnableMongoRepositories(
    basePackages = {"backend.repository.customer"},
    mongoTemplateRef = "customerDataMongoTemplate")
@ConfigurationProperties(prefix = "customer.mongodb")
public class CustomerDataMongoConnection extends BaseMongoConfig {

    public static final String TEMPLATE_NAME = "customerDataMongoTemplate";

    @Override
    @Bean(name = CustomerDataMongoConnection.TEMPLATE_NAME)
    public MongoTemplate mongoTemplate() {
        MongoClient client = getMongoClient(getAddress(), getCredentials());
        SimpleMongoDbFactory factory = getSimpleMongoDbFactory(client, getDatabaseName());
        return new MongoTemplate(factory);
    }
}
The second configuration class looks pretty similar. Here it is:
@EntityScan(basePackages = "backend.domain.internal")
@Configuration
@EnableMongoRepositories(
    basePackages = {"backend.repository.internal"},
    mongoTemplateRef = InternalDataMongoConnection.TEMPLATE_NAME
)
@ConfigurationProperties(prefix = "internal.mongodb")
public class InternalDataMongoConnection extends BaseMongoConfig {

    public static final String TEMPLATE_NAME = "internalDataMongoTemplate";

    @Override
    @Bean(name = InternalDataMongoConnection.TEMPLATE_NAME)
    public MongoTemplate mongoTemplate() {
        MongoClient client = getMongoClient(getAddress(), getCredentials());
        SimpleMongoDbFactory factory = getSimpleMongoDbFactory(client, getDatabaseName());
        return new MongoTemplate(factory);
    }
}
As you can see, I use @EnableMongoRepositories to define which repository should use which connection.
My repositories are defined just as described in the Spring documentation.
However, here is one example, which is located in the package backend.repository.customer:
public interface ContactHistoryRepository extends MongoRepository<ContactHistoryEntity, String> {
    public ContactHistoryEntity findById(String id);
}
The problem is that, with this setup, my backend only ever uses the primary connection. Interestingly, when I remove the bean name for the MongoTemplate (just @Bean), the backend then uses the secondary connection (InternalDataMongoConnection). This is true for all defined repositories.
My question is: how can I get my backend to really take care of both connections? Did I miss another parameter/configuration?
Since this is a pretty extensive post, I apologise if I forgot to mention something. Please ask for missing information in the comments.
I found the answer.
In my package structure there was an empty configuration class (written by a colleague) with the annotations @Configuration and @EnableMongoRepositories. This triggered the automatic wiring process of Spring Data and therefore led to the problems I reported above.
I simply deleted the class and now it works as it should!
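For anyone hitting the same thing, the leftover class looked roughly like this (the class name here is made up). Without basePackages or mongoTemplateRef, such a class scans from its own package downwards and registers the repositories once more against a single default template, bypassing the per-connection configuration above:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

// Empty leftover configuration; its mere presence triggers a second,
// package-wide repository scan with the default MongoTemplate reference.
@Configuration
@EnableMongoRepositories
public class LeftoverMongoConfig {
}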
Does anyone have any experience with getServiceReference returning null for what seems like no reason?
The following bundle registers the service, and then proceeds to confirm that it's registered (whether or not this is even a valid test from the same package, idk).
package db.connector;
...
public class Activator implements BundleActivator {

    private static ServiceRegistration registration;
    ...
    public void start(BundleContext _context) throws Exception {
        DatabaseConnector dbc = new DatabaseConnectorImpl();
        registration = context.registerService(
                DatabaseConnector.class.getName(),
                dbc, null);
        checkServiceRegistered();
    }
    ...
    public void checkServiceRegistered() {
        System.out.println("Printing all entries:");
        ServiceReference sr = context.getServiceReference(DatabaseConnector.class.getName());
        DatabaseConnector dbc = (DatabaseConnector) context.getService(sr);
        List<Protocol> result = dbc.getAllProtocols();
        for (int i = 0; i < result.size(); i++) {
            Protocol p = result.get(i);
            System.out.println("\t" + p.getId() + ": " + p.getName() + " (" + p.getOwner() + ")");
        }
    }
}
The output runs successfully and everything seems OK. Checking in the Karaf web console, the service seems to be registered correctly:
267 [db.connector.DatabaseConnector] database-connector (144)
The code to get the registered service is as follows:
import db.connector.DatabaseConnector;
...
public List<Protocol> printAllEntries() {
    ServiceReference sr = Activator.getContext().getServiceReference(DatabaseConnector.class.getName());
    DatabaseConnector dbc = (DatabaseConnector) Activator.getContext().getService(sr);
    return dbc.getAllProtocols();
}
...
...
The DatabaseConnector bundle exports the correct package, and the one using the service imports the same.
What could possibly be going wrong here? I'm at a complete loss.
It looks alright.
What comes to mind: is the ordering OK? Are you sure the registration is done before you check the reference? The way you check in printAllEntries only tells you whether the service is there at that exact moment. As OSGi bundles can come and go, this isn't a reliable way to check. You should use either a ServiceTracker, or better still something like Declarative Services or Blueprint.
You could add a ServiceListener to the BundleContext, then you can print out what's happening in what order.
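For example, a quick diagnostic listener might look like the sketch below; the filter string assumes the DatabaseConnector interface from your question, and the class name is made up. Register it from your activator's start() method:

import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceEvent;
import org.osgi.framework.ServiceListener;

public class DebugServiceListener implements ServiceListener {

    // Call this once from the activator's start() method.
    public static void register(BundleContext context) throws Exception {
        // Only report events for the DatabaseConnector service.
        String filter = "(objectClass=db.connector.DatabaseConnector)";
        context.addServiceListener(new DebugServiceListener(), filter);
    }

    @Override
    public void serviceChanged(ServiceEvent event) {
        switch (event.getType()) {
            case ServiceEvent.REGISTERED:
                System.out.println("DatabaseConnector REGISTERED");
                break;
            case ServiceEvent.UNREGISTERING:
                System.out.println("DatabaseConnector UNREGISTERING");
                break;
            default:
                System.out.println("DatabaseConnector event type: " + event.getType());
        }
    }
}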
Hope this helps.
Turns out it was just that I hadn't refreshed the OSGi bundles. My servlet was pointing to a now-obsolete bundle ID, so of course the service lookup was failing.
In my GWT application I'm trying to set up a DI mechanism which would allow me to have all the commonly necessary stuff at hand everywhere. I'm using google-gin, which is an adaptation of Guice for GWT. I have an injector interface defined as this:
@GinModules(InjectionClientModule.class)
public interface MyInjector extends Ginjector {
    public PlaceController getPlaceController();
    public Header getHeader();
    public Footer getFooter();
    public ContentPanel getContent();
    public EventBus getEventBus();
    public PlaceHistoryHandler getPlaceHistoryHandler();
}
My injection module is this:
public class InjectionClientModule extends AbstractGinModule {

    public InjectionClientModule() {
        super();
    }

    protected void configure() {
        bind(Header.class).in(Singleton.class);
        bind(Footer.class).in(Singleton.class);
        bind(ContentPanel.class).in(Singleton.class);
        bind(EventBus.class).to(SimpleEventBus.class).in(Singleton.class);
        bind(PlaceController.class).toProvider(PlaceControllerProvider.class).asEagerSingleton();
        bind(PlaceHistoryHandler.class).toProvider(PlaceHistoryHandlerProvider.class).asEagerSingleton();
    }
}
When calling MyInjector injector = GWT.create(MyInjector.class); I'm getting the following exception:
java.lang.NullPointerException: null
at com.google.gwt.inject.rebind.BindingsProcessor.createImplicitBinding(BindingsProcessor.java:498)
at com.google.gwt.inject.rebind.BindingsProcessor.createImplicitBindingForUnresolved(BindingsProcessor.java:290)
at com.google.gwt.inject.rebind.BindingsProcessor.createImplicitBindingsForUnresolved(BindingsProcessor.java:278)
at com.google.gwt.inject.rebind.BindingsProcessor.process(BindingsProcessor.java:240)
at com.google.gwt.inject.rebind.GinjectorGeneratorImpl.generate(GinjectorGeneratorImpl.java:76)
at com.google.gwt.inject.rebind.GinjectorGenerator.generate(GinjectorGenerator.java:47)
at com.google.gwt.core.ext.GeneratorExtWrapper.generate(GeneratorExtWrapper.java:48)
at com.google.gwt.core.ext.GeneratorExtWrapper.generateIncrementally(GeneratorExtWrapper.java:60)
at com.google.gwt.dev.javac.StandardGeneratorContext.runGeneratorIncrementally(StandardGeneratorContext.java:647)
at com.google.gwt.dev.cfg.RuleGenerateWith.realize(RuleGenerateWith.java:41)
at com.google.gwt.dev.shell.StandardRebindOracle$Rebinder.rebind(StandardRebindOracle.java:78)
at com.google.gwt.dev.shell.StandardRebindOracle.rebind(StandardRebindOracle.java:268)
at com.google.gwt.dev.shell.ShellModuleSpaceHost.rebind(ShellModuleSpaceHost.java:141)
at com.google.gwt.dev.shell.ModuleSpace.rebind(ModuleSpace.java:585)
at com.google.gwt.dev.shell.ModuleSpace.rebindAndCreate(ModuleSpace.java:455)
at com.google.gwt.dev.shell.GWTBridgeImpl.create(GWTBridgeImpl.java:49)
at com.google.gwt.core.client.GWT.create(GWT.java:97)
The problem is that the PlaceController class actually depends on one of the other dependencies. I've implemented its provider like this:
public class PlaceControllerProvider implements Provider<PlaceController> {

    private final PlaceController placeController;

    @Inject
    public PlaceControllerProvider(EventBus eventBus) {
        this.placeController = new PlaceController(eventBus);
    }

    @Override
    public PlaceController get() {
        return placeController;
    }
}
What should I change for this to work?
Old question, but having the same problem I kept landing here, so: I finally found a way to tell which class is breaking the ginjection.
When I launched my app in development mode with the log level set to Trace, I noticed there is a step called "Validating newly compiled units".
Under this step I had an error, but I didn't notice it at first since I had to expand two nodes which weren't even colored red.
The error was "No source code available for type com.xxx.xxxx ...", which was due to a bad import on the client side that couldn't be converted to JavaScript.
Hope this may help others here!
While I'm not actually seeing how the errors you're getting are related to the PlaceController being injected, I do see that the provider returns a singleton PlaceController even if the provider were not bound as an eager singleton or in a different scope. The correct way to write that provider would be:
public class PlaceControllerProvider implements Provider<PlaceController> {

    private final EventBus eventBus;

    @Inject
    public PlaceControllerProvider(EventBus eventBus) {
        this.eventBus = eventBus;
    }

    @Override
    public PlaceController get() {
        return new PlaceController(eventBus);
    }
}
Let Guice handle the scoping, i.e. "let Guice work for you".
Other than that, I almost bet that your problem is due to the use of asEagerSingleton. I recommend you try this with just in(Singleton.class), and I further posit that you didn't really need the singleton to be eager. It seems others have had problems with this behavior too; there's some indication that it has to do with overusing asEagerSingleton or misunderstanding the @Singleton annotation in a few cases.
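Concretely, the module from your question would then look something like this (Header, Footer, ContentPanel and the two providers are your own classes, assumed to sit in the same package):

import com.google.gwt.event.shared.EventBus;
import com.google.gwt.event.shared.SimpleEventBus;
import com.google.gwt.inject.client.AbstractGinModule;
import com.google.gwt.place.shared.PlaceController;
import com.google.gwt.place.shared.PlaceHistoryHandler;
import com.google.inject.Singleton;

public class InjectionClientModule extends AbstractGinModule {

    @Override
    protected void configure() {
        bind(Header.class).in(Singleton.class);
        bind(Footer.class).in(Singleton.class);
        bind(ContentPanel.class).in(Singleton.class);
        bind(EventBus.class).to(SimpleEventBus.class).in(Singleton.class);
        // Lazy singletons instead of asEagerSingleton()
        bind(PlaceController.class).toProvider(PlaceControllerProvider.class).in(Singleton.class);
        bind(PlaceHistoryHandler.class).toProvider(PlaceHistoryHandlerProvider.class).in(Singleton.class);
    }
}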
I also got a lot of NullPointerException warnings using GIN 1.x with no real explanation of what happened. When I upgraded to GIN 2.0 I was told with high accuracy what the error was. You might be helped by upgrading to the 2.0 version, which was released a year after you asked this question.
Had the same problem, same trace, and the error was that I used "server" classes in my "client" classes, so GIN couldn't find these classes.
By "server" and "client" I mean the packages in my project.
Hope this helps.
I am working in J2EE 5 using JPA. I have a working solution, but I'm looking to clean up the structure.
I am using EntityListeners on some of the JPA objects I am persisting. The listeners are fairly generic but depend on the beans implementing an interface, which works great if you remember to add the interface.
I have not been able to determine a way to tie the EntityListener and the interface together so that I would get an exception that points in the right direction, or, even better, a compile-time error.
@Entity
@EntityListeners({CreateByListener.class})
public class Note implements CreatorInterface {
    private String message;....
    private String creator;
    ....
}
public interface CreatorInterface {
    public void setCreator(String creator);
}

public class CreateByListener {

    @PrePersist
    public void dataPersist(CreatorInterface data) {
        SUser user = LoginModule.getUser();
        data.setCreator(user.getName());
    }
}
This functions exactly the way I want it to, except when a new class is created that uses the CreateByListener but does not implement the CreatorInterface.
When this happens, a ClassCastException is thrown somewhere deep within the JPA engine, and only if I happen to remember this symptom can I figure out what went wrong.
I have not been able to figure out a way to require the interface, or to test for the presence of the interface before the listener is fired.
Any ideas would be appreciated.
@PrePersist
public void dataPersist(Object data) {
    if (!(data instanceof CreatorInterface)) {
        throw new IllegalArgumentException("The class "
                + data.getClass()
                + " should implement CreatorInterface");
    }
    CreatorInterface creatorInterface = (CreatorInterface) data;
    SUser user = LoginModule.getUser();
    creatorInterface.setCreator(user.getName());
}
This does basically the same thing as what you're doing, but at least you'll have a more readable error message indicating what's wrong, instead of the ClassCastException.
I just came from my nice tiny JavaSE/Guice world and am currently discovering the path of "carried by the container" EE6. After having some trouble with Glassfish 3.1, I just switched to JBoss and am now facing a problem that shouldn't be one.
As an infrastructural helper class, I'm trying to create a generic repository/DAO for any kind of entity. In a very simple manner, it might look like this one.
public class Repository<E, K extends Serializable & Comparable<K>> {

    private final Instance<EntityManager> entityManagerInstance;
    private final Class<E> domainObjectClass;

    protected final Class<E> getDomainObjectClass() {
        return domainObjectClass;
    }

    protected final EntityManager getEntityManager() {
        return entityManagerInstance.get();
    }

    @Inject
    public Repository(Instance<EntityManager> entityManageryProvider, Provider<E> domainObjectProvider) {
        // This is a dirty hack, sadly :(
        domainObjectClass = (Class<E>) domainObjectProvider.get().getClass();
        this.entityManagerInstance = entityManageryProvider;
    }

    public final void persist(E domainObject) {
        final EntityManager em = getEntityManager();
        em.persist(domainObject);
    }

    public final Collection<E> getAllEntities() {
        final EntityManager em = getEntityManager();
        final CriteriaBuilder cb = em.getCriteriaBuilder();
        final CriteriaQuery<E> query = cb.createQuery(getDomainObjectClass());
        query.from(getDomainObjectClass()); // query root, required for a valid criteria query
        final List<E> result = em.createQuery(query).getResultList();
        return Collections.unmodifiableList(result);
    }

    public final E find(K id) {
        Preconditions.checkNotNull(id);
        final EntityManager em = getEntityManager();
        return em.find(getDomainObjectClass(), id);
    }

    // [...]
}
Now there may be a bean that does not require entity-dependent query capabilities but just a repository of a certain entity type, like this (it might be a test case):
public class DomainObjectARepositoryTest {

    @Inject
    Repository<DomainObjectA, PersistableUUID> domainObjectARepository;

    @Test
    public void testMitarbeitererstellung() {
        for (DomainObjectA a : domainObjectARepository.getAllEntities()) {
            // do cool stuff
        }
    }
}
Unfortunately, Weld does not seem to like this kind of generic injection. At deployment time, I get the following error:
state=Create: org.jboss.weld.exceptions.DeploymentException: WELD-001408 Unsatisfied dependencies for type [Repository<DomainObjectA, PersistableUUID>] with qualifiers [@Default] at injection point [[field] @Inject sompackage.DomainObjectARepositoryTest.domainObjectARepository]
Am I missing something, or did they just forget to implement generic injection? As far as I understand generics, the type information is erased after compile time anyway; this worked fine in Guice 3 so far.
Kind regards,
avi
Edit: I found a comment by Gavin King saying that this behavior is in the spec but not implemented in Weld (the statement was from June 2009).
This is more a long comment than a complete answer to your question, but it might point you in the right direction:
I've been following the discussions on seam-dev and weld-dev for quite some time and do not remember anything like this ever popping up. So my guess would be that it hasn't been on the agenda since Gavin commented on it.
What you can do relatively easily to verify this assumption:
(a) Obtain a reference to the BeanManager and query it for the relevant bean type (or just for Object, to be on the safe side); of course you will have to remove @Inject in DomainObjectARepositoryTest in order to get the application started.
(b) Register an extension and listen to ProcessBean to see what comes up during deployment. That would be my suggested way to go; you'll find more information here.
With that outcome you should definitely be able to tell whether there are any bean types Repository<E, K extends Serializable & Comparable<K>> hanging around :-)
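For option (b), a minimal extension might be sketched like this (the class name is made up; remember to list it in META-INF/services/javax.enterprise.inject.spi.Extension):

import javax.enterprise.event.Observes;
import javax.enterprise.inject.spi.Extension;
import javax.enterprise.inject.spi.ProcessBean;

public class BeanTypeLoggingExtension implements Extension {

    // Logs every bean Weld discovers during deployment, together with its
    // bean types, so you can see whether any Repository<...> types show up.
    <T> void onProcessBean(@Observes ProcessBean<T> event) {
        System.out.println("Discovered bean " + event.getBean().getBeanClass()
                + " with types " + event.getBean().getTypes());
    }
}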
It would be cool if you'd report back here with the results and also consider filing a Jira issue in the negative case.