MicroProfile Config custom ConfigSource using JPA

I am currently trying to set up a custom ConfigSource that reads config values from our DB2. Since ConfigSources are loaded via the ServiceLoader, it looks like there is no way to access the database via JPA, because the ServiceLoader scans for custom ConfigSources very early.
Any ideas?

You can annotate your ConfigSource as a singleton session bean and mark it for eager initialization during the application startup sequence.
You also need to define a static member variable holding your config values.
With this setup you can lazy-load your property values from an injected JPA persistence context or from any other CDI or EJB bean.
See the following example code:
@Startup
@Singleton
public class MyConfigSource implements ConfigSource {

    public static final String NAME = "MyConfigSource";

    // note: the map must be static so the instance created by the Config API
    // sees the values loaded by the CDI-managed instance
    public static Map<String, String> properties = null;

    @PersistenceContext(unitName = ".....")
    private EntityManager manager;

    @PostConstruct
    void init() {
        // load your data from the JPA source or an EJB
        ....
    }

    @Override
    public int getOrdinal() {
        return 890;
    }

    @Override
    public String getValue(String key) {
        if (properties != null) {
            return properties.get(key);
        } else {
            return null;
        }
    }

    @Override
    public String getName() {
        return NAME;
    }

    @Override
    public Map<String, String> getProperties() {
        return properties;
    }
}
ConfigSources are plain POJOs: if a CDI bean expected config to be injected into it at startup based on a ConfigSource that itself had dependencies on CDI beans, you could run into startup looping issues.
For this reason the example ConfigSource is constructed twice: once early on by the Config API, and later by the CDI container, which calls @PostConstruct. Through the static variable 'properties' the CDI-managed instance supplies the values to the instance the Config API already constructed. Of course you can also separate the code into two classes if you like (see the sketch below).
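If you prefer the two-class variant, a minimal sketch could look like the one below. The class name ConfigValueLoader is made up for illustration, the persistence unit name and the query are placeholders, and imports are omitted as in the other snippets:

// Plain POJO picked up by the ServiceLoader; it only reads the static map.
public class MyConfigSource implements ConfigSource {

    static volatile Map<String, String> properties = new HashMap<>();

    @Override
    public int getOrdinal() {
        return 890;
    }

    @Override
    public String getValue(String key) {
        return properties.get(key);
    }

    @Override
    public String getName() {
        return "MyConfigSource";
    }

    @Override
    public Map<String, String> getProperties() {
        return properties;
    }
}

// Eagerly initialized EJB that fills the static map once JPA is available.
@Startup
@Singleton
public class ConfigValueLoader {

    @PersistenceContext(unitName = ".....")
    private EntityManager manager;

    @PostConstruct
    void init() {
        Map<String, String> loaded = new HashMap<>();
        // fill 'loaded' with the key/value pairs read via 'manager' (placeholder)
        MyConfigSource.properties = loaded;
    }
}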

Related

Problems when using EntityFilteringFeature and SelectableEntityFilteringFeature with Jersey 2

I'm new to Jersey 2 and JAX-RS, so probably I'm missing something.
What I'm trying to do is write a test program to define a coding style for REST service development.
The test is written in Java and uses Jersey 2.22.2, JDK 1.8.31, and MOXy as the JSON provider.
I defined a resource with GET methods to support list/detail operations. Due to the size of my POJO I used some filters, and everything was fine.
// 1) First of all I defined the annotation.
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@EntityFiltering
public @interface MyDetailView {

    public static class Factory extends AnnotationLiteral<MyDetailView>
            implements MyDetailView {

        private Factory() {
        }

        public static MyDetailView get() {
            return new Factory();
        }
    }
}
// 2) Once the annotation was defined, I used it to
// programmatically exclude the list of subItems from the response...
@XmlRootElement
public class MyPojo {
    ...

    // *** THIS SHOULD BE FILTERED IF THE ANNOTATION IS NOT SPECIFIED IN THE RESPONSE ***
    @MyDetailView
    private List<SubItem> subItems = new ArrayList<SubItem>();

    public List<SubItem> getSubItems() {
        return subItems;
    }

    public void setSubItems(List<SubItem> subItems) {
        this.subItems = subItems;
    }
}
// 3) I registered the EntityFilteringFeature
public class ApplicationConfig extends ResourceConfig {
    public ApplicationConfig() {
        ....
        register(EntityFilteringFeature.class);
    }
}
// 4) Finally, I wrote the code to include/exclude the subItems
/*
   The Resource class has getCollection() and getItem() methods...
   getCollection() adds the annotation only if filterStyle="detail"
   getItem() always adds the annotation
*/
@Path(....)
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public class MyResource extends SecuredResource {

    // filterStyle -> "detail" means MyDetailView
    @GET
    public Response getCollection(
            @QueryParam("filterStyle") String filterStyle,
            @Context UriInfo uriInfo) {

        // THIS CODE AFFECTS THE RESPONSE
        boolean detailedResponse = "detail".equals(filterStyle);
        Annotation[] responseAnnotations = detailedResponse
                ? new Annotation[0]
                : new Annotation[]{MyDetailView.Factory.get()};

        // pojo collection...
        MyPagedCollection myCollection = new MyPagedCollection();
        //.....
        ResponseBuilder builder = Response.ok();
        return builder.entity(myCollection, responseAnnotations).build();
    }

    @GET
    @Path("/{id}")
    public Response getItem(@PathParam("id") String idS, @Context UriInfo uriInfo) {
        MyPOJO pojo = ...
        Annotation[] responseAnnotations = new Annotation[]{MyDetailView.Factory.get()};
        return Response.ok().entity(pojo, responseAnnotations).build();
    }
}
After the first test, I tried to use the SelectableEntityFilteringFeature to allow the client to ask for specific fields in the detail, so I changed the ApplicationConfig
public class ApplicationConfig extends ResourceConfig {
    public ApplicationConfig() {
        ....
        register(EntityFilteringFeature.class);
        register(SelectableEntityFilteringFeature.class);
        property(SelectableEntityFilteringFeature.QUERY_PARAM_NAME, "fields");
    }
}
and I added the "fields" QueryParam to the Resource getItem() method...
@GET
@Path("/{id}")
public Response getDetail(@PathParam("id") String id,
        @QueryParam("fields") String fields,
        @Context UriInfo uriInfo) {
    ....
But as soon as I registered the SelectableEntityFilteringFeature class, the EntityFilteringFeature stopped working. When I added the "fields" parameter to one of the resource methods it worked perfectly, but the MyDetailView annotation was completely ignored.
I tried to register it using a DynamicFeature
public class MyDynamicFeature implements DynamicFeature {
    @Override
    public void configure(ResourceInfo resourceInfo, FeatureContext context) {
        if ("MyResource".equals(resourceInfo.getResourceClass().getSimpleName())
                && "getItem".equals(resourceInfo.getResourceMethod().getName())) {
            // *** IS THIS THE CORRECT WAY TO BIND A FEATURE TO A METHOD? ***
            context.register(SelectableEntityFilteringFeature.class);
            context.property(SelectableEntityFilteringFeature.QUERY_PARAM_NAME, "fields");
        }
    }
}
Now the questions:
1) Why does registering the SelectableEntityFilteringFeature break the EntityFilteringFeature?
2) What is the correct way to bind a feature to a method with the DynamicFeature interface?
Thanks in advance.
This is my first post to Stack Overflow; I hope it complies with the rules.
Short answer: you can't. It appears to be a bug as of 2.25.1 and up to 2.26 (the versions I tested with): https://github.com/jersey/jersey/issues/3523
SelectableEntityFilteringFeature implicitly registers EntityFilteringFeature (as mentioned here), so I don't see a need to register it separately.
Since you need annotation-based filtering, you can skip registering SelectableEntityFilteringFeature.
You can just do:
// Set entity-filtering scope via configuration.
.property(EntityFilteringFeature.ENTITY_FILTERING_SCOPE, new Annotation[] {MyDetailView.Factory.get()})
// Register the EntityFilteringFeature.
.register(EntityFilteringFeature.class)
// Further configuration of ResourceConfig.
You can refer to this example for usage and this example for registering the filter.
So you can remove SelectableEntityFilteringFeature and try just the above-mentioned way of registering it.

guice JpaPersistModule with runtime configuration

I need to share a DataSource with JpaPersistModule. This DataSource is provided by the Guice injector.
The problem is that I have to build the module during the configuration phase, but the DataSource is available only at runtime.
Currently I have the following code:
public class MyJpaConfigurationModule implements Module {

    private Map<String, Object> jpaProperties = new HashMap<>();
    private Module jpaModule = new JpaPersistModule("persistenceUnit").properties(jpaProperties);

    public void configure(Binder binder) {
        binder.requestInjection(this);
        binder.install(jpaModule);
    }

    @Provides @Singleton
    public DataSource provideDatasource() {
        return ..... // some data source
    }

    @Inject
    public void setJpaProperties(DataSource dataSource, PersistService persistService) {
        jpaProperties.put("dataSource", dataSource);
        persistService.start();
    }
}
I have checked, and it seems the JPA properties map is passed around by reference everywhere, so my runtime changes should become visible, but what if this changes in the future?
What is the correct way to resolve such conflicts?

OSGI - two objects of a bundle service

I have a bundle that provides a service.
My bundle implementation looks like this:
class ServiceImpl implements Service {

    Object value;

    @Override
    public void setValue(Object value) {
        this.value = value;
    }

    @Override
    public Object getValue() {
        return value;
    }
}
In my Java application I load this bundle into the OSGi framework and obtain TWO references to the service, in an attempt to have two objects with different values for "value".
Unfortunately, this does not seem to work. The service always returns the last value set by either object. How can I overcome this issue?
Here's an example for the problem:
Service object1 = context.getService(reference1);
Service object2 = context.getService(reference2);
Integer one = 1;
Integer two = 2;
object1.setValue(one);
object2.setValue(two);
System.out.println(object1.getValue()); // returns 2 !!!
System.out.println(object2.getValue()); // returns 2
I tried ServiceFactory, but it does not seem to help in my case. What should I do? Thanks.
Both BJ and Balazs offer valuable information, but no solution that works with current versions of the OSGi specification.
What you can do is register your service with a second "Factory" interface. This factory then allows you to create instances of the service. Because you probably don't want to do that manually, you can hide this logic in a ServiceTracker.
There are a few "downsides" to this approach. First of all, you need to register the service and have the instance implement both Factory and Service. Secondly, you always have to use this custom ServiceTracker to access it. If you use a dependency manager that allows you to extend its dependencies (such as Apache Felix Dependency Manager) you can easily hide all of this in a custom ServiceDependency.
Anyway, to show you that this actually works, here is a simple example:
public class Activator implements BundleActivator {

    @Override
    public void start(final BundleContext context) throws Exception {
        context.registerService(Service.class.getName(), new FactoryImpl(), null);

        ServiceTrackerCustomizer customizer = new ServiceTrackerCustomizer() {
            @Override
            public Object addingService(ServiceReference reference) {
                Object service = context.getService(reference);
                if (service instanceof Factory) {
                    return ((Factory) service).createInstance();
                }
                return service;
            }

            @Override
            public void modifiedService(ServiceReference reference, Object service) {
                // TODO Auto-generated method stub
            }

            @Override
            public void removedService(ServiceReference reference, Object service) {
                // TODO Auto-generated method stub
            }
        };

        ServiceTracker st1 = new ServiceTracker(context, Service.class.getName(), customizer);
        ServiceTracker st2 = new ServiceTracker(context, Service.class.getName(), customizer);
        st1.open();
        st2.open();

        Service s1 = (Service) st1.getService();
        Service s2 = (Service) st2.getService();
        s1.setValue("test1");
        s2.setValue("test2");
        System.out.println(s1.getValue());
        System.out.println(s2.getValue());
    }

    @Override
    public void stop(BundleContext context) throws Exception {
    }

    static interface Factory {
        public Object createInstance();
    }

    static class FactoryImpl extends ServiceImpl implements Factory, Service {
        @Override
        public Object createInstance() {
            return new ServiceImpl();
        }
    }

    static interface Service {
        public void setValue(Object value);
        public Object getValue();
    }

    static class ServiceImpl implements Service {
        private Object m_value;

        @Override
        public void setValue(Object value) {
            m_value = value;
        }

        @Override
        public Object getValue() {
            return m_value;
        }
    }
}
You need to wait for R6. Pre-R6, each bundle can be exposed to at most one instance of a service. Even registering a ServiceFactory will not change that since the framework will cache the service object from the ServiceFactory to return to the bundle on subsequent calls to getService.
In R6 we introduce service scopes, which allow a service implementation to return multiple service objects to a bundle. Using this requires both the service provider and the service consumer to use new API added in R6.
You can play with this now as it is implemented in Eclipse Equinox Luna.
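For illustration, here is a minimal sketch of the R6 API: a PrototypeServiceFactory on the provider side and BundleContext.getServiceObjects() on the consumer side. The Service and ServiceImpl types are the ones from the question; everything else is just wiring:

// Provider side: register a PrototypeServiceFactory (OSGi R6) so consumers
// can request their own instances.
context.registerService(Service.class, new PrototypeServiceFactory<Service>() {
    @Override
    public Service getService(Bundle bundle, ServiceRegistration<Service> registration) {
        return new ServiceImpl();
    }

    @Override
    public void ungetService(Bundle bundle, ServiceRegistration<Service> registration,
            Service service) {
        // nothing to clean up in this sketch
    }
}, null);

// Consumer side: getServiceObjects() hands out a new instance per getService()
// call because the service was registered with prototype scope.
ServiceReference<Service> ref = context.getServiceReference(Service.class);
ServiceObjects<Service> serviceObjects = context.getServiceObjects(ref);
Service object1 = serviceObjects.getService();
Service object2 = serviceObjects.getService();
object1.setValue(1);
object2.setValue(2);
System.out.println(object1.getValue()); // 1
System.out.println(object2.getValue()); // 2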
Even if you use ServiceFactory, for the same bundle the same service object will be returned.
There might be a PrototypeServiceFactory in the future as there is an RFP about it: https://github.com/osgi/design/tree/master/rfcs/rfc0195
That would fit to your needs.
Although there might be a PrototypeServiceFactory in the future, I think it is better to solve this use-case programmatically by yourself. E.g.:
Instead of creating a mutable OSGi service (I do not think creating mutable services is a good idea), create a factory.
On the client side you would use:
BusinessLogicFactory factory = context.getService(reference);
BusinessLogic object1 = factory.createInstance();
BusinessLogic object2 = factory.createInstance();
...
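For completeness, the provider side of that approach might look roughly like this; BusinessLogicFactory, BusinessLogicFactoryImpl and BusinessLogicImpl are only illustrative names matching the snippet above, one type per file:

public interface BusinessLogicFactory {
    BusinessLogic createInstance();
}

public class BusinessLogicFactoryImpl implements BusinessLogicFactory {
    @Override
    public BusinessLogic createInstance() {
        // every caller gets its own, independent instance
        return new BusinessLogicImpl();
    }
}

// In the provider bundle's BundleActivator.start(BundleContext context):
context.registerService(BusinessLogicFactory.class, new BusinessLogicFactoryImpl(), null);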

Morphia converter calling other converters

I want to convert Optional<BigDecimal> in Morphia. I created a BigDecimalConverter, and it works fine. Now I want to create an OptionalConverter.
Optional can hold any object type. In my OptionalConverter.encode method I can extract the underlying object, and I'd like to pass it on to the default Mongo conversion, so that if it holds a String I just get a String, and if it holds one of my entities I get the encoded entity. How can I do that?
There are two questions:
1. How to call other converters?
2. How to create a converter for a generic class whose type parameters are not statically known?
The first one is possible by creating the MappingMongoConverter and the custom converter together:
@Configuration
public class CustomConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        // ...
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        // ...
    }

    @Override
    @Bean
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        MappingMongoConverter mmc = new MappingMongoConverter(
                mongoDbFactory(), mongoMappingContext());
        mmc.setCustomConversions(new CustomConversions(CustomConverters.create(mmc)));
        return mmc;
    }
}

public class FooConverter implements Converter<Foo, DBObject> {

    private MappingMongoConverter mmc;

    public FooConverter(MappingMongoConverter mmc) {
        this.mmc = mmc;
    }

    public DBObject convert(Foo foo) {
        // ...
    }
}

public class CustomConverters {

    public static List<Object> create(MappingMongoConverter mmc) {
        List<Object> list = new ArrayList<>();
        list.add(new FooConverter(mmc));
        return list;
    }
}
The second one is much more difficult due to type erasure. I've tried to create a converter for Scala's Map but haven't found a way: I was unable to get the exact type information for the source Map when writing, or for the target Map when reading.
It may be possible for very simple cases though, e.g. if you don't need to handle all possible parameter types and there is no ambiguity while reading.

Using structuremap with log4net wrapper

I have the following interface:
public interface ILogger
{
void Debug(string message, params object[] values);
void Info(string message, params object[] values);
void Warn(string message, params object[] values);
void Error(string message, params object[] values);
void Fatal(string message, params object[] values);
}
and the following implementation:
public class Log4netLogger : ILogger
{
private ILog _log;
public Log4netLogger(Type type)
{
_log = LogManager.GetLogger(type);
}
public void Debug(string message, params object[] values)
{
_log.DebugFormat(message, values);
}
// other logging methods here...
}
My idea was to use StructureMap to instantiate the Log4netLogger class using the Type of the class doing the logging. However, I can't for the life of me figure out how to pass the type of the calling class to StructureMap so that it can be passed to the constructor of the logging implementation. Any advice on how to do that (or a better way) would be most appreciated.
We use a similar ILogger wrapper around log4net and typically use constructor injection. We use an interceptor as a factory method responsible for creating the Logger. Here is our typical registry for logging setup.
public class CommonsRegistry : Registry
{
public CommonsRegistry()
{
For<ILogger>()
.AlwaysUnique()
.TheDefault.Is.ConstructedBy(s =>
{
if (s.ParentType == null)
return new Log4NetLogger(s.BuildStack.Current.ConcreteType);
return new Log4NetLogger(s.ParentType);
});
var applicationPath = Path.GetDirectoryName(Assembly.GetAssembly(GetType()).Location);
var configFile = new FileInfo(Path.Combine(applicationPath, "log4net.config"));
XmlConfigurator.ConfigureAndWatch(configFile);
}
}
The parent type null check is necessary when there are dependencies on concrete types.
The rest is optional log4net setup stuff.
One thing I do like about this setup is the ability to use a null logger for unit testing.
If the type parameter is context-specific, I don't think this is going to work as shown. If you need to pass something context specific in the constructor, you are likely going to have to create a factory interface and implementation that returns an instance of the ILogger:
public interface ILoggerFactory
{
ILogger Create(Type type);
}
public class LoggerFactory : ILoggerFactory
{
public ILogger Create(Type type)
{
return new Log4netLogger(type);
}
}
It might be possible to bootstrap StructureMap to supply the instance you want based on the type, but that assumes a limited number of types that you know in advance.
I really need to get out of the habit of answering my own question, but for those who run across it, here's the answer.
return ObjectFactory.With(type).GetInstance<T>();
I actually have a wrapper around StructureMap (to avoid exposing the StructureMap dependency to my app) that looks like the following:
public static class ServiceManager
{
public static T Get<T>()
{
return ObjectFactory.GetInstance<T>();
}
public static T Get<T>(Type type)
{
return ObjectFactory.With(type).GetInstance<T>();
}
}
Any time in the code I need a logger, I call the following:
ServiceManager.Get<ILogger>(GetType()).Info("Logging page view...");