Apply Spring Data's ReactiveCrudRepository to Redis - reactive

I'm playing with Spring Boot 2 and WebFlux, and I'm trying to use ReactiveSortingRepository to simplify Redis operations.
public interface DataProfileRepository extends ReactiveSortingRepository<DataProfileDTO, String> {
}
When I simply use this interface like so:
Mono<DataProfileDTO> tmp = this.dataProfileRepository.findById(id);
the following exception is thrown:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [com.tradeshift.dgps.dto.DataProfileDTO] to type [reactor.core.publisher.Mono<?>]
at org.springframework.core.convert.support.GenericConversionService.handleConverterNotFound(GenericConversionService.java:321) ~[spring-core-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:194) ~[spring-core-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:174) ~[spring-core-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.data.repository.util.ReactiveWrapperConverters.toWrapper(ReactiveWrapperConverters.java:197) ~[spring-data-commons-2.0.2.RELEASE.jar:2.0.2.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutionResultHandler.postProcessInvocationResult(QueryExecutionResultHandler.java:104) ~[spring-data-commons-2.0.2.RELEASE.jar:2.0.2.RELEASE]
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:587) ~[spring-data-commons-2.0.2.RELEASE.jar:2.0.2.RELEASE]
The behavior of this repository doesn't match Reactor: in debug mode I can see that an actual DataProfileDTO is fetched from Redis, and the call then fails at:
GENERIC_CONVERSION_SERVICE.convert(reactiveObject, targetWrapperType);
in ReactiveWrapperConverters.toWrapper
From some googling, it seems the Spring Data Redis 2.0 documentation doesn't mention reactive repository support. I'm wondering whether I did something wrong in my code or whether Spring Data Redis 2.0 simply doesn't support ReactiveCrudRepository yet.

According to Spring's documentation for Reactive Redis Support, the highest level of abstraction for working with Redis reactively is the ReactiveRedisTemplate. The ReactiveRedisConnection is a lower-level abstraction that works with binary values (ByteBuffer) as input and output.
There is no mention of reactive repository support.
You can also consult the official reactive examples in the spring-data GitHub repo.
In order for all this to work, you need reactive support in the driver you're using; currently that means Lettuce.
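For illustration, here is a minimal sketch of wiring a ReactiveRedisTemplate for the DTO above (the JSON value serializer and the bean wiring are my assumptions, not something the documentation mandates for this case):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.ReactiveRedisConnectionFactory;
import org.springframework.data.redis.core.ReactiveRedisTemplate;
import org.springframework.data.redis.serializer.Jackson2JsonRedisSerializer;
import org.springframework.data.redis.serializer.RedisSerializationContext;
import org.springframework.data.redis.serializer.StringRedisSerializer;

@Configuration
public class ReactiveRedisConfig {

    @Bean
    public ReactiveRedisTemplate<String, DataProfileDTO> dataProfileTemplate(
            ReactiveRedisConnectionFactory factory) {
        // String keys, JSON-serialized DataProfileDTO values.
        RedisSerializationContext<String, DataProfileDTO> context = RedisSerializationContext
                .<String, DataProfileDTO>newSerializationContext(new StringRedisSerializer())
                .value(new Jackson2JsonRedisSerializer<>(DataProfileDTO.class))
                .build();
        return new ReactiveRedisTemplate<>(factory, context);
    }
}
Reading an entry then becomes Mono<DataProfileDTO> tmp = dataProfileTemplate.opsForValue().get(id); with no repository involved.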
Although not ideal, an alternative is to use the blocking repository and wrap its results reactively yourself, for example with Mono.justOrEmpty() or Flux.fromIterable().
public interface DataProfileRepository extends CrudRepository<DataProfileDTO, String> {
}
And wrap it:
Mono<DataProfileDTO> tmp = Mono.justOrEmpty(dataProfileRepository.findById(id));
Flux<DataProfileDTO> all = Flux.fromIterable(dataProfileRepository.findAll());

Related

Profiles in Dagger

I am new to Dagger and I am trying to work out how to implement functionality like Spring profiles in Dagger 2.x. I want different beans for my devo and prod environments, but I am using the Dagger framework with Java.
@Provides
@Singleton
public void providesDaggerCoffeeShopClient(Stage stage) {
    DaggerCoffeeShop.builder()
        .dripCoffeeModule(new DripCoffeeModule())
        .qualifier(stage)
        .build();
}
Here, I want to skip this bean creation if stage is "Devo". Any help will be appreciated.
Well, I ran into this question two days ago and have since done some research on the matter. I was looking for a solution that would let me run the application with different profiles passed as a system property at launch, like:
java -Denv=local-dev-env -jar java-app.jar
The only appropriate solution I was able to find is to follow the official documentation's testing guide:
https://dagger.dev/dev-guide/testing
and divide my single module into different modules; in particular, I had to separate out and substitute the database dependency so that, when running the app locally, it avoids connecting to the real DB and executing any commands against it.
And when I run my app, I check the system property like this:
public boolean isLocalDevEnv() {
    return Environments.LOCAL_DEV.envName.equals(System.getProperty("env", Environments.PRODUCTION.envName));
}
and if the system property DOES NOT contain the value I am looking for, then I create the PRODUCTION instance of my component (which is configured to use the production modules):
DaggerMyAppComponent.create()
Which approximately looks like:
@Component(modules = {MyAppModule.class, DaoModule.class})
@Singleton
public interface MyAppComponent {...}
Otherwise, I create the local-dev-env version of the component, which uses a version of the module that produces a mock of the DAO instead of one that opens a real connection to the real database:
DaggerMyAppLocalDevEnvComponent.create()
Which approximately looks like:
@Component(modules = {MyAppModule.class, DaoMockModule.class})
@Singleton
public interface MyAppLocalDevEnvComponent {...}
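Putting it together, the selection can then be made programmatically at startup. This is only a sketch: the myApp() provision method and the assumption that MyAppLocalDevEnvComponent extends MyAppComponent are mine, not part of the original setup.
public final class Main {

    public static void main(String[] args) {
        // Pick the component (and therefore the modules) once at startup,
        // based on the -Denv=... system property.
        // Assumes MyAppLocalDevEnvComponent extends MyAppComponent.
        MyAppComponent component = isLocalDevEnv()
                ? DaggerMyAppLocalDevEnvComponent.create()   // DaoMockModule, no real DB
                : DaggerMyAppComponent.create();             // DaoModule, real DB
        component.myApp().run();                             // myApp() is a hypothetical provision method
    }

    private static boolean isLocalDevEnv() {
        return "local-dev-env".equals(System.getProperty("env", "production"));
    }
}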
Hope that was clear; just think of Spring profiles for Dagger 2 in terms of system properties and programmatic decision making. This approach definitely requires a lot of boilerplate code compared to Spring's profile implementation, but it is the only viable approach I was able to come up with.
Hope it helps.

SumoLogic RESTFul API C# client

Is there a C# client anyone knows of that we can use to run queries against SumoLogic? I see they have a Java client but cannot find a corresponding C# one.
You can use the SumoLogicMessageSender class.
You can find an example of this class in use here.
But I don't recommend blocking on the result with GetAwaiter().GetResult() the way the original code does:
// this maintains synchronous behavior for single event scenarios.
this.SumoLogicMessageSender
.TrySend(bodyBuilder.ToString(), this.SourceName, this.SourceCategory, this.SourceHost)
.GetAwaiter()
.GetResult();

Example Spring integration DSL for JPA Inbound Channel adapter

I can't find a useful example of polling a JPA source for inbound data. I know how to do this in XML but can't figure out how to do it in the DSL.
In short, what I want to do is periodically poll a JPA repository for records and then feed those records into a flow that does the usual filtering/transforming/executing.
Kind regards
David Smith
You are right: there is no JPA component support in the Spring Integration Java DSL yet. Feel free to raise a JIRA issue (JavaDSL component) on the matter and we'll take care of this demand. Feel free to contribute as well!
Meanwhile, I can help you figure out how to do it without a high-level API.
The <int-jpa:inbound-channel-adapter> is based on the JpaPollingChannelAdapter and JpaExecutor objects (exactly those we will use with the DSL API). You just need to configure a @Bean for the JpaExecutor and use it like this:
@Bean
public JpaExecutor jpaExecutor(EntityManagerFactory entityManagerFactory) {
    JpaExecutor jpaExecutor = new JpaExecutor(entityManagerFactory);
    jpaExecutor.setJpaQuery("from Foo");
    ....
    return jpaExecutor;
}
@Bean
public IntegrationFlow jpaFlow(JpaExecutor jpaExecutor) {
    return IntegrationFlows.from(new JpaPollingChannelAdapter(jpaExecutor))
            .split()
            .transform()
            ....
}
Everything else will be done by the framework, as usual for the existing DSL component API.
UPDATE
How do I provide the auto-startup property when creating JpaPollingChannelAdapter programmatically? Also, is it possible to get this bean and invoke .start() and .stop() using a control bus?
See Gary's answer. Lifecycle control is the responsibility of the endpoint, which in our case is the SourcePollingChannelAdapter. So you should supply that second lambda argument and configure .autoStartup() and .id() there, so that you can inject the SourcePollingChannelAdapter for your JpaPollingChannelAdapter and operate on it as needed. That id can indeed be used from a control bus to start()/stop() the adapter at runtime.
Yes, I agree JpaPollingChannelAdapter is an unfortunate name for that class, because it is really a MessageSource implementation.
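For illustration, here is a sketch of the flow above with that second lambda argument in place (the poller interval, the id value, and the trailing handle step are my own choices for the example):
@Bean
public IntegrationFlow jpaFlow(JpaExecutor jpaExecutor) {
    return IntegrationFlows
            .from(new JpaPollingChannelAdapter(jpaExecutor),
                    e -> e.poller(Pollers.fixedDelay(5000))
                          .autoStartup(false)          // do not poll until started explicitly
                          .id("jpaInboundAdapter"))    // bean name of the SourcePollingChannelAdapter
            .split()
            .handle(m -> System.out.println(m.getPayload()))
            .get();
}
With that id in place, a control bus message such as @jpaInboundAdapter.start() can start the polling endpoint at runtime.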
Wire up a JpaPollingChannelAdapter as a @Bean and use
IntegrationFlows.from(jpaMessageSource(),
        c -> c.poller(Pollers.fixedDelay(1000)))
    .transform(...)
    ...
See the DSL Reference for configuration options.
This one's near the top (with a different message source).

How do I detect whether a mongodb serializer is already registered?

I have created a custom serializer for mongoDB.
I can register it and it works as expected.
However, my application sometimes throws an error because it tries to register the serializer twice.
How do I detect whether a serializer has already been registered and thus stop my application from registering a second time?
If you are using
BsonSerializer.RegisterSerializer(typeof(Type), typeSerializer);
you might get the error "there is already a serializer registered for type", because you cannot register a serializer for the same type twice. But you can write your own serialization provider, and it will be consulted before the default serializers.
For instance, if you want to use local DateTime instead of UTC, which is the default,
all you need to do is write a class implementing IBsonSerializationProvider and register that provider with BsonSerializer as early as possible.
Here is the sample code:
public class LocalDateTimeSerializationProvider : IBsonSerializationProvider
{
    public IBsonSerializer GetSerializer(Type type)
    {
        return type == typeof(DateTime) ? DateTimeSerializer.LocalInstance : null;
    }
}
and to register it:
BsonSerializer.RegisterSerializationProvider(new LocalDateTimeSerializationProvider());
I hope this helps; you can also read the original documentation here.
This applies to version 2.4 of the MongoDB .NET driver.
TL;DR: If you are lazy, use BsonSerializer.LookupSerializer or BsonMemberMap.GetSerializer. To do it right, make sure the registration code is called once and only once.
The best approach to avoid this is to make sure the serializer is registered only once. It's a good idea to have some global startup code that registers anything global to the application once, and only once. That includes things like dependency injector configuration and tools like AutoMapper and the MongoDB driver. If you call this code only once, and from a single point in the code, you don't need to worry about thread safety, deadlocks or similar troubles.
The MongoDB driver configuration settings are thread-safe, but don't assume that this is true for every software package you might need to configure. Also, locking can be very expensive performance-wise if your code is multi-threaded, for instance in a web application. Last but not least, the lookup you're doing might not be trivial in the first place, because some methods need to walk an entire inheritance tree.

GWT RPC: hotswap vs POJO

I have encountered the following problem.
Currently I'm working with a colleague on a GWT project.
We are using an RPC async service. We often need to send and receive a state object, which is a HashMap.
We have a bunch of service methods which always have the state as a parameter and as the return type:
HashMap<String, Serializable> fillAndGetUI(HashMap<String, Serializable> state) throws ProjectServiceException;
I'm arguing that we should not use this, because having the Serializable interface in the method declaration is not good for RPC and GWT compilation.
But the HashMap is handy because we can use hotswap instead of restarting the server each time (it's enough to call put and get).
My suggestion was to use POJOs, but then we lose the hotswap ability, which is critical.
What is a solution that avoids HashMap in the declarations and keeps the hotswap ability at the same time? Can RequestFactory solve this issue? (We are using GWT 2.1; a version change is not an option.)
The easiest solution is to use plain old RequestBuilder, JSON and overlay types. RequestFactory will not help you here.
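For illustration, here is a rough sketch of what the overlay-type side could look like (the class name and the string-only values are my simplification, not part of the original answer):
import com.google.gwt.core.client.JavaScriptObject;

// Overlay type over the JSON state object. Because the payload is plain JSON,
// adding or removing keys never touches a GWT-RPC serialization policy,
// so server-side hotswap keeps working.
public class StateJso extends JavaScriptObject {

    // Overlay types must have a protected no-arg constructor.
    protected StateJso() { }

    public final native String get(String key) /*-{
        return this[key];
    }-*/;

    public final native void put(String key, String value) /*-{
        this[key] = value;
    }-*/;
}
The JSON text itself can be fetched with a plain RequestBuilder call and turned into a StateJso with something like JsonUtils.safeEval(response.getText()), assuming that utility is available in your GWT version.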