Reactive Spring Data Mongo Merge Operation

I want to write to MongoDB with Spring Integration and Project Reactor. The command I need is a merge operation, so I started with the following snippet:
MergeOperation mergeOperation = Aggregation.merge()
        .intoCollection("someCollection")
        .on("_id")
        .whenMatched(MergeOperation.WhenDocumentsMatch.mergeDocuments())
        .whenNotMatched(MergeOperation.WhenDocumentsDontMatch.discardDocument())
        .build();
@Bean
public IntegrationFlow dataPipeline() {
    return IntegrationFlows.from(somePublisher)
            // .handle(-----) - MergeOperation usage syntax
            .get();
}
I would like to know the recommended way of using the merge command with Reactive Spring Data Mongo, and whether it is supported at all with reactive streams. Since I've seen that there's a dedicated class for reactive aggregations, I wonder whether the absence of a reactive merge operation class means the merge operation is not supported with reactive streams. If it is possible, I'd like some help with the syntax.
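For illustration, here is the kind of usage I would expect, built on ReactiveMongoTemplate.aggregate (untested; reactiveMongoTemplate, the match stage, and "sourceCollection" are placeholders, not part of my actual code):

Aggregation aggregation = Aggregation.newAggregation(
        Aggregation.match(Criteria.where("processed").is(false)), // placeholder filter
        mergeOperation);

// $merge is the final stage, so the returned Flux should complete without
// emitting documents; nothing happens until it is subscribed to.
Flux<Document> result = reactiveMongoTemplate.aggregate(
        aggregation, "sourceCollection", Document.class);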

Related

Quarkus Panache Mongo Watch Reactive

I'm learning Quarkus and MongoDB Panache, and I can't find a way to implement Java code using watch and change streams with reactive Panache. Currently I'm using a scheduled CRON job, but I don't want to run a find every hour. Reading this document, I can't find a way to make it work with reactive MongoDB Panache in Quarkus.
MongoCollection<Grade> grades = db.getCollection("grades", Grade.class);
ChangeStreamIterable<Grade> changeStream = grades.watch();
changeStream.forEach((Consumer<ChangeStreamDocument<Grade>>) System.out::println);
This is my repository
@ApplicationScoped
@RegisterForReflection
public class BrandRepository implements ReactivePanacheMongoRepositoryBase<Brands, Integer> {}
MongoDB documentation
Thank You
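In case it helps clarify what I'm after, here is a sketch of what I imagine (untested, and an assumption on my side): dropping down from Panache to the underlying Quarkus reactive client, whose ReactiveMongoCollection exposes watch() as a Mutiny Multi. The database and collection names are placeholders, and the javax namespace assumes Quarkus 2.x.

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.event.Observes;
import javax.inject.Inject;

import org.bson.Document;

import io.quarkus.mongodb.reactive.ReactiveMongoClient;
import io.quarkus.runtime.StartupEvent;

@ApplicationScoped
public class BrandWatcher {

    @Inject
    ReactiveMongoClient mongoClient;

    // Start listening when the application starts; change streams
    // require a replica set on the MongoDB side.
    void onStart(@Observes StartupEvent ev) {
        mongoClient.getDatabase("mydb")
                .getCollection("brands", Document.class)
                .watch()
                .subscribe().with(
                        change -> System.out.println(change),
                        failure -> failure.printStackTrace());
    }
}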

Reactive Spring Cloud Stream: who does the subscribing?

I use Kafka and Spring Cloud Stream in the functional programming model.
I want to use the reactive API.
So I have a Function bean that takes a Flux and returns a Flux.
The returned Flux is created in a separate class.
Do I need to subscribe to activate the new/returned Flux?
That will not work, if I understand you correctly. The expectation for "streaming cases" is that your function adds operations to the incoming flux and returns it. The framework subscribes to what your function has returned, and the stream begins. Because of that, if you create a new instance of Flux, it will not work.
What I mean is, this is by design.
Once we have truly reactive binders (which we don't at the moment), things will change.
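For example, a minimal sketch of the expected shape (the uppercase function and the String payload type are illustrative):

import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import reactor.core.publisher.Flux;

@Configuration
public class StreamConfig {

    // The framework subscribes to the Flux returned here, so the
    // operators must be chained onto the incoming flux, not onto a
    // Flux created somewhere else.
    @Bean
    public Function<Flux<String>, Flux<String>> uppercase() {
        return input -> input.map(String::toUpperCase);
    }
}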

Azure Cosmos DB MongoDB API 4.0 transaction feature not working with `spring-boot-starter-data-mongodb`

I have the Cosmos MongoDB server 4.0 in place, a MongoTransactionManager bean set up, and @Transactional applied to the method in my PoC, like below:
@Configuration
class Config extends AbstractMongoConfiguration {

    @Bean
    MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
        return new MongoTransactionManager(dbFactory);
    }
}

@Service
class DocumentService {

    private final MongoOperations operations;

    DocumentService(MongoOperations operations) {
        this.operations = operations;
    }

    @Transactional
    void insertDocuments() {
        operations.insert(documentOne);
        operations.insert(documentTwo);
        // manually raise error here
        int error = 1 / 0;
    }
}
What I expect is that no record is inserted into the DB until the method completes without error. With the snippet above, while debugging, I can see that each insert is stored in the DB immediately, and when the error happens no rollback is triggered, which is not ACID at all.
With the same sample, I am able to get the transaction feature working on a pure MongoDB 4.0 server.
And I cannot find any sample or documentation of a Java or Spring Boot Data Mongo implementation of the transaction feature.
So my questions are:
Is the Azure Cosmos DB MongoDB API 4.0 transaction feature compatible with spring-boot-starter-data-mongodb?
Is there any sample of the transaction feature for Java and spring-boot-starter-data-mongodb?
Dependencies I used:
spring-boot-starter-parent 2.4.3 and spring-boot-starter-data-mongodb
Cosmos DB transactions are not supported for partitioned (sharded) collections.
Quoting from Microsoft's dev blog https://devblogs.microsoft.com/cosmosdb/three-reasons-to-upgrade-to-azure-cosmos-db-api-for-mongodb-4-0/:
Multi-Document Transactions: Multi-document transactions within an unsharded collection support enables you to group together dependent operations and treat them as one operation, while respecting all ACID semantics.
If you're OK with a hard limit of 10,000 RUs for your collection, you can make it unsharded and use transactions.
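For illustration (my own sketch, not from the blog post): with the plain Java driver, a collection created without a shard key stays unsharded. The connection string and names below are placeholders.

import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

public class CreateUnshardedCollection {
    public static void main(String[] args) {
        // No shard key is supplied, so the collection remains unsharded,
        // which is what multi-document transactions require on Cosmos DB.
        MongoDatabase db = MongoClients.create("<cosmos-connection-string>")
                .getDatabase("mydb");
        db.createCollection("myUnshardedCollection");
    }
}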

Nested query in Spring Batch processing

I want to create an ETL process using Spring Batch. The steps will read from several DBs and insert into one DB; basically, I'm collecting similar information from different DBs and inserting it into a single DB for later processing. I have a large, complex query that I need to run on those source DBs. My main concern is how to reference this query in the JpaPagingItemReader, for example: is there a way to add the query to my project as a .sql file and then reference it in the reader?
Or is there any other solution I can follow?
Thank you
is there a way to add the query to my project as a .sql file and then reference it in the reader? Or is there any other solution I can follow?
You can put your query in a properties file and inject it into your reader, something like:
@Configuration
@EnableBatchProcessing
@PropertySource("classpath:application.properties")
public class MyJob {

    @Bean
    public JpaPagingItemReader itemReader(@Value("${query}") String query) {
        return new JpaPagingItemReaderBuilder<>()
                .queryString(query)
                // set other reader properties
                .build();
    }

    // ...
}
In this example, you should have a property query=<your query> in application.properties. This is just the regular Spring property injection mechanism; nothing Spring Batch specific here.
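For example, application.properties could contain an entry like the following (the query below is purely illustrative; note that JpaPagingItemReader expects a JPQL query string rather than raw SQL):

query=select e from MyEntity e where e.status = 'NEW'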

Can Couchbase be used as the underlying JobRepository for Spring Batch?

We have a requirement where we have to read a batch of an entity type from the database, submit info about each entity to a service which will call back later with some data to update on the caller entity, and then save all the caller entities with the updated data. We thought of using Spring Batch; however, we use Couchbase as our database, which is eventually consistent and has no support for transactions.
I was going through the Spring Batch documentation and came across the Spring Batch Meta-Data ERD diagram here:
https://docs.spring.io/spring-batch/4.1.x/reference/html/index-single.html#metaDataSchema
With the above information in mind, my question is:
Can Couchbase be used as the underlying job repository for Spring Batch? What are the things I should keep in mind if it's possible to use it? Any links to example implementations would be welcome.
The JobRepository needs to be transactional in order for Spring Batch to work properly. Here is an excerpt from the Transaction Configuration for the JobRepository section of the reference documentation:
The behavior of the framework is not well defined if the repository methods are not transactional.
Since Couchbase has no support for transactions as you mentioned, it is not possible to use it as an underlying datasource for the JobRepository.
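If it helps, a common workaround (my assumption, not something the documentation prescribes) is to keep the business data in Couchbase but back the JobRepository with a small transactional RDBMS, for example an embedded H2 database:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    // Spring Batch stores its meta-data in this data source; business
    // reads and writes can still go to Couchbase through your own
    // repositories.
    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                // schema script shipped inside spring-batch-core
                .addScript("/org/springframework/batch/core/schema-h2.sql")
                .build();
    }
}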