Bulk operations using spring-data-mongodb-reactive - reactive-programming

Is it possible to perform MongoDB bulk operations (like update/upsert) using ReactiveMongoTemplate or spring-data-mongodb-reactive?
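One way to do this, sketched below under the assumption of an already-configured ReactiveMongoTemplate and a hypothetical collection name "people", is to drop down to the reactive driver's collection via execute(...) and call its bulkWrite(...), which ships the whole batch in a single round trip (newer Spring Data releases also add a dedicated ReactiveBulkOperations abstraction, but this works with the template alone):

```java
import java.util.List;

import org.bson.Document;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;

import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.Updates;
import com.mongodb.client.model.WriteModel;

import reactor.core.publisher.Mono;

public class ReactiveBulkUpsert {

    private final ReactiveMongoTemplate template;

    public ReactiveBulkUpsert(ReactiveMongoTemplate template) {
        this.template = template;
    }

    public Mono<BulkWriteResult> upsertCounts() {
        // Upserts expressed as driver write models; any mix of
        // single-document operations can go in the same batch.
        List<WriteModel<Document>> writes = List.of(
            new UpdateOneModel<>(Filters.eq("_id", "a"),
                Updates.set("count", 1), new UpdateOptions().upsert(true)),
            new UpdateOneModel<>(Filters.eq("_id", "b"),
                Updates.set("count", 2), new UpdateOptions().upsert(true)));

        // execute(...) exposes the underlying reactive driver collection,
        // whose bulkWrite(...) sends all writes in one round trip.
        return template.execute("people",
                collection -> collection.bulkWrite(writes))
            .next();
    }
}
```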

Related

MongoDB Bulk Operations across whole DB

I understand that MongoDB has the ability to do bulkWrite / bulkExec per collection... But what I don't understand is how to do it across the whole database...
Currently I'm doing things in parallel via Promise.all([collectionA.op1, collectionB.op2, ...]), but this seems incredibly inefficient, as it issues a new network request to Mongo for each operation.
It seems that if I could batch up all the instructions I have and send them to Mongo in one request, it would be much more efficient.
Does MongoDB support this? If not, why wouldn't it?
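As the asker notes, bulkWrite is per collection. Short of a database-wide batch, a common middle ground is to build one bulk per collection and run those bulks concurrently, so round trips drop to one per collection rather than one per operation. A hedged Java sketch of that pattern (the analogue of the Promise.all approach; database, collection, and document contents are hypothetical):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;

import org.bson.Document;

import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.InsertOneModel;

public class PerCollectionBulks {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> collA = client.getDatabase("app").getCollection("a");
            MongoCollection<Document> collB = client.getDatabase("app").getCollection("b");

            // One bulkWrite per collection, executed concurrently: each
            // collection's operations travel together in a single request.
            CompletableFuture<BulkWriteResult> fa = CompletableFuture.supplyAsync(() ->
                collA.bulkWrite(List.of(new InsertOneModel<>(new Document("x", 1)))));
            CompletableFuture<BulkWriteResult> fb = CompletableFuture.supplyAsync(() ->
                collB.bulkWrite(List.of(new InsertOneModel<>(new Document("y", 2)))));

            CompletableFuture.allOf(fa, fb).join();
        }
    }
}
```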

MongoDB retryable writes in unordered bulk operation

I am building a bulk operation for my application, and it consists only of single-document write operations.
However, I need each operation to have MongoDB "retryable writes" enabled correctly.
So I am wondering whether an unordered bulk write works fine for this, or whether retryable writes only work with an ordered bulk operation (which would be less efficient)?
Besides, I have already added the retryable-writes option to my connection string.
Thanks in advance,
Yes, the operations can be retried in an unordered operation. From the MongoDB docs:
Bulk write operations that only consist of the single-document write operations. A retryable bulk operation can include any combination of the specified write operations but cannot include any multi-document write operations, such as updateMany.
Note that the write is only retried a single time, so the application still needs to be able to deal with write failures.
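To make that concrete, here is a minimal sketch with the MongoDB Java driver: an unordered bulk containing only single-document operations, with retryWrites=true in the connection string (the default in recent drivers); all names are placeholders:

```java
import java.util.List;

import org.bson.Document;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.Updates;

public class RetryableUnorderedBulk {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create(
                "mongodb://localhost:27017/?retryWrites=true")) {
            MongoCollection<Document> coll =
                client.getDatabase("test").getCollection("items");

            // Only single-document operations, so each write remains
            // retryable (once) even though the batch is unordered.
            coll.bulkWrite(
                List.of(
                    new InsertOneModel<>(new Document("_id", 1)),
                    new UpdateOneModel<>(Filters.eq("_id", 1),
                        Updates.set("status", "ok"))),
                new BulkWriteOptions().ordered(false));
        }
    }
}
```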

Workaround to create a collection within a transaction in MongoDB

I have REST APIs that create MongoDB collections at runtime and keep each collection's name in another collection.
MongoDB now supports transactions, but it allows only CRUD operations within a transaction, not collection creation.
I'm thinking of keeping the names of all collections created within a transaction in the request context and creating the collections once the transaction completes. Is there any other way or workaround to solve this?
Starting from Mongo 4.4, you can create collections and indexes inside transactions. Documentation
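A minimal sketch of that 4.4+ behaviour with the Java driver, mirroring the asker's use case of creating a collection and recording its name in the same transaction (database and collection names are hypothetical):

```java
import org.bson.Document;

import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

public class CreateCollectionInTransaction {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017");
             ClientSession session = client.startSession()) {
            MongoDatabase db = client.getDatabase("app");

            session.withTransaction(() -> {
                // Explicit collection creation inside a transaction is
                // allowed from MongoDB 4.4 onward.
                db.createCollection(session, "runtime_data");
                // Record the new collection's name atomically with its creation.
                db.getCollection("collection_registry")
                  .insertOne(session, new Document("name", "runtime_data"));
                return null;
            });
        }
    }
}
```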

How to Parallelize Read Operations in MongoDB in Spark?

I am using mongodb/mongo-hadoop
(https://github.com/mongodb/mongo-hadoop/wiki/Spark-Usage#python-example) but am confused about how to do parallel read operations.
By parallel read operations I mean concurrent reads against my MongoDB index. I have indexed my DB on Timestamp and want to query data between T1-T2, T3-T4, etc. in parallel.
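Independent of Spark, the underlying idea is to partition the timestamp range and issue one indexed range query per partition concurrently. A hedged sketch with the plain Java driver (collection name events, field name timestamp, and the range bounds are all assumptions):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import org.bson.Document;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;

public class ParallelRangeReads {
    public static void main(String[] args) throws Exception {
        // Hypothetical partitions of the timestamp space: [T1,T2), [T2,T3), ...
        long[][] ranges = { {0L, 1000L}, {1000L, 2000L}, {2000L, 3000L} };

        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                client.getDatabase("test").getCollection("events");

            ExecutorService pool = Executors.newFixedThreadPool(ranges.length);
            List<Future<List<Document>>> futures = new ArrayList<>();
            for (long[] r : ranges) {
                // Each task runs an indexed range query concurrently.
                Callable<List<Document>> task = () -> coll.find(
                        Filters.and(Filters.gte("timestamp", r[0]),
                                    Filters.lt("timestamp", r[1])))
                    .into(new ArrayList<>());
                futures.add(pool.submit(task));
            }
            for (Future<List<Document>> f : futures) {
                System.out.println("fetched " + f.get().size() + " documents");
            }
            pool.shutdown();
        }
    }
}
```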

Performance gain by using bulk inserts vs regular inserts in MongoDB

What is the performance gain of bulk inserts vs. regular inserts in MongoDB, and in pymongo specifically? Are bulk inserts just a wrapper for regular inserts?
Bulk inserts are not wrappers for regular inserts. A bulk insert operation sends many documents as a single batch, saving as many database round trips; it is much more performant since you don't have to send each document over the network separately.
@dsmilkov That's an advantage, as you don't have to open a connection every single time.
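To make the round-trip point concrete (shown with the MongoDB Java driver rather than pymongo; database and collection names are placeholders), compare a loop of single inserts with one batched insertMany:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import org.bson.Document;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;

public class BulkVersusSingleInserts {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                client.getDatabase("test").getCollection("docs");

            // Regular inserts: one network round trip per document.
            for (int i = 0; i < 1000; i++) {
                coll.insertOne(new Document("n", i));
            }

            // Bulk insert: the batch travels in one (or a few) wire
            // messages, saving roughly one round trip per document.
            List<Document> docs = IntStream.range(0, 1000)
                .mapToObj(i -> new Document("n", i))
                .collect(Collectors.toList());
            coll.insertMany(docs);
        }
    }
}
```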