Sequential queries in MongoDB not working properly sometimes

I am executing two update queries sequentially. I am using a generator function and yield to handle the asynchronous behaviour of JavaScript.
var result = yield db.tasks.update({
    "_id": task._id,
    "taskLog": {
        $elemMatch: {
            "currentApproverRole": vcurrentApproverRole,
            "currentApprover": new RegExp(employeeCode, 'i')
        }
    }
}, {
    $set: {
        "taskPendingAt": vnextApproverEmpCode,
        "status": vactionTaken,
        "lastUpdated": vactionTakenTime,
        "lastUpdatedBy": employeeCode,
        "shortPin": shortPin,
        "workFlowDetails": task.workFlowDetails,
        "taskLog.$.reason": reason,
        "taskLog.$.actionTakenBy": employeeCode,
        "taskLog.$.actionTakenByName": loggedInUser.firstName + " " + loggedInUser.lastName,
        "taskLog.$.actionTaken": vactionTaken,
        "taskLog.$.actionTakenTime": vactionTakenTime
    }
});
var vstatus = vactionTaken;
// Below is the query that is not working properly sometimes
yield db.groupPicnic.update({
    "gppTaskId": task.workFlowDetails.gppTaskId,
    "probableParticipantList.employeeCode": task.createdBy
}, {
    $set: {
        'probableParticipantList.$.applicationStatus': vactionTaken
    }
})
The second update operation sometimes does not execute (it works 9 times out of 10). I can't figure out how to handle this issue.

ES6 generators are supposed to provide a simple way for writing iterators.
An iterator is just a sequence of values - like an array, but consumed dynamically and produced lazily.
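For illustration (this example is not from the question, just plain ES6), a generator producing such a lazy sequence:
function* numbers() {
    let n = 0;
    while (true) {
        yield n++; // each value is produced only when the consumer asks for it
    }
}

const it = numbers();
console.log(it.next().value); // 0
console.log(it.next().value); // 1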
Currently your code does this:
let imAnUnresolvedPromise = co().next();
// exiting app, promise might not resolve in time
By moving forward and not waiting on the promise (assuming your app then exits) you can't guarantee that it will execute in time, hence the unstable behaviour you're experiencing.
All you have to change is to wait on the promise to resolve.
let resolveThis = await co().next();
EDIT:
Without async/await syntax you'll have to use nested callbacks to guarantee the correct order, like so:
co().next().then((promiseResolved) => {
    co().next().then((promiseTwoResolved) => {
        console.log("I'm done")
    })
});
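In the same spirit, here is a hedged sketch (assuming the co library and the db handle and variables from the question) of running both updates inside a single generator and waiting on the promise that co() returns before the process exits:
const co = require('co');

co(function* () {
    // the second update only starts after the first yield has resolved
    yield db.tasks.update(/* ...first filter and $set from the question... */);
    yield db.groupPicnic.update(/* ...second filter and $set from the question... */);
}).then(() => {
    console.log('both updates finished'); // only exit or close the connection after this point
}).catch((err) => console.error(err));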

Related

Is a single Firestore write operation guaranteed to be atomic?

I have a Chat document that represents a chat between two users. It starts out empty, and eventually looks like this:
// chats/CHAT_ID
{
    users: {
        USER_ID1: true,
        USER_ID2: true
    },
    lastAddedUser: USER_ID2
}
Each user is connected to a different Cloud Run container via websockets.
I would like to send a welcome message to both users once the second user has connected. This message must be sent exactly once.
When a user sends a "connected" message to its websocket, the container performs something like the following:
// Return boolean reflecting whether the current container should emit the welcome message to both users
async addUserToChat(userId) {
    // Write operation
    await this.chatDocRef.set({ users: { [userId]: true }, lastAddedUser: userId }, { merge: true });
    // Read operation
    const chatSnap = await this.chatDocRef.get();
    const chatData = chatSnap.data();
    return Object.keys(chatData.users).length === 2 && chatData.lastAddedUser === userId;
}
And there is a working mechanism that allows container A to send a message to a user connected to container B.
The issue is that sometimes, each container ends up concluding that it is the one that should send the welcome message to both users.
I am unclear as to why that would happen given Firestore's immediate consistency model (per this). The only explanation I can think of that would allow a race condition is that write operations involving multiple fields are not guaranteed to be atomic. So this:
await this.chatDocRef.set({ users: { [userId]: true }, lastAddedUser: userId }, { merge: true })
actually performs two separate updates, one for users and one for lastAddedUser, opening the possibility of a scenario where, after a partial update of users by container A, container B completes its write and read operations before container A overwrites lastAddedUser.
But this sounds wrong.
Can anyone shed light on why race conditions might occur?
I no longer have race conditions if I base the logic on server timestamps instead of the lastAddedUser field.
The document is now simpler:
// chats/CHAT_ID
{
    users: {
        USER_ID1: true,
        USER_ID2: true
    }
}
And the function looks like this:
// Return boolean reflecting whether the current container should emit the welcome message to both users
async addUserToChat(userId) {
    // Write operation
    const writeResult = await this.chatDocRef.set({ users: { [userId]: true } }, { merge: true });
    // Read operation
    const chatSnap = await this.chatDocRef.get();
    const chatData = chatSnap.data();
    return Object.keys(chatData.users).length === 2 && writeResult.writeTime.isEqual(chatSnap.updateTime);
}
In other words, the condition for sending the welcome message now becomes: the executing container is the container responsible for the update that resulted in having two users.
While the problem is solved, I am still unclear as to why relying on document data (instead of server metadata) opens up the possibility of race conditions. If anyone knows the explanation behind this phenomenon, please add an answer and I'll accept it as the solution to this question.
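For comparison, the check could presumably also be serialized with a Firestore transaction; a hedged sketch, not from the original code, assuming the Node.js Admin SDK and the chatDocRef from above:
async function addUserToChatTx(db, chatDocRef, userId) {
    return db.runTransaction(async (tx) => {
        // inside a transaction, all reads must happen before any writes
        const snap = await tx.get(chatDocRef);
        const users = (snap.exists && snap.data().users) || {};
        const wasMissing = !users[userId];
        tx.set(chatDocRef, { users: { [userId]: true } }, { merge: true });
        // true only for the call that takes the chat from one user to two
        return wasMissing && Object.keys(users).length === 1;
    });
}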

WebFlux/Reactor: checking conditions before+after Flux execution with doOnComplete

I'm already querying some external resource with Flux.using(). Now I want to implement a kind of optimistic locking: read some state before the query starts to execute and check whether it was updated after the query has finished. If so, throw an exception to break the HTTP request handling.
I've achieved this by using doOnComplete:
final AtomicReference<String> initialState = new AtomicReference<>();
return Flux.just("some", "constant", "data")
        .doOnComplete(() -> initialState.set(getState()))
        .concatWith(Flux.using(...)) // actual data query
        .doOnComplete(() -> { if (!initialState.get().equals(getState())) throw new RuntimeException(); })
        .concatWithValues("another", "constant", "data");
My questions:
Is it correct? Is it guaranteed that the first doOnComplete lambda finishes before Flux.using() runs, and is it guaranteed that the second doOnComplete lambda is executed strictly after it?
Does a more elegant solution exist?
The first doOnComplete is executed after Flux.just("some", "constant", "data") emits all of its elements, and the second one after the Publisher passed to concatWith completes successfully. This works because both publishers have a finite number of elements.
With the proposed approach, however, the pre-/postconditions of a particular operation are handled outside of that operation, at a higher level. In other words, the condition check that belongs to the operation leaks into the flux definition.
Suggestion, pushing the condition check down to the operation:
var otherElements = Flux.using( // actual data query
        () -> "other",
        x -> {
            var initialState = getState();
            return Flux.just(x).doOnComplete(() -> {
                if (!initialState.equals(getState())) throw new IllegalStateException();
            });
        },
        x -> { }
);

Flux.just("some", "constant", "data")
        .concatWith(otherElements)
        .concatWith(Mono.just("another")); // "constant", "data" ...

Vertx CompositeFuture

I am working on a solution where I am using Vert.x 3.8.4 and vertx-mysql-client 3.9.0 for asynchronous database calls.
Here is the scenario that I have been trying to resolve in a proper reactive manner.
I have some master table records which are in an inactive state.
I run a query and get the list of records from the database, like this:
Future<List<Master>> locationMasters = getInactiveMasterTableRecords();
locationMasters.onSuccess(locationMasterList -> {
    if (locationMasterList.size() > 0) {
        uploadTargetingDataForAllInactiveLocations(vertx, amazonS3Utility, locationMasterList);
    }
});
Now, in the uploadTargetingDataForAllInactiveLocations method, I have a list of items.
I need to iterate over this list and, for each item, download a file from AWS, parse the file and insert its data into the database.
I understand the way to do this is to use CompositeFuture.
Can someone from the Vert.x community help me with this or point me to some available documentation?
I did not find good content on this by googling.
I'm answering this because I was searching for something similar and spent some time before finding an answer; hopefully this will be useful to someone else in the future.
I believe you want to use CompositeFuture in Vert.x only if you want to synchronize multiple actions: that is, you want an action to execute either once all of the actions your composite future is built upon have succeeded, or once at least one of them has succeeded.
In the first case I would use CompositeFuture.all(List<Future> futures) and in the second case I would use CompositeFuture.any(List<Future> futures).
As per your question, below is sample code where, for each item in a list, we run an asynchronous operation (downloadAndProcessFile()) that returns a Future, and we want to execute an action doAction() once all of the async operations have succeeded:
List<Future> futures = new ArrayList<>();
locationMasterList.forEach(elem -> {
    Promise<Void> promise = Promise.promise();
    futures.add(promise.future());
    Future<Boolean> processStatus = downloadAndProcessFile(); // doesn't need to be boolean
    processStatus.onComplete(asyncProcessStatus -> {
        if (asyncProcessStatus.succeeded()) {
            // eventually do stuff with the result
            promise.complete();
        } else {
            promise.fail("Error while processing file");
        }
    });
});

CompositeFuture.all(futures).onComplete(compositeAsync -> {
    if (compositeAsync.succeeded()) {
        doAction(); // <-- here do what you want to do when all futures have completed
    } else {
        // at least 1 future failed
    }
});
This solution is probably not perfect and I suppose it can be improved, but it is what works for me. Hopefully it will work for someone else too.

When exactly do we use async-await and then?

I am very confused about this. I request you to clarify the concept.
Consider the following scenarios:
Case 1:
int number = 0;

void calculate() {
  number = number + 2;
  print(number);
}
I know this works just fine. "2" will be printed on the terminal.
But why shouldn't I use async-await here, like this:
int number = 0;

void calculate() async {
  Future<void> addition() async {
    number = number + 2;
  }

  await addition();
  print(number);
}
This seems logical to me, since print(number) should wait for number = number + 2 to finish. Why isn't this necessary? How does Dart know which operation to execute first?
How is it ensured that print(number) isn't executed before number = number + 2, which would print "0" on the terminal?
Does the sequence in which we write these operations in the function matter?
Case 2:
Consider the case where I am interacting with an SQFLite database and the values fetched depend on each other.
Note: number1, number2 and number3 will already have (old) values before the following function is called.
void getValues() async {
  Future<void> calculate1() async {
    number1 = await db.getNumber1(10);
  }

  Future<void> calculate2() async {
    number2 = await db.getNumber2(number1);
  }

  await calculate1().then((_) async {
    await calculate2().then((_) async {
      number3 = await db.getNumber3(number2);
    });
  });
}
I have a lot of these types of functions in my app and I am doing this everywhere.
I am somewhat paranoid, thinking that if the old values of number1 and number2 are passed as parameters to getNumber2() and getNumber3() respectively, then I'll be doomed.
async/await are just syntactic sugar for the underlying Future framework. 95% of the time they will suffice, and they are preferred by the style guide.
One exception is when you have multiple futures and you want to wait, in parallel, until all of them are complete. In that case you'll need to use Future.wait([future1, future2, future3]), which cannot be expressed using await.
Dart is executed line by line, so when the function is called the calculation is done first and then the result is printed. You will always get 2 printed.
You can think of it as there being one main thread in general, which is the UI thread. Any operations you write in this thread are performed line by line, and only after one line has completely executed does execution move on to the next line.
Now suppose you have something which you know will take time to compute or fully execute, eventually producing either a result or an error. If you write this in the main UI thread (the synchronous thread), you stop the UI of the app, which can make the app appear to crash (an Application Not Responding error), because the operating system decides the app has frozen while it is actually waiting for your long-running computation in the UI thread to finish.
To overcome this issue we use asynchronous methods for time-consuming computations, such as getting data from a database, which will return a value or an error in the "future". The main UI thread doesn't wait for the asynchronous work; if you have nothing to show the user until an asynchronous task completes, display a loading indicator in the meantime.
Hope this helps!

How to test `Var`s of `scala.rx` with scalatest?

I have a method which connects to a websocket and receives a stream of messages from an external system.
The simplified version is:
def watchOrders(): Var[Option[Order]] = {
  val value = Var[Option[Order]](None)
  // onMessage(order => value.update(Some(order)))
  value
}
When I test it (with ScalaTest), I want it to connect to the real external system and only check the first 4 orders:
test("watchOrders") {
var result = List.empty[Order]
val stream = client.watchOrders()
stream.foreach {
case Some(order) =>
result = depth :: result
if (result.size == 4) { // 1.
assert(orders should ...) // 2.
stream.kill() // 3.
}
case _ =>
}
Thread.sleep(10000) // 4.
}
I have 4 questions:
Is this the right way to check the first 4 orders? There is no take(4) method in scala.rx.
If the assert fails, the test still passes; how do I fix that?
Is this the right way to stop the stream?
If the thread doesn't sleep here, the test passes without the code under case Some(order) ever running. Is there a better way to wait?
One approach you might consider to get a List out of a Var is to use the .fold combinator.
The other issue you have is dealing with the asynchronous nature of the data. Assuming you really do want to talk to this external real-world system in your test code (i.e. this is closer to an integration test), you will want to look at ScalaTest's support for async tests, and will probably do something like construct a Future from a Promise that you complete once you have accumulated the 4 elements in your list.
See: http://www.scalatest.org/user_guide/async_testing