Node.js Callbacks in Synchronous Functions: Passing Functions as Arguments in Sequence (Nested One Inside the Other) and Avoiding Callback Hell

I didn't find many references to this specific question, so that's why I'm posting it even when it is pretty basic.
When a package requires us to use callbacks in otherwise synchronous-looking functions, I understand this is usually because some internal process takes a long time to run; that is, the delay does not depend on another machine, but on our own. (Is that right?)
So suppose we have a Node.js script with, say, three functions, two of them with internal latency:
File: index.js
function one() {
  setTimeout(() => {
    console.log('message one');
  }, 1000);
}

function two() {
  setTimeout(() => {
    console.log('message two');
  }, 700);
}

function three() {
  console.log('message three');
}

one();
two();
three();
The expected output of $ node index.js is:
message three
message two
message one
But, what can we do if we want an ordered execution?
To solve it, I tried passing a callback as a parameter to at least functions one and two:
File: index2.js
function one(anotherFunction) {
  setTimeout(() => {
    console.log('message one');
    anotherFunction();
  }, 1000);
}

function two(anotherFunction) {
  setTimeout(() => {
    console.log('message two');
    anotherFunction();
  }, 700);
}

function three() {
  console.log('message three');
}
// execution line here
However, index2.js will only work if the execution line is something like two(three) or one(three), but never something like one(two(three)): that expression calls two(three) immediately and passes its return value (undefined) to one, instead of passing a function to be invoked later.
What is the syntax to call these three functions in sequence, with callbacks nested one inside the other?
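For reference, the direct nested syntax being asked about is to wrap each inner call in an arrow function, so that it only runs from inside the previous callback; a minimal sketch using the functions from index2.js:
// Each wrapper is invoked only when the enclosing callback fires,
// so the messages print in order: one, two, three.
one(() => two(() => three()));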

Now, I want to share one possible solution to the question above. It all depends on the arguments passed along with the callback. (I hope the community will share more options here.)
The syntax that is key to this solution is the rest/spread operator (...), which gathers any number of parameters into an array and spreads them back out when calling the next function.
A first approach is this:
function one(aCallBack, ...args) {
  setTimeout(() => {
    console.log('first message');
    aCallBack(...args);
  }, 1000);
}

function two(aCallBack, ...args) {
  setTimeout(() => {
    console.log('second message');
    aCallBack(...args);
  }, 700);
}

function three() {
  console.log('third message');
}
one(two, three);
The script above will print:
first message
second message
third message
That is the desired output; however, a more general solution is to add a "message" parameter. This way we can reuse our own functions, chained one after another as many times as we like, cleanly (avoiding any callback hell):
function one(msg, aCallBack, ...args) {
  setTimeout(() => {
    console.log(msg);
    aCallBack(...args);
  }, 1000);
}

function two(msg, aCallBack, ...args) {
  setTimeout(() => {
    console.log(msg);
    aCallBack(...args);
  }, 700);
}

function three(msg) {
  console.log(msg);
}
one('first message', two, 'second message', one, 'third message', one, 'fourth message', two, 'fifth message', three, 'sixth message');
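For comparison, here is a minimal sketch of the same kind of ordered output using a promisified delay and async/await, another common way to avoid callback hell (the delay helper is a hypothetical name, not part of the original code):
// Hypothetical helper: wraps setTimeout in a Promise.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function run() {
  await delay(1000);
  console.log('first message');
  await delay(700);
  console.log('second message');
  console.log('third message');
}

run();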

Related

How to execute function after stream is closed in Dart/Flutter?

So basically I am using the flutter_uploader package to upload files to a server and I'd like to execute a function after the upload is complete:
final StreamSubscription<UploadTaskProgress> subscription = _uploader.progress.listen(
  (e) {
    print(e.progress);
  },
  onError: (ex, stacktrace) {
    throw Exception("Something went wrong updating the file...");
  },
  onDone: () {
    myFunction(); // won't run
  },
  cancelOnError: true,
);
The problem is that the onDone callback doesn't execute, which means myFunction never runs. I've done some digging and found that onDone gets called when the stream is closed, but there is no such method on the subscription variable. I have not used streams much and am therefore pretty bad with them.
My question is: how can I run myFunction once the stream is complete? I thought onDone would get called in that case, but apparently not.
Thank you!
I haven't used that package before, but I read a little about it, and I think you can execute your function inside the main listener block; the other callbacks are for handling internal processes, like stopping a background job, or external concerns, like reporting the error to a monitoring tool. This is what I propose:
final StreamSubscription<UploadTaskProgress> subscription =
    _uploader.progress.listen(
  (e) {
    // UploadTaskStatus.complete is the status constant for a finished upload
    if (e.status == UploadTaskStatus.complete) {
      myFunction();
    }
    print(e.progress);
  },
  onError: (ex, stacktrace) {
    throw Exception("Something went wrong updating the file...");
  },
  cancelOnError: true,
);
Just to be clear, I'm not sure of the specific implementation; it's just an idea I got from the docs. It seems the event also contains a status property which has a constant for when the upload is complete:
https://pub.dev/documentation/flutter_uploader/latest/flutter_uploader/UploadTaskProgress/UploadTaskProgress.html
https://pub.dev/documentation/flutter_uploader/latest/flutter_uploader/UploadTaskStatus-class.html
Hope this helps you :D

How to get an 'on' event listener in the @ibm-cloud/cloudant package?

The deprecated @cloudant/cloudant package is replaced by the @ibm-cloud/cloudant package. In the former I was using the following code snippet:
const feed = dummyDB.follow({ include_docs: true, since: 'now' })
feed.on('change', function (change) {
  console.log(change)
})
feed.on('error', function (err) {
  console.log(err)
})
feed.filter = function (doc, req) {
  if (doc._deleted || doc.clusterId === clusterID) {
    return true
  }
  return false
}
Could you share code that would give me a feed.on event listener similar to the above in the new npm package @ibm-cloud/cloudant?
There isn't an event emitter for changes in the @ibm-cloud/cloudant package right now. You can emulate the behaviour by either:
polling postChanges (updating the since value after new results) and processing the response result property, which is a ChangesResult. That in turn has a results property that is an array of ChangesResultItem elements, each of which is equivalent to the change argument of the event handler function,
or
calling postChangesAsStream with a feed type of continuous and processing the stream returned in the response result property, each line of which is a JSON object that follows the structure of ChangesResultItem (sketched below). In this case you'd also probably want to configure a heartbeat and timeouts.
In both cases you'd need to handle errors to reconnect in the event of network glitches etc.
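A rough sketch of the streaming option, assuming the @ibm-cloud/cloudant Node SDK with credentials supplied via the usual CLOUDANT_URL/CLOUDANT_APIKEY environment variables; the database name is a placeholder and the option names should be checked against the current SDK docs:
const { CloudantV1 } = require('@ibm-cloud/cloudant');
const readline = require('readline');

const service = CloudantV1.newInstance({});

async function followChanges(db) {
  const response = await service.postChangesAsStream({
    db,
    feed: 'continuous',
    since: 'now',
    includeDocs: true,
    heartbeat: 5000,
  });
  // Each non-empty line of the stream is a JSON object shaped like ChangesResultItem.
  const lines = readline.createInterface({ input: response.result });
  lines.on('line', (line) => {
    if (!line.trim()) return; // heartbeats arrive as blank lines
    const change = JSON.parse(line);
    console.log(change); // roughly what the old feed.on('change', ...) handler received
  });
  lines.on('close', () => console.log('stream ended; reconnect if needed'));
}

followChanges('dummydb').catch(console.error);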

IONIC-V3: Wait for page to pop before continuing to execute code

I’m iterating over a JSON that contains some rules to build my page. The loop is something like this:
flux.forEach(element => {
  this.navCtrl.push(element.pageName);
});
My issue here is that I need to wait for each page to finish its action before calling the next one; as it stands, this loop just builds up a stack. How can I make some sort of promise that waits for the page to do its duty before the loop continues?
Thank you all!
To run promises in sequence, you can use reduce(), as explained here.
flux.reduce((promise, item) => {
  return promise.then(() => {
    return new Promise((resolve, reject) => {
      // navCtrl.push returns a promise; resolve once the page has been pushed
      this.navCtrl.push(item.pageName).then(() => resolve());
    });
  });
}, Promise.resolve());

Folding two callbacks into one Observable

The snippet of code below is functional (in the sense that it's working ;-)), but seems lame at best and well...
Can anyone suggest a way to make this more composable or at least less ugly?
The code is based on the examples on this page:
Wrap an Existing API with RxJS
function connect() {
  return rx.Observable.create(function (observer) {
    mongo.connect('mongodb://127.0.1:27017/things', function (err, db) {
      if (err) return observer.onError(err);
      observer.onNext(db);
    });
  }).publish().refCount();
}

function getThings(db) {
  return rx.Observable.create(function (observer) {
    db.collection('things').find().toArray(function (err, results) {
      if (err) return observer.onError(err);
      observer.onNext(results);
      observer.onCompleted();
    });
    return function () {
      db.close();
    };
  }).publish().refCount();
}
connect().subscribe(
  function (db) {
    getThings(db).subscribe(console.log);
  },
  function (err) {
    console.log(err);
  }
);
In this specific example, assuming that getThings() is supposed to happen only once after connect() happens, I would change the implementation of getThings() as such:
function getThings() {
  return connect()
    .flatMap(function (db) {
      return rx.Observable.create(function (observer) {
        db.collection('things').find().toArray(function (err, results) {
          if (err) return observer.onError(err);
          observer.onNext(results);
          observer.onCompleted();
        });
        return function () {
          db.close();
        };
      });
    });
}
Then you can just subscribe to the getThings() stream:
getThings().subscribe(console.log);
We used flatMap to hide the connection step inside the whole getThings(). flatMap's documentation sounds complicated, but it isn't that complicated. It just substitutes an event from the source Observable with another future event. Explained in diagrams, it substitutes each x event with a future y event.
---x----------x------->
flatMap( x => --y--> )
------y----------y---->
In our case, x event is "connected successfully", and y is "got 'things' from database".
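As a standalone illustration of that diagram (a hypothetical example using the same RxJS 4 API as the answer, not part of the original code):
// Each source event x is replaced by a future event y derived from it.
var source = rx.Observable.interval(1000);          // ---0---1---2--->
var result = source.flatMap(function (x) {
  return rx.Observable.timer(500).map(function () { // each x yields a later y
    return 'y' + x;
  });
});
result.subscribe(function (y) { console.log(y); }); // -----y0---y1--->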
That said, there are a couple of different ways of doing this, depending on how the app is supposed to work. It is better to think of RxJS as "Event Bus on steroids" rather than a replacement for chainable Promises, because it really is not the latter.
Developing on RxJS works best if you model "everything that happens in the app" as streams of events. If done properly, you shouldn't see these chainable "do this, then do that, then do that" sequences, because ultimately that's an imperative paradigm, and RxJS is capable of more than that. Ideally it should be more about telling what the events are, in a declarative fashion. See this tutorial for more explanations, especially the discourse in the "Wrapping up" section. Also this gist might help.

IcedCoffeeScript with multiple callbacks

I'm using IcedCoffeeScript with Upshot.js to refresh multiple data sources. The refresh method has two callbacks, one for success and one for error, and I want to wait for each call to make either callback.
I can't see how to do this with IcedCoffeeScript without making an additional function. My question is: is there a more elegant way to defer to one of multiple callbacks?
This is the code I have currently:
refreshMe = (key, value, result) =>
  value.refresh(
    (success) =>
      result success
    ,
    (fail, reason, error) =>
      result undefined, fail
  )

@refresh = () =>
  success = {}
  fail = {}
  await
    for key, value of @dataSources
      refreshMe key, value, defer success[key], fail[key]
This is the only way I have found to do it too. I'm using it in Backbone and wrap (for example) a model's @save function with an @icedSave:
# An IcedCoffeeScript friendly version of save
icedSave: (callback) ->
  @save {},
    success: (model, response) -> callback(true, model, response)
    error: (model, response) -> callback(false, model, response)
Here's some code I use for converting Promises .then (-> onSuccess), (-> onError) to errbacks (err, result) ->:
# You can write like this:
await value.refresh esc defer e, result

# onError - function to be called when the promise is rejected.
# onSuccess - function to be called when the promise is fulfilled.
module.exports = esc = (onError, onSuccess) ->
  util = require 'util'
  return (result) ->
    if util.isError result
      # Always send back an error to the first handler.
      onError? result
    else if onSuccess?
      console.log onSuccess, result
      # `await fn esc done, defer result`
      onSuccess? result
    else
      # `await fn esc done`
      onError? null, result
You could modify the esc function a bit to handle multiple arguments for each callback.
The iced.Rendezvous lib is made explicitly for this case: returning at the first of multiple callbacks. From the docs:
Here is an example that shows off the different inputs and outputs of a Rendezvous. It does two parallel DNS lookups, and reports only when the first returns:
hosts = [ "okcupid.com", "google.com" ];
ips = errs = []
rv = new iced.Rendezvous
for h,i in hosts
dns.resolve hosts[i], rv.id(i).defer errs[i], ips[i]
await rv.wait defer which
console.log "#{hosts[which]} -> #{ips[which]}"