Firebase transactions via REST API

I find transactions (https://www.firebase.com/docs/transactions.html) to be a cool way of handling concurrency, but it seems they can only be done from clients.
The way we use Firebase is mainly by writing data from our servers and observing it on clients. Is there a way to achieve an optimistic concurrency model when writing data via the REST API?
Thanks!

You could utilize an update counter to make write ops work in a similar way to transactions. (I'm going to use some pseudo-code below; sorry about that, but I didn't want to write out a full REST API example.)
For example, if I have an object like this:
{
   total: 100,
   update_counter: 0
}
And a write rule like this:
{
   ".write": "newData.hasChild('update_counter')",
   "update_counter": {
      ".validate": "newData.val() === data.val()+1"
   }
}
I could now prevent concurrent modifications by simply passing in the update_counter with each operation. For example:
var url = 'https://<INSTANCE>.firebaseio.com/path/to/data.json';

addToTotal(url, 25, function(data) {
   console.log('new total is ' + data.total);
});

function addToTotal(url, amount, next) {
   getCurrentValue(url, function(current) {
      var data = { total: current.total + amount, update_counter: current.update_counter + 1 };
      setCurrentValue(url, data, next, addToTotal.bind(null, url, amount, next));
   });
}

function getCurrentValue(url, next) {
   // var data = (results of GET request to the URL)
   next(data);
}

function setCurrentValue(url, data, next, retryMethod) {
   // set the data with a PUT request to the URL
   // if the PUT fails with 403 (permission denied) then
   // we assume there was a concurrent edit and we need
   // to try our pseudo-transaction again
   // we have to assume that permission_denied does not
   // occur for any other reason, so we might want some extra checking, fallbacks,
   // or a max number of retries here
   // var statusCode = (server's response code to PUT request)
   if (statusCode === 403) {
      retryMethod();
   }
   else {
      next(data);
   }
}
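To make the pseudo-code concrete, here is a minimal sketch of what those helpers might collapse into against the REST API, using Node's built-in fetch (Node 18+). The retry cap and the 401/403 handling are my assumptions, since the exact status code for a rule rejection depends on your setup:

// Hypothetical concrete version of the helpers above (untested sketch).
async function addToTotal(url, amount, maxRetries) {
   for (var attempt = 0; attempt < (maxRetries || 10); attempt++) {
      // GET the current state, including the counter.
      var current = await (await fetch(url)).json();
      var updated = {
         total: current.total + amount,
         update_counter: current.update_counter + 1
      };
      // PUT the new state; the .validate rule rejects stale counters.
      var res = await fetch(url, {
         method: 'PUT',
         headers: { 'Content-Type': 'application/json' },
         body: JSON.stringify(updated)
      });
      if (res.ok) return updated;                      // write accepted
      if (res.status !== 401 && res.status !== 403) {  // assumption: anything else is a real error
         throw new Error('PUT failed: ' + res.status);
      }
      // Permission denied: assume a concurrent edit bumped the counter; loop and retry.
   }
   throw new Error('gave up after too many concurrent edits');
}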

FYI, the Firebase Realtime Database officially supports this now.
Read the blog post and the docs for more info.
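For the REST API, the supported mechanism is conditional requests with ETags: request an ETag alongside a read, send it back with your write, and the write fails with 412 Precondition Failed if the data changed in between. A rough sketch (untested; the helper name is mine, header names per the Realtime Database REST docs):

async function addWithEtag(url, amount) {
   // Read the current value together with its ETag.
   var res = await fetch(url, { headers: { 'X-Firebase-ETag': 'true' } });
   var etag = res.headers.get('ETag');
   var current = await res.json();

   // Conditional write: succeeds only if nobody wrote in the meantime.
   var write = await fetch(url, {
      method: 'PUT',
      headers: { 'if-match': etag },
      body: JSON.stringify({ total: current.total + amount })
   });
   if (write.status === 412) {
      // Precondition failed: data changed concurrently; re-read and retry.
      return addWithEtag(url, amount);
   }
}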

Check out the Firebase-Transactions project: https://github.com/vacuumlabs/firebase-transactions
I believe this may be quite handy for your case, especially if you do a lot of writes from the server.
(disclaimer: I'm one of the authors)

Related

How can I catch errors in my Firebase function when setting a document fails?

I have a Firebase Cloud Function that creates a user document with user data whenever a user registers. How would I return an error when set() fails? Since this is not an HTTP request (and I don't want to use an HTTP request in this case) I have no response object. So how do I catch errors?
export const onUserCreated = functions.region('europe-west1').auth.user().onCreate(async user => {
   const privateUserData = {
      phoneNumber: user.phoneNumber
   };
   const publicUserData = {
      name: 'Nameless'
   };
   try {
      await firestore.doc('users').collection('private').doc('data').set(privateUserData);
   } catch (error) {
      // What do I put here?
   }
   try {
      await firestore.doc('users').collection('public').doc('data').set(publicUserData);
   } catch (error) {
      // What do I put here?
   }
});
You can't "return" an error, since the client doesn't even "know" about this function running, there is nobody to respond to.
You can make a registration collection, and in your function make a document there for the current user (using the uid as the document id). In that document, you can put any information you'd like your user to know (status, errors, etc).
So your clients would have to add a listener to this document to learn about their registration.
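A minimal sketch of that idea (the registration collection name, the status fields, and the uid variable on the client are placeholders I made up):

// In the function's catch block: record the failure where the client can see it.
await firestore.collection('registration').doc(user.uid).set({
   status: 'error',
   message: error.message
});

// On the client: listen for the outcome of the registration.
firebase.firestore().collection('registration').doc(uid)
   .onSnapshot(doc => {
      console.log('registration status:', doc.data());
   });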
In your particular code, I think the error is in doc('users'). I guess you meant doc('users/' + user.uid).
Your catch block will receive any errors that occur in your set call:
try {
   await firestore.doc('users').collection('public').doc('data').set(publicUserData);
} catch (error) {
   // here you have the error info
}
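Since there is no response to send in a background trigger, the usual move inside that catch block is to log the error so it shows up in the function's logs. A sketch, meant to sit inside the onCreate trigger above (functions.logger is the firebase-functions logging API):

try {
   await firestore.doc('users/' + user.uid).collection('private').doc('data').set(privateUserData);
} catch (error) {
   // Surfaces in the Cloud Functions log with severity "error".
   functions.logger.error('Failed to create user documents', { uid: user.uid, error });
}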

How to decorate Siesta request with an asynchronous task

What is the correct way to alter a Request by performing an asynchronous task before the Request happens?
So any request Rn needs to transparently become Tn, then Rn.
A little background: the task is a 3rd-party SDK that dispatches a token I need to use as a header for the original request.
My idea is to decorate Rn, but to do this I need to convert my Tn task into a Siesta Request I can then chain.
So I wrapped the asynchronous task and chained it to my original request.
Thus any Rn turns into Tn.chained { .passTo(Rn) }.
This way, the new behaviour is entirely transparent to the whole application.
The problem
Doing this, my code ends up crashing on a Siesta internal precondition:
precondition(completedValue == nil, "notifyOfCompletion() already called")
In my custom AsyncTaskRequest I collect the callbacks for success, failure, progress, etc., in order to trigger them on the main queue when the SDK delivers the token.
I noticed that if I remove all the stored callbacks once they are executed, the crash disappears, but honestly I haven't found the reason why.
I hope there is enough information here for some hints or suggestions.
Thank you in advance.
Yes, implementing Siesta's Request interface is no picnic. Others have had exactly the same problem, and luckily Siesta version 1.4 includes a solution.
Documentation for the new feature is still thin. To use the new API, you'll implement the new RequestDelegate protocol and pass your implementation to Resource.prepareRequest(using:). That returns a request you can use in a standard Siesta request chain. The result will look something like this (WARNING: untested code):
struct MyTokenHandlerThingy: RequestDelegate {
   // 3rd-party SDK glue goes here
}

...

service.configure(…) {
   if let authToken = self.authToken {
      $0.headers["X-Auth-Token"] = authToken  // authToken is an instance var or something
   }
   $0.decorateRequests {
      self.refreshTokenOnAuthFailure(request: $1)
   }
}

func refreshTokenOnAuthFailure(request: Request) -> Request {
   return request.chained {
      guard case .failure(let error) = $0.response,  // Did request fail…
            error.httpStatusCode == 401 else {       // …because of expired token?
         return .useThisResponse                     // If not, use the response we got.
      }
      return .passTo(
         self.refreshAuthToken().chained {           // If so, first request a new token, then:
            if case .failure = $0.response {         // If token request failed…
               return .useThisResponse               // …report that error.
            } else {
               return .passTo(request.repeated())    // We have a new token! Repeat the original request.
            }
         }
      )
   }
}

func refreshAuthToken() -> Request {
   return Request.prepareRequest(using: MyTokenHandlerThingy())
      .onSuccess {
         self.authToken = $0.jsonDict["token"] as? String  // Store the new token, then…
         self.invalidateConfiguration()                    // …make future requests use it
      }
}
To understand how to implement RequestDelegate, your best bet for now is to look at the new API docs directly in the code.
Since this is a brand-new feature not yet released, I'd greatly appreciate a report on how it works for you and any troubles you encounter.

How can I leverage reactive extensions to do caching, without a subject?

I want to be able to fetch data from an external API for a specific request, but when that data is returned, also make it available in the cache to represent the current state of the application.
This solution seems to work:
var Rx = require('rx');
var cached_todos = new Rx.ReplaySubject(1);
var api = {
   refresh_and_get_todos: function() {
      var fetch_todos = Rx.Observable.fromCallback($.get('example.com/todos'));
      return fetch_todos()
         .tap(todos => cached_todos.onNext(todos));
   },
   current_todos: function() {
      return cached_todos;
   }
};
But apparently Subjects are bad practice in Rx, since they don't really follow functional reactive programming.
What is the right way to do this in a functional reactive programming way?
It is recommended not to use Subjects because there is a tendency to abuse them to inject side effects, as you have done here. They are perfectly valid as a way of pushing values into a stream, but their scope should be tightly constrained to avoid bleeding state into other areas of code.
Here is a first refactoring. Notice that you can create the source beforehand, and then your api code just wraps it up in a neat little bow:
var api = (function() {
   var fetch_todos = Rx.Observable.fromCallback($.get('example.com/todos')),
       source = new Rx.Subject(),
       cached_todos = source
          .flatMapLatest(function() {
             return fetch_todos();
          })
          .replay(null, 1)
          .refCount();
   return {
      refresh: function() {
         source.onNext(null);
      },
      current_todos: function() {
         return cached_todos;
      }
   };
})();
The above is alright: it maintains your current interface, and side effects and state have been contained. But we can do better. We can create an extension method (or a static method) that accepts an Observable, and simplify even further to something along the lines of:
// Executes the function and caches the last result every time the source emits
Rx.Observable.prototype.withCache = function(fn, count) {
   return this.flatMapLatest(function() {
      return fn();
   })
   .replay(null, count || 1)
   .refCount();
};

// Later we would use it like so:
var todos = Rx.Observable.fromEvent(/* button click or whatever */)
   .withCache(
      Rx.Observable.fromCallback($.get('example.com/todos')),
      1 /* cache size */);

todos.subscribe(/* update state */);

Two Facebook API asynchronous calls

I have an application in which I have to call
FB.api('/me')
FB.api('/me/friends')
I want, once the user is authenticated, to redirect to a new page, for which I will be using window.location.href = (xx).
So my pseudocode is:
checkuserdetails() {
   FB.api('/me');
   FB.api('/me/friends');
   window.location.href = (xx);
}
I have alerts in the callback functions, and they never get executed before the page is redirected. As suggested by CBroe, I have put the window.location.href in the callback.
But that also has issues.
It's like this:
Pseudocode:
checkuserdetails() {
   FB.api('/me');
   FB.api('/me/friends', function() { window.location.href = (xx); });
}
The alerts in the /me/friends callback never get executed.
Pseudocode:
checkuserdetails() {
   FB.api('/me', function() { window.location.href = (xx); });
   FB.api('/me/friends');
}
Both alerts are executed.
My question:
Sometimes I see the alerts from /me/friends executing first and sometimes the ones from /me. I understand FB is asynchronous; how do I make it sequential? If I have a lot of FB.api calls (more than 2), where do I put the window.location, and how do I control the asynchronous flow?
All API calls are asynchronous, so if you need to execute something after all of them have completed, you need to either make them execute in series (not advised, for performance reasons) or in parallel with synchronization.
For this you can use patterns such as Promises or Futures, or something as simple as:
var results = [], done = 0;

function allDone(d1, d2) {
   // do something with d1 and d2
}

FB.api('/...', {..}, function(data) {
   results[0] = data;
   done++;
   if (done === 2) {
      allDone.apply(null, results);
   }
});

FB.api('/...', {..}, function(data) {
   results[1] = data;
   done++;
   if (done === 2) {
      allDone.apply(null, results);
   }
});
This can of course be abstracted into some helper functions.
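For example, here is a rough sketch of the Promise variant mentioned above: wrap each FB.api call in a Promise and redirect only after all of them have resolved (the target URL is a placeholder):

// Promisify a single Graph API call.
function fbGet(path) {
   return new Promise(function(resolve, reject) {
      FB.api(path, function(response) {
         if (!response || response.error) {
            reject(response ? response.error : new Error('FB.api failed'));
         } else {
            resolve(response);
         }
      });
   });
}

// Run both calls in parallel; redirect once both callbacks have fired.
Promise.all([fbGet('/me'), fbGet('/me/friends')]).then(function(results) {
   var me = results[0], friends = results[1];
   console.log(me.name, friends.data.length);
   window.location.href = 'nextpage.html';  // placeholder target
});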

How to stream MongoDB Query Results with nodejs?

I have been searching for an example of how to stream the results of a MongoDB query to a nodejs client. All solutions I have found so far seem to read the entire query result at once and then send it to the client in one piece.
Instead, I would (obviously) like to supply a callback to the query method and have MongoDB call it when the next chunk of the result set is available.
I have been looking at mongoose - should I use a different driver instead?
Jan
node-mongodb-native (the underlying driver that every MongoDB client uses in nodejs) has, besides the cursor API that others mentioned, a nice stream API (#458). Unfortunately I did not find it documented elsewhere.
Update: there are docs now.
It can be used like this:
var stream = collection.find().stream();
stream.on('error', function (err) {
   console.error(err);
});
stream.on('data', function (doc) {
   console.log(doc);
});
It actually implements the ReadableStream interface, so it has all the goodies (pause/resume, etc.).
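For instance, pause/resume gives you backpressure when each document needs slow, asynchronous processing (a sketch; doWork is a hypothetical async job):

stream.on('data', function (doc) {
   stream.pause();               // stop the cursor while we handle this doc
   doWork(doc, function () {
      stream.resume();           // pull the next document once we're done
   });
});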
Streaming in Mongoose became available in version 2.4.0, which appeared three months after you posted this question:
Model.where('created').gte(twoWeeksAgo).stream().pipe(writeStream);
More elaborated examples can be found on their documentation page.
mongoose is not really a "driver"; it's actually an ORM wrapper around the MongoDB driver (node-mongodb-native).
To do what you want, take a look at the driver's .find and .each methods. Here's some code from the examples:
// Find all records. find() returns a cursor
collection.find(function(err, cursor) {
   sys.puts("Printing docs from Cursor Each");
   cursor.each(function(err, doc) {
      if (doc != null) sys.puts("Doc from Each " + sys.inspect(doc));
   });
});
To stream the results, you basically replace that sys.puts with your "stream" function. I'm not sure how you plan to stream the results; I think you can do response.write() + response.flush(), but you may also want to check out socket.io.
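To make that concrete, a minimal sketch that writes each document to the HTTP response as the cursor yields it, instead of buffering the whole result (the http and collection wiring are assumed from the surrounding snippets):

http.createServer(function(req, res) {
   res.writeHead(200, { 'Content-Type': 'application/json' });
   collection.find(function(err, cursor) {
      cursor.each(function(err, doc) {
         if (doc != null) {
            res.write(JSON.stringify(doc) + '\n');  // one document per chunk
         } else {
            res.end();  // a null doc means the cursor is exhausted
         }
      });
   });
}).listen(8000);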
Here is the solution I found (please correct me if this is the wrong way to do it):
(Also excuse the bad coding - it's too late for me to prettify this now.)
var sys = require('sys');
var http = require('http');
var mongodb = require('mongodb');
var Db = mongodb.Db,
    Connection = mongodb.Connection,
    Collection = mongodb.Collection,
    Server = mongodb.Server;

var db = new Db('test', new Server('localhost', Connection.DEFAULT_PORT, {}));
var products;

db.open(function (error, client) {
   if (error) throw error;
   products = new Collection(client, 'products');
});

function ProductReader(collection) {
   this.collection = collection;
}

ProductReader.prototype = new process.EventEmitter();

ProductReader.prototype.do = function() {
   var self = this;
   this.collection.find(function(err, cursor) {
      if (err) {
         self.emit('e1');
         return;
      }
      sys.puts("Printing docs from Cursor Each");
      self.emit('start');
      cursor.each(function(err, doc) {
         if (err) {
            self.emit('e2');
            self.emit('end');
            return;
         }
         if (doc != null) {
            sys.puts("doc:" + doc.name);
            self.emit('doc', doc);
         } else {
            self.emit('end');
         }
      });
   });
};

http.createServer(function(req, res) {
   var pr = new ProductReader(products);
   pr.on('e1', function() {
      sys.puts("E1");
      res.writeHead(400, { "Content-Type": "text/plain" });
      res.write("e1 occurred\n");
      res.end();
   });
   pr.on('e2', function() {
      sys.puts("E2");
      res.write("ERROR\n");
   });
   pr.on('start', function() {
      sys.puts("START");
      res.writeHead(200, { "Content-Type": "text/plain" });
      res.write("<products>\n");
   });
   pr.on('doc', function(doc) {
      sys.puts("A DOCUMENT" + doc.name);
      res.write("<product><name>" + doc.name + "</name></product>\n");
   });
   pr.on('end', function() {
      sys.puts("END");
      res.write("</products>");
      res.end();
   });
   pr.do();
}).listen(8000);
I have been studying mongodb streams myself; while I do not have the entire answer you are looking for, I do have part of it.
You can set up a socket.io stream.
This uses socket.io and socket.io-stream, both available on npm, plus mongodb for the database (because using a 40-year-old database that has issues is incorrect, time to modernize; also the 40-year-old db is SQL, and SQL doesn't do streams to my knowledge).
So although you only asked about data going from server to client, I also want to cover client to server in my answer, because I can never find it anywhere when I search, and I wanted one place with both the send and receive sides of streaming so everyone can get the hang of it quickly.
Client side, sending data to the server via streaming:
var stream = ss.createStream();
var blobstream = ss.createBlobReadStream(data);
blobstream.pipe(stream);
ss(socket).emit('data.stream', stream, {}, function(err, successful_db_insert_id) {
   // if you get back the id, it went into the db and everything worked
});
Server side, receiving the stream from the client and replying when done:
ss(socket).on('data.stream', function(stream, o, c) {
   var buffer = [];
   stream.on('data', function(chunk) { buffer.push(chunk); });
   stream.on('end', function() {
      buffer = Buffer.concat(buffer);
      db.insert(buffer, function(err, res) {
         // pass the new document's id back to the client's ack callback
         c(err, res && res[0] && res[0]._id);
      });
   });
});
// This is the other half: fetching data and streaming it to the client.
Client side, requesting and receiving stream data from the server:
var stream = ss.createStream();
var binarystring = '';
stream.on('data', function(chunk) {
   for (var i = 0; i < chunk.length; i++) {
      binarystring += String.fromCharCode(chunk[i]);
   }
});
stream.on('end', function() { var data = window.btoa(binarystring); c(null, data); });
ss(socket).emit('data.stream.get', stream, o, c);
Server side, replying to the request for streaming data:
ss(socket).on('data.stream.get', function(stream, o, c) {
   stream.on('end', function() {
      c(null, true);
   });
   db.find().stream().pipe(stream);
});
The very last one is the only one where I am kind of just pulling it out of my butt, because I have not yet tried it, but it should work. I actually do something similar, except I write the file to the hard drive and then use fs.createReadStream to stream it to the client. So not 100% sure, but from what I've read it should be fine; I'll get back to you once I test it.
P.S. If anyone wants to bug me about my colloquial way of talking: I'm Canadian, and I love saying "eh". Come at me with your hugs and hits, bros/sis' :D