socket.io for react native (sending query problems)

I'm using this library and I can connect without problems. Usually when I've worked with sockets, the code I've used is:
socket = io.connect(url, { query: 'token=' + token });
and I can see this info by reading socket.request._query. Using socket.io for React Native, I'm trying to send params:
this.socket = new SocketIO('http://localhost:3000', { query: 'token=' + token });
but in socket.request._query I can only see this:
{ transport: 'polling', b64: '1' }
The library mentions some options like connectParams, but I don't know how to make that info show up.

It's not very detailed in the repo, but connectParams is a key/value object, and the values you send in it will be appended to the URL, as shown here:
if connectParams != nil {
    for (key, value) in connectParams! {
        let keyEsc = key.urlEncode()!
        let valueEsc = "\(value)".urlEncode()!
        queryString += "&\(keyEsc)=\(valueEsc)"
    }
}
Source
So you should try using connectParams like this (though I'm not sure how you tried it before):
this.socket = new SocketIO('http://localhost:3000', {
  connectParams: {
    myAwesomeQueryStringParam: "someRandomValue"
  }
});
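If that works, the appended param should be visible on the server. A minimal server-side sketch (my assumption, based on a plain Node socket.io server; not something from this repo):
// Hypothetical check on a standard Node socket.io server: query params
// appended to the connection URL show up in the handshake.
io.on('connection', function (socket) {
  console.log(socket.handshake.query.myAwesomeQueryStringParam);
});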
PS: Forgive me, my English is pretty bad.


How to return the results of a MongoDB cursor back to Actix-web for display?

I'm using Actix-web 4.0.0.rc3 along with MongoDB 2.1.0 (async mode). I've got the db integrated with Actix via app_data, but I'm trying to figure out how to display the results of a multi-document cursor.
Currently I'm using:
while let Some(doc) = cursor.try_next().await? {
    // processing doc results, and building a row of data
}
Spitting out the results is easy, but I can't figure out how to get those results to display in the browser. I've been searching for an example that explains how to do this, but haven't found a solution thus far. Using handlebars it was easy to get a field from the DB and display the results, but the cursor of unknown size has me stumped. I've done this easily with RoR or Phoenix/Elixir; I would appreciate any suggestions (or just a gentle nudge to an example that might already exist).
Thanks
I have this example:
pub async fn get_aggregation_as_json_response(
    db: &Database,
    aggr_pipeline: Vec<Document>,
) -> HttpResponse {
    let mut cursor = db
        .collection_with_type::<ExampleModel>("examplemodel")
        .aggregate(aggr_pipeline, None)
        .await
        .expect("Error performing aggregation on examplemodel collection.");
    let mut results: Vec<Document> = Vec::new();
    while let Some(result) = cursor.next().await {
        match result {
            Ok(document) => {
                results.push(document);
            }
            Err(_) => {
                return HttpResponse::InternalServerError().finish();
            }
        }
    }
    HttpResponse::Ok().json(results)
}
Is that what you are looking for?

Editing My HTTP Call to Use Sockets (socket.io) to Receive Data via an Observable in my Angular 2 App

Right now I have an HTTP GET call handling data coming from an API into my Angular 2 app. Now we're switching to using sockets via socket.io. I have been using an observable to get the data, and I know I can continue to do that while using socket.io sockets. But I'm having difficulty figuring out exactly what it should look like, i.e., how I need to edit my getByCategory function call to receive the data via a socket connection. This is what my getByCategory function currently looks like in my client-side Angular service:
private _url: string = 'https://api.someurl';

getByCategory() {
  return this._http.get(this._url)
    .map((response: Response) => response.json())
    .catch(this._errorsHandler);
}

_errorsHandler(error: Response) {
  console.error(error);
  return Observable.throw(error || "Server Error");
}
And on the server side, this is what my function export looks like in our MongoDB setup (already set up to use sockets via socket.io):
exports.getByCategory = function (req, res, next) {
  let skip, limit, stage, ioOnly = false;
  let role = 'office_default';
  if (_.isUndefined(req.params)) {
    stage = req.stage;
    skip = parseInt(req.skip) || 0;
    limit = parseInt(req.limit) || 0;
    role = req.role;
    ioOnly = true;
  } else {
    stage = req.params.stage;
    skip = parseInt(req.query.skip) || 0;
    limit = parseInt(req.query.limit) || 0;
    role = req.query.role;
  }
  console.log(role);
  Category[role].find({'services.workflow.status': stage}).skip(skip).limit(limit).exec(function (err, doc) {
    if (err) { if (!ioOnly) { return next(err); } else { return res(err); } }
    else if (doc) ((!ioOnly) ? res.json(doc) : res(doc));
    else ((!ioOnly) ? res.sendStatus(204) : res(doc));
  });
};
How should I edit my getByCategory function to use socket.io instead of HTTP in my service? Do I need an emit function coming from my API to act on in my Angular 2 service - or can I just adjust my current getByCategory function to use sockets within the existing observable instead?
I thought about editing the function to look something like this:
getByStage() {
  this.socket.on('getByCategory')
    .map((response: Response) => response.json())
    .catch(this._errorsHandler);
}
... but to do that I'd need the server function export to make it available via an "emit" or something similar, wouldn't I? Would it work if I did that? Am I missing something here?
If you need to work with a socket connection (like socket.io), you have to depend on callbacks. So you should set up callback functions to work with them.
A demo is given here:
import { Subject } from 'rxjs/Subject';
import { Observable } from 'rxjs/Observable';
import * as io from 'socket.io-client';

export class ChatService {
  private url = 'http://localhost:5000';
  private socket;

  sendMessage(message) {
    this.socket.emit('add-message', message);
  }

  getMessages() {
    let observable = new Observable(observer => {
      this.socket = io(this.url);
      this.socket.on('message', (data) => {
        observer.next(data);
      });
      return () => {
        this.socket.disconnect();
      };
    });
    return observable;
  }
}
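A component can then consume the service through the observable. A minimal sketch of the consuming side (hypothetical, not part of the demo; in a real Angular app the service would be injected):
// Hypothetical consumer of the ChatService above.
const chatService = new ChatService();
const subscription = chatService.getMessages().subscribe(message => {
  console.log('received:', message);
});

// Unsubscribing runs the teardown function returned inside the Observable,
// which disconnects the socket.
subscription.unsubscribe();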
A complete tutorial on using Angular 2 with socket.io is given here.
Hope you have your answer.

Using Restangular, can I use a jsonResultsAdapterProvider when needing to override the id field?

I've got a MySQL db with non-standard IDs and field names, so I was trying to use both jsonResultsAdapterProvider and setRestangularFields. Here's the code in my app.config file:
RestangularProvider.setBaseUrl(remoteServiceName);
RestangularProvider.setRestangularFields({id: 'personID'});

RestangularProvider.addResponseInterceptor(function (data, operation, what, url, response, deferred) {
  if (data.error) {
    return data.error;
  }
  var extractedData = data.result;
  return jsonResultsAdapterProvider.$get().camelizeKeys(extractedData);
});

RestangularProvider.addRequestInterceptor(function (elem, operation, what, url) {
  return jsonResultsAdapterProvider.$get().decamelizeKeys(elem);
});
It's all good until I try to do a put/save. When I look at the request payload within the browser dev tools, it's {"undefined":12842} (but the URL is correct, so I know the id is set). If I don't use the ResultsAdapter and change the id field to Person_ID, the payload looks good, so I know I'm making the right calls to get and save the Restangular objects. But for what it's worth, here's the code:
$scope.tests = Restangular.all('members').getList().$object;

vm.testEdit = function () {
  $scope.test = Restangular.one('members', 12842).get().then(function (test) {
    var copy = Restangular.copy(test);
    copy.title = 'xxxx';
    copy.put(); // payload was: undefined: 12842
  });
}

// I also tried customPUT...
// copy.customPUT(copy, '', {}, {'Content-Type':'application/x-www-form-urlencoded'});
I tried "fixing" the id other ways too, too. like this:
Restangular.extendModel('members', function (model) {
  model.id = model.personID;
  return model;
});
but that messed up the URLs, causing missing ids. And I tried getIdFromElem, but it only got called for my objects created with Restangular.one(), not with Restangular.all():
Restangular.configuration.getIdFromElem = function (elem) {
  console.log('custom getIdFromElem called');
  if (elem.route === 'members') { // this was never true
    return elem.personID;
  }
};
It seems like Restangular needs to substitute 'personID' most of the time, but maybe it needs 'Person_ID' at some point during the save? Any ideas on what I could try to get the save working?
I finally figured it out! The problem was in my config code and in the way I was decamelizing. Because of inconsistencies in my db field names (most use underscores, but some are already camelCase), I was storing the server's original element names in an array within the jsonResultsAdapterProvider. But since I was calling jsonResultsAdapterProvider.$get().camelizeKeys(extractedData) within the interceptors, I was reinstantiating the array each time I made a new request. So the undefined in the PUT request was coming from my decamelizeKeys() method.
My updated config code fixed the problem:
RestangularProvider.setBaseUrl(remoteServiceName);
RestangularProvider.setRestangularFields({id: 'personID'});
var jsonAdapter = jsonResultsAdapterProvider.$get();

RestangularProvider.addResponseInterceptor(function (data, operation, what, url, response, deferred) {
  if (data.error) {
    return data.error;
  }
  var extractedData = data.result;
  // return extractedData;
  return jsonAdapter.camelizeKeys(extractedData);
});

RestangularProvider.addRequestInterceptor(function (elem, operation, what, url) {
  return jsonAdapter.decamelizeKeys(elem);
});
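To make the failure mode concrete, here is a minimal, hypothetical sketch of a key-mapping adapter of this kind (not the real jsonResultsAdapterProvider); it shows why a freshly instantiated adapter decamelizes to an "undefined" key, exactly like the payload above:
// Sketch: the adapter remembers the server's original key names across
// requests; re-creating it per request loses that map.
function makeKeyAdapter() {
  var originalKeys = {}; // camelized name -> server's original name

  function camelize(key) {
    return key.toLowerCase().replace(/_(.)/g, function (m, c) {
      return c.toUpperCase();
    });
  }

  return {
    camelizeKeys: function (obj) {
      var out = {};
      Object.keys(obj).forEach(function (key) {
        var camel = camelize(key);
        originalKeys[camel] = key; // remember the original name
        out[camel] = obj[key];
      });
      return out;
    },
    decamelizeKeys: function (obj) {
      var out = {};
      Object.keys(obj).forEach(function (key) {
        out[originalKeys[key]] = obj[key]; // undefined if the map was lost
      });
      return out;
    }
  };
}

var adapter = makeKeyAdapter();
var elem = adapter.camelizeKeys({ Person_ID: 12842 }); // { personId: 12842 }
adapter.decamelizeKeys(elem);                          // { Person_ID: 12842 }
makeKeyAdapter().decamelizeKeys(elem);                 // { undefined: 12842 }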

Apigee push notification

Apigee's push notification is documented here: http://apigee.com/docs/api-baas/content/introducing-push-notifications
I tried this with the JS SDK that Apigee provides here: http://apigee.com/docs/app-services/content/installing-apigee-sdk-javascript. It looks like only the client can generate a notification to itself?
But I have a scenario where I would like to push notifications to multiple clients from a nodejs job that runs once every hour. Something like this, but from the nodejs SDK, not from the JS SDK:
var devicePath = "devices;ql=*/notifications";
How do I do this?
As remus points out above, you can do this with the usergrid module (https://www.npmjs.com/package/usergrid).
You are basically trying to construct an API call that looks like this (sending a message by referencing a device):
https://api.usergrid.com/myorg/myapp/devices/deviceUUID/notifications?access_token=access_token_goes_here '{"payloads":{"androidDev":"Hello World!!"}}'
Or like this (sending a message by referencing a user who is connected to a device):
https://api.usergrid.com/myorg/myapp/users/fred/notifications?access_token=access_token_goes_here '{"payloads":{"androidDev":"Hello World!!"}}'
You can do this with code that looks something like this:
var options = {
  method: 'POST',
  endpoint: 'devices/deviceUUID/notifications',
  body: { 'payloads': { 'androidDev': 'Hello World!!' } }
};

client.request(options, function (err, data) {
  if (err) {
    // error - POST failed
  } else {
    // data will contain raw results from API call
    // success - POST worked
  }
});
or
var options = {
  method: 'POST',
  endpoint: 'users/fred/notifications',
  body: { 'payloads': { 'androidDev': 'Hello World!!' } }
};

client.request(options, function (err, data) {
  if (err) {
    // error - POST failed
  } else {
    // data will contain raw results from API call
    // success - POST worked
  }
});
Note: the second call, which posts to the users/username/notifications endpoint, assumes that you have already made a connection between the user and their device (e.g. POST /users/fred/devices/deviceUUID).
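If you haven't made that connection yet, you could create it with the same client.request pattern. A sketch (the endpoint comes from the note above; the rest is assumed):
// Hypothetical sketch: connect user "fred" to a device so that posts to
// users/fred/notifications can reach it.
var connectOptions = {
  method: 'POST',
  endpoint: 'users/fred/devices/deviceUUID'
};

client.request(connectOptions, function (err, data) {
  if (err) {
    // error - the connection was not created
  } else {
    // success - notifications addressed to fred can now reach this device
  }
});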

How to stream MongoDB Query Results with nodejs?

I have been searching for an example of how I can stream the result of a MongoDB query to a nodejs client. All solutions I have found so far seem to read the query result at once and then send the result back to the caller.
Instead, I would (obviously) like to supply a callback to the query method and have MongoDB call that when the next chunk of the result set is available.
I have been looking at mongoose - should I perhaps use a different driver?
Jan
node-mongodb-driver (the underlying layer that every MongoDB client uses in nodejs), besides the cursor API that others mentioned, has a nice stream API (#458). Unfortunately I did not find it documented elsewhere.
Update: there are docs.
It can be used like this:
var stream = collection.find().stream();

stream.on('error', function (err) {
  console.error(err);
});

stream.on('data', function (doc) {
  console.log(doc);
});
It actually implements the ReadableStream interface, so it has all the goodies (pause/resume, etc.).
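For example, you can forward documents to an HTTP response as they arrive instead of buffering the whole result. A sketch, assuming collection is already connected as above:
// Stream each matching document straight into the HTTP response.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  var stream = collection.find().stream();
  stream.on('data', function (doc) {
    res.write(JSON.stringify(doc) + '\n');
  });
  stream.on('error', function (err) {
    res.end();
  });
  stream.on('end', function () {
    res.end();
  });
}).listen(8000);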
Streaming in Mongoose became available in version 2.4.0, which appeared three months after you posted this question:
Model.where('created').gte(twoWeeksAgo).stream().pipe(writeStream);
More elaborate examples can be found on their documentation page.
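If you'd rather handle documents one by one than pipe the raw stream, the same stream emits events. A sketch under the same assumptions (Model and twoWeeksAgo as above):
var stream = Model.where('created').gte(twoWeeksAgo).stream();

stream.on('data', function (doc) {
  // handle each document as it arrives
});
stream.on('error', function (err) {
  // handle the error
});
stream.on('close', function () {
  // the cursor is exhausted
});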
mongoose is not really a "driver"; it's actually an ORM wrapper around the MongoDB driver (node-mongodb-native).
To do what you're doing, take a look at the driver's .find and .each methods. Here's some code from the examples:
// Find all records. find() returns a cursor
collection.find(function (err, cursor) {
  sys.puts("Printing docs from Cursor Each");
  cursor.each(function (err, doc) {
    if (doc != null) sys.puts("Doc from Each " + sys.inspect(doc));
  });
});
To stream the results, you basically replace that sys.puts with your "stream" function. I'm not sure how you plan to stream the results; I think you can do response.write() + response.flush(), but you may also want to check out socket.io.
Here is the solution I found (please correct me, anyone, if this is the wrong way to do it):
(Also excuse the bad coding - too late for me now to prettify this.)
var sys = require('sys');
var http = require("http");
var Db = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Db,
    Connection = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Connection,
    Collection = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Collection,
    Server = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Server;

var db = new Db('test', new Server('localhost', Connection.DEFAULT_PORT, {}));
var products;

db.open(function (error, client) {
  if (error) throw error;
  products = new Collection(client, 'products');
});

function ProductReader(collection) {
  this.collection = collection;
}

ProductReader.prototype = new process.EventEmitter();

ProductReader.prototype.do = function () {
  var self = this;
  this.collection.find(function (err, cursor) {
    if (err) {
      self.emit('e1');
      return;
    }
    sys.puts("Printing docs from Cursor Each");
    self.emit('start');
    cursor.each(function (err, doc) {
      if (err) {
        self.emit('e2');
        self.emit('end');
        return;
      }
      if (doc != null) {
        sys.puts("doc:" + doc.name);
        self.emit('doc', doc);
      } else {
        self.emit('end');
      }
    });
  });
};

http.createServer(function (req, res) {
  pr = new ProductReader(products);
  pr.on('e1', function () {
    sys.puts("E1");
    res.writeHead(400, { "Content-Type": "text/plain" });
    res.write("e1 occurred\n");
    res.end();
  });
  pr.on('e2', function () {
    sys.puts("E2");
    res.write("ERROR\n");
  });
  pr.on('start', function () {
    sys.puts("START");
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.write("<products>\n");
  });
  pr.on('doc', function (doc) {
    sys.puts("A DOCUMENT" + doc.name);
    res.write("<product><name>" + doc.name + "</name></product>\n");
  });
  pr.on('end', function () {
    sys.puts("END");
    res.write("</products>");
    res.end();
  });
  pr.do();
}).listen(8000);
I have been studying mongodb streams myself. While I do not have the entire answer you are looking for, I do have part of it.
You can set up a socket.io stream. This uses javascript socket.io and socket.io-streaming, available at NPM, plus mongodb for the database (because using a 40-year-old database that has issues is incorrect - time to modernize; also, the 40-year-old db is SQL, and SQL doesn't do streams to my knowledge).
So although you only asked about data going from server to client, I also want to cover client to server in my answer, because I can NEVER find it anywhere when I search, and I wanted to set up one place with both the send and receive elements via stream so everyone could get the hang of it quickly.
Client side sending data to the server via streaming:
stream = ss.createStream();
blobstream = ss.createBlobReadStream(data);
blobstream.pipe(stream);
ss(socket).emit('data.stream', stream, {}, function (err, successful_db_insert_id) {
  // if you get back the id, it went into the db and everything worked
});
Server receiving the stream from the client side and then replying when done:
ss(socket).on('data.stream', function (stream, o, c) {
  var buffer = [];
  stream.on('data', function (chunk) { buffer.push(chunk); });
  stream.on('end', function () {
    buffer = Buffer.concat(buffer);
    db.insert(buffer, function (err, res) {
      // pass the inserted id back to the client's callback
      c(null, res.insertedIds[0]);
    });
  });
});
This is the other half of that: fetching the data and streaming it to the client.
Client side requesting and receiving stream data from the server:
stream = ss.createStream();
binarystring = '';
stream.on('data', function (chunk) {
  for (var i = 0; i < chunk.length; i++) {
    binarystring += String.fromCharCode(chunk[i]);
  }
});
stream.on('end', function () {
  data = window.btoa(binarystring);
  c(null, data);
});
ss(socket).emit('data.stream.get', stream, o, c);
Server side replying to the request for streaming data:
ss(socket).on('data.stream.get', function (stream, o, c) {
  stream.on('end', function () {
    c(null, true);
  });
  db.find().stream().pipe(stream);
});
The very last one there is the only one where I am kind of just pulling it out of my butt, because I have not yet tried it, but it should work. I actually do something similar, but I write the file to the hard drive and then use fs.createReadStream to stream it to the client. So I'm not sure if it's 100%, but from what I read it should be; I'll get back to you once I test it.
P.S. Anyone who wants to bug me about my colloquial way of talking: I'm Canadian, and I love saying "eh". Come at me with your hugs and hits, bros/sis' :D