Editing My HTTP Call to Use Sockets (socket.io) to Receive Data via an Observable in my Angular 2 App - mongodb

Right now I have an http get call handling data coming from an api into my Angular 2 app. Now we're switching to using sockets via socket.io. I have been using an observable to get the data, and I know I can continue to do that while using socket.io sockets. But I'm having difficulty figuring out exactly what it should look like - i.e., how I need to edit my getByCategory function call to receive the data via a socket connection. This is what my getByCategory function currently looks like in my client-side Angular service:
private _url: string = 'https://api.someurl';

getByCategory() {
  return this._http.get(this._url)
    .map((response: Response) => response.json())
    .catch(this._errorsHandler);
}

_errorsHandler(error: Response) {
  console.error(error);
  return Observable.throw(error || "Server Error");
}
And, on the server side, this is what my function export looks like in our mongoDB setup (already set up to use sockets via socket.io):
exports.getByCategory = function(req, res, next) {
  let skip, limit, stage, ioOnly = false;
  let role = 'office_default';
  if (_.isUndefined(req.params)) {
    stage = req.stage;
    skip = parseInt(req.skip) || 0;
    limit = parseInt(req.limit) || 0;
    role = req.role;
    ioOnly = true;
  }
  else {
    stage = req.params.stage;
    skip = parseInt(req.query.skip) || 0;
    limit = parseInt(req.query.limit) || 0;
    role = req.query.role;
  }
  console.log(role);
  Category[role].find({'services.workflow.status': stage}).skip(skip).limit(limit).exec(function(err, doc) {
    if (err) { if (!ioOnly) { return next(err); } else { return res(err); } }
    else if (doc) ((!ioOnly) ? res.json(doc) : res(doc));
    else ((!ioOnly) ? res.sendStatus(204) : res(doc));
  });
};
How should I edit my getByCategory function to use socket.io instead of http in my service? Do I need an emit function coming from my api to act on in my Angular 2 service - or can I just adjust my current getByCategory function to use sockets within the existing observable instead?
I thought about editing the function to look something like this:
getByStage() {
  this.socket.on('getByCategory')
    .map((response: Response) => response.json())
    .catch(this._errorsHandler);
}
... but to do that I'd need the server function export to make it available via an "emit" or something similar, wouldn't I? Would it work if I did that? Am I missing something here?

If you need to work with a socket connection (like socket.io), you have to depend on callbacks rather than a single request/response, so you need to set up callback functions and expose their output through an Observable yourself.
A demo is given below:
import { Subject } from 'rxjs/Subject';
import { Observable } from 'rxjs/Observable';
import * as io from 'socket.io-client';

export class ChatService {
  private url = 'http://localhost:5000';
  private socket;

  sendMessage(message) {
    // push an event to the server over the existing connection
    this.socket.emit('add-message', message);
  }

  getMessages() {
    // wrap the socket callbacks in an Observable so components can subscribe
    let observable = new Observable(observer => {
      this.socket = io(this.url);
      this.socket.on('message', (data) => {
        observer.next(data);
      });
      // teardown runs when the subscriber unsubscribes
      return () => {
        this.socket.disconnect();
      };
    });
    return observable;
  }
}
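To tie that back to the question: yes, the server does need to listen for and respond to a socket event. A rough, untested sketch of the server side might look like the following, assuming the exported function is required as category (that name is an assumption, not from your code) and reusing the fact that the export already treats its second argument as a plain callback when ioOnly is true:

io.on('connection', function(socket) {
  socket.on('getByCategory', function(data, callback) {
    // `data` carries stage/skip/limit/role; the acknowledgement `callback`
    // becomes the `res` that the export invokes in its ioOnly branch
    category.getByCategory(data, callback);
  });
});

On the Angular side, the existing getByCategory could then wrap that acknowledgement in an Observable instead of using Http. Again just a sketch; the payload fields mirror what the export reads in ioOnly mode, and 'some_stage' is a placeholder:

getByCategory() {
  return new Observable(observer => {
    this.socket = io(this._url);
    this.socket.emit('getByCategory', { stage: 'some_stage', skip: 0, limit: 0, role: 'office_default' }, (doc) => {
      // the server's res(doc) -- or res(err), since both use the same callback -- arrives here
      observer.next(doc);
      observer.complete();
    });
    return () => this.socket.disconnect();
  });
}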
A complete tutorial of using Angular2 with socket.io is given here.
Hope you have your answer.

Related

Rx.Net - Publish method missing first few items when subscribing to Cold Observable

Inspired by Akavache I am trying to create a solution that provides me with an IObservable<IArticle>. The method essentially first tries to get all the articles that are present in the database, then it fetches updated articles from the web service, and as it receives the latest articles from the web service it saves them back to the database.
Since the web service is essentially a cold observable and I don't want to subscribe twice, I used Publish to connect to it. My understanding is that I am using the correct overload of the Publish method; however, many times the method tends to miss the first couple of articles from GetNewsArticles. This was observed through the UI and also through the Trace calls added in the code below.
Apart from solving the problem, it would be great to also understand how to debug/test this code (apart from introducing DI to inject NewsService).
public IObservable<IArticle> GetContents(string newsUrl, IScheduler scheduler)
{
    var newsService = new NewsService(new HttpClient());
    scheduler = scheduler ?? TaskPoolScheduler.Default;

    var fetchObject = newsService
        .GetNewsArticles(newsUrl, scheduler)
        .Do(x => Trace.WriteLine($"Parsing Articles {x.Title}"));

    return fetchObject.Publish(fetchSubject =>
    {
        var updateObs = fetchSubject
            .Do(x =>
            {
                // Save to database, all sync calls
            })
            .Where(x => false)
            .Catch(Observable.Empty<Article>());

        var dbArticleObs = Observable.Create<IArticle>(o =>
        {
            return scheduler.ScheduleAsync(async (ctrl, ct) =>
            {
                using (var session = dataBase.GetSession())
                {
                    var articles = await session.GetArticlesAsync(newsUrl, ct);
                    foreach (var article in articles)
                    {
                        o.OnNext(article);
                    }
                }
                o.OnCompleted();
            });
        });

        return
            dbArticleObs // First get all the articles from dataBase cache
                .Concat(fetchSubject // Get the latest articles from web service
                    .Catch(Observable.Empty<Article>())
                    .Merge(updateObs)) // Update the database with latest articles
                .Do(x => Trace.WriteLine($"Displaying {x.Title}"));
    });
}
UPDATE - Added GetArticles
public IObservable<IContent> GetArticles(string feedUrl, IScheduler scheduler)
{
    return Observable.Create<IContent>(o =>
    {
        scheduler = scheduler ?? DefaultScheduler.Instance;
        scheduler.ScheduleAsync(async (ctrl, ct) =>
        {
            try
            {
                using (var inputStream = await Client.GetStreamAsync(feedUrl))
                {
                    var settings = new XmlReaderSettings
                    {
                        IgnoreComments = true,
                        IgnoreProcessingInstructions = true,
                        IgnoreWhitespace = true,
                        Async = true
                    };
                    //var parsingState = ParsingState.Channel;
                    Article article = null;
                    Feed feed = null;
                    using (var reader = XmlReader.Create(inputStream, settings))
                    {
                        while (await reader.ReadAsync())
                        {
                            ct.ThrowIfCancellationRequested();
                            if (reader.IsStartElement())
                            {
                                switch (reader.LocalName)
                                {
                                    ...
                                    // parsing logic goes here
                                    ...
                                }
                            }
                            else if (reader.LocalName == "item" &&
                                     reader.NodeType == XmlNodeType.EndElement)
                            {
                                o.OnNext(article);
                            }
                        }
                    }
                    o.OnCompleted();
                }
            }
            catch (Exception e)
            {
                o.OnError(e);
            }
        });
        return Disposable.Empty;
    });
}
UPDATE 2
Sharing the link to source code here.
There are a few things I don't like about your code. I assume NewsService is an IDisposable, as it takes an HttpClient (which is disposable). You're not doing a proper clean-up.
Also, you haven't provided a complete method - because you've tried cutting it down for the question - but that makes it hard to reason about how to rewrite the code.
That said, the one thing that sticks out to me as quite horrid looking is the Observable.Create. Can you please try this code instead and see if it helps things work for you?
var dbArticleObs =
    Observable
        .Using(
            () => dataBase.GetSession(),
            session =>
                from articles in Observable.FromAsync(ct => session.GetArticlesAsync(newsUrl, ct))
                from article in articles
                select article);
Now, if that does, try rewriting fetchObject to use the same Observable.Using approach when newing up the NewsService.
In any case, it would be good if you could provide a complete implementation of GetContents, NewsService and your dataBase code in your question.

How to connect socket.io and rethinkdb?

After hours of trying, I haven't made it work.
Here's what I have.
var app = require('express')(),
    http = require('http').Server(app),
    io = require('socket.io')(http),
    r = require('rethinkdb');

http.listen(5000);
console.log('Server started on port 5000');

r.connect({db: 'testRealtime'}).then(function(c) {
  r.table('messages').insert(
    { message: "realtime" }
  )
  r.table('messages').changes().run(c)
    .then(function(cursor) {
      cursor.each(function(err, item) {
        io.emit('messages', item)
      })
    })
})
As you can see in the above example, I am trying to insert a message in realtime and look at it on the RethinkDB dashboard. But this doesn't work, and I don't know why.
Rethinkdb query r.db('testRealtime').table('messages').changes()
Since I'm using Angular 2, here's the service I created:
import * as io from 'socket.io-client'

export class ChatService {
  private url = 'http://localhost:5000'
  private socket;

  getMessages() {
    this.socket = io(this.url);
    this.socket.on('messages', function(data){
      console.log(data.new_val.query_engine)
    })
  }
}
In my component, I just call getMessages from the service. Nothing to worry about in the Angular code; I think it is more about the connection between socket.io and rethinkdb.
Any help would be appreciated. Thanks.
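Not a full answer, but one detail worth checking in the server snippet above: in the rethinkdb JavaScript driver a query does nothing until it is executed with .run(connection), and the insert above is never run. A hedged sketch with the same table and event names, not tested against this exact setup:

r.connect({ db: 'testRealtime' }).then(function(c) {
  // stream every change on the table to connected clients
  r.table('messages').changes().run(c).then(function(cursor) {
    cursor.each(function(err, item) {
      if (err) throw err;
      io.emit('messages', item);
    });
  });

  // the insert also has to be run against the connection to take effect
  r.table('messages').insert({ message: 'realtime' }).run(c);
});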

DotNetNuke Service API Authorization throwing 401 Unauthorized code

I am having a bit of difficulty figuring out why I am getting a 401 Unauthorized status from the service framework. At the moment I have it set up to allow everyone to do as they please, but that's only because when I try to enable authorization I get the 401 error code.
//[SupportedModules("Boards")]
//[DnnModuleAuthorize(AccessLevel = SecurityAccessLevel.View)]
[AllowAnonymous]
public class BoardsServiceController : DnnApiController
{ ... }
The strange thing is I have another module which is more than happy to work away with DnnModuleAuthorize
[SupportedModules("Assignments")]
[DnnModuleAuthorize(AccessLevel = SecurityAccessLevel.View)]
public class AsgnsServiceController : DnnApiController
{ ... }
In both cases I have checked to make sure the user has permissions to view the page on which the module lives.
I have cross referenced both projects and everything seems to be spot on. Yet one is working away just fine and the other one returns 401.
Any suggestions?
Update
For the Assignments module I am mostly using jQuery-style ajax requests, just because I haven't got around to revising the module. So a typical GET request would look something like this:
$.ajax({
  type: "GET",
  url: sf.getServiceRoot( "Assignments" ) + "AsgnsService/GetAssignments",
  data: data,
  beforeSend: sf.setModuleHeaders
}).done( function ( items ) {
  //removed for brevity
}).fail( function ( xhr, result, status ) {
  //removed for brevity
});
As for the Boards module, the code structure is slightly different due to the Knockout implementation. There is a dedicated ServiceCaller, but it all boils down to the same ajax call to the server, except that instead of having a full-blown ajax call defined as above it looks much neater.
var that = this;
that.serviceCaller = new dnn.boards.ServiceCaller($, this.moduleId, 'BoardsService');

var success = function (model) {
  if (typeof model !== "undefined" && model != null) {
    viewModel = new boardViewModel(model.colLists);
    ko.bindingHandlers.sortable.beforeMove = viewModel.verifyAssignments;
    ko.bindingHandlers.sortable.afterMove = viewModel.updateLastAction;
    // normally, we apply moduleScope as a second parameter
    ko.applyBindings(viewModel, settings.moduleScope);
  }
  //console.log('success', model);
};

var failure = function (response, status) {
  console.log('request failure: ' + status);
};

var params = {
  BoardId: this.boardId
};

that.serviceCaller.get('GetBoardLists', params, success, failure);
And the ServiceCaller ajax function itself looks like this:
function (httpMethod, method, params, success, failure, synchronous) {
  var options = {
    url: that.getRoot() + method,
    beforeSend: that.services.setModuleHeaders,
    type: httpMethod,
    async: synchronous == false,
    success: function (d) {
      if (typeof (success) != 'undefined') {
        success(d || {});
      }
    },
    error: function (xhr, textStatus, errorThrown) {
      if (typeof (failure) != 'undefined') {
        var message = undefined;
        if (xhr.getResponseHeader('Content-Type').indexOf('application/json') == 0) {
          try {
            message = $.parseJSON(xhr.responseText).Message;
          } catch (e) {
          }
        }
        failure(xhr, message || errorThrown);
      }
    }
  };
  if (httpMethod == 'GET') {
    options.data = params;
  } else {
    options.contentType = 'application/json; charset=utf-8';
    options.data = ko.toJSON(params);
    options.dataType = 'json';
  }
  $.ajax(options);
};
These are the two GET requests from the two different modules, where one is happy and the other throws a 401 status when I enable the same annotations.
Does this provide any clues?
Update
Now, having said all of the above, if one takes a look at the original Boards module code base one will notice the [DnnAuthorize] annotation attached to every function.
During module revision I removed all instances of [DnnAuthorize] annotation and replaced it with two of my own on the service class itself.
When I add [DnnAuthorize] as an annotation on the service class itself, things work as expected. So why doesn't the combination of [SupportedModules("Boards")] and [DnnModuleAuthorize(AccessLevel = SecurityAccessLevel.View)] work?
I am not sure, but when working with the Web API you have to register the Services Framework anti-forgery support:
ServicesFramework.Instance.RequestAjaxAntiForgerySupport();
This is part of asking the API to work with a specific module.
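On the client side, the sf object used in those ajax calls is typically obtained per module from the Services Framework jQuery plugin, which is what supplies the module/tab and anti-forgery headers the authorization attributes rely on. A sketch for illustration only; moduleId and boardId are placeholders for whatever the module view injects:

var sf = $.ServicesFramework(moduleId); // module-specific helper provided by DNN

$.ajax({
  type: "GET",
  url: sf.getServiceRoot("Boards") + "BoardsService/GetBoardLists",
  data: { BoardId: boardId },
  beforeSend: sf.setModuleHeaders // attaches ModuleId/TabId and anti-forgery headers
}).fail(function (xhr) {
  console.log('request failed with status ' + xhr.status);
});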

Meteor code must always run within a fiber when deploy in meteor server

I keep getting this error when I deploy my app onto the Meteor cloud server.
Meteor code must always run within a Fiber
at _.extend.get (app/packages/meteor/dynamics_nodejs.js:14:13)
at _.extend.apply (app/packages/livedata/livedata_server.js:1268:57)
at _.extend.call (app/packages/livedata/livedata_server.js:1229:17)
at Meteor.startup.Meteor.methods.streamTwit (app/server/server.js:50:24)
However, I have already wrapped the code within a Fiber:
streamTwit: function (twit){
  var userid = '1527228696';
  twit.stream(
    'statuses/filter',
    { follow: userid },
    function(stream) {
      stream.on('data', function(tweet) {
        Fiber(function(){
          if (tweet.user.id_str === userid)
          {
            Meteor.call('addQn', tweet);
          }
        }).run();
        console.log(tweet);
        console.log('---------------------------------------------------------');
        console.log(tweet.user.screen_name);
        console.log(tweet.user.name);
        console.log(tweet.text);
      });
    }
  );
}
I don't know the reason, but someone suggested that I should wrap it with Meteor.bindEnvironment instead. Hence, I did this:
streamTwit: function (twit){
  this.unblock(); // this doesn't seem to work
  console.log('... ... trackTweets');
  var _this = this;
  var userid = '1527228696';
  twit.stream(
    'statuses/filter',
    { follow: userid },
    function(stream) {
      stream.on('data', function(tweet) {
        Meteor.bindEnvironment(function () {
          if (tweet.user.id_str === userid)
          {
            Meteor.call('addQn', tweet);
          }
        }, function(e) {
          Meteor._debug("Exception from connection close callback:", e);
        });
        console.log(tweet);
        console.log('---------------------------------------------------------');
        console.log(tweet.user.screen_name);
        console.log(tweet.user.name);
        console.log(tweet.text);
      });
    }
  );
}

//add question method
addQn: function(tweet){
  questionDB.insert({'tweet': tweet, 'date': new Date()});
}
But now it doesn't even work. I realise this only happens when I try to insert some data into mongodb.
May I know what is the problem with my code? Thanks!
All these codes were written in app/server/server.js
You shouldn't need to use Meteor.call on the server side. That is for client-side code only. Just call addQn directly or better yet, inline it since it's just one line of code.
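As a rough, untested sketch of that suggestion, using the questionDB collection from the question: note that Meteor.bindEnvironment returns a wrapped function, so it needs to wrap the 'data' handler itself rather than be created inside it and never invoked.

streamTwit: function (twit) {
  var userid = '1527228696';
  twit.stream('statuses/filter', { follow: userid }, function (stream) {
    // wrap the whole handler so the insert runs inside a Fiber-bound environment
    stream.on('data', Meteor.bindEnvironment(function (tweet) {
      if (tweet.user.id_str === userid) {
        // insert directly instead of going through Meteor.call on the server
        questionDB.insert({ tweet: tweet, date: new Date() });
      }
    }, function (e) {
      Meteor._debug("Exception in tweet handler:", e);
    }));
  });
}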

How to get Meteor.Call to return value for template?

I've tried to understand this post regarding this concept, however, I'm failing to get it. I have the following simple setup:
/server/test.js
Meteor.methods({
  abc: function() {
    var result = {};
    result.foo = "Hello ";
    result.bar = "World!";
    return result;
  }
});
/client/myapp.js
var q = Meteor.call('abc');
console.log(q);
This structure logs undefined to the console.
If I change the myapp.js file to:
Meteor.call('abc', function(err, data) {
  !err ? console.log(data) : console.log(err);
});
I receive the Object in my console.
Ideally this is what I'd like to be able to do, but it doesn't work, stating in the console: Cannot read property 'greeting' of undefined
/client/myapp.js
var q = Meteor.call('abc');

Template.hello.greeting = function() {
  return q.foo;
}
Any help in passing the data from the server object into the template would be greatly appreciated. I'm still learning JavaScript & Meteor.
Thanks!
From the Meteor.call documentation:
On the client, if you do not pass a callback and you are not inside a stub, call will return undefined, and you will have no way to get the return value of the method. That is because the client doesn't have fibers, so there is not actually any way it can block on the remote execution of a method.
So, you'll want to do it like this:
Meteor.call('abc', function(err, data) {
  if (err)
    console.log(err);
  Session.set('q', data);
});

Template.hello.greeting = function() {
  var q = Session.get('q');
  // guard against the initial undefined value, before the method result arrives
  return q && q.foo;
};
This will reactively update the template once the data is available.
This happens because Npm.require has async behavior; that's the reason you have to write a callback for Meteor.call.
But there is a solution: install the npm package (mrt add npm) and you'll get a function named Meteor.sync(...), with which you can play both games, sync and async, in your Meteor.call().
Reference: http://www.sitepoint.com/create-a-meteor-app-using-npm-module/
You can get the return value of a Meteor method for use in a template by using a reactive variable. Check out the working demonstration on Meteorpad
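For reference, that approach might look roughly like this (assuming the reactive-var package has been added; the variable and helper names are just illustrative):

var q = new ReactiveVar(null);

Meteor.call('abc', function (err, data) {
  if (!err) {
    q.set(data); // setting the reactive var re-runs the helper below
  }
});

Template.hello.greeting = function () {
  var result = q.get(); // creates a reactive dependency on q
  return result && result.foo; // guard until the method result arrives
};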
I went for a ghetto solution, but it works for me, which is what matters to me. Below is my code, which, in concept, I think solves the OP's problem.
In the client's main.js:
Meteor.setInterval(function() {
  confirmLogin();
}, 5000);
This runs the confirmLogin() function every five seconds.
The confirmLogin function (in the client's main.js):
function confirmLogin() {
  Meteor.call('loggedIn', function (error, result) {
    Session.set("loggedIn", result);
  });
}
The loggedIn method (in the server's main.js):
loggedIn: function () {
  var toReturn = false;
  var userDetails = Meteor.user();
  if (typeof userDetails["services"] !== "undefined") {
    if (typeof userDetails["services"]["facebook"] != "undefined") {
      toReturn = true;
    }
  }
  return toReturn;
},
The relevant helper:
loggedIn: function () {
  return Session.get("loggedIn");
}