Notification between classes using Observable - system.reactive

I have a class with a list of users from a server.
Other classes can manipulate the list on the server, e.g. by calling add or delete operations.
My Core-Class has a reference to the other classes which are manipulating the list on the server.
I would like to:
Make an init call in the Core-Class to get the list at the beginning.
The Core-Class will be notified by plugins each time the list is manipulated on the server, so the Core-Class gets the list from the server again.
Notify other classes that the list was reloaded and forward the new list.
My structure
Core {
    users: [];
    plugin1: Plugin;
    plugin2: Plugin;

    // Get a new list of users from the server
    loadUsers() {
        userService.loadUsers().then(function (res) {
            this.users = res;
        })
    }
}
Plugin {
    // Sends a request to the server to create a special user,
    // depending on the plugin implementation
    createUser();
}
I'm only just starting to use Rx. I understand the factory methods, hot vs. cold observables, and other basic stuff, but I cannot imagine how to do this with Rx in the right way.
Thanks.

My idea of a reactive implementation would be:
In your UserService add the following:
var loadSubject = new Subject();

var usersObservable = loadSubject.flatMap(function () {
    return Observable.fromPromise(<your http call that returns a promise>);
}).share();

function loadUsers() {
    loadSubject.next(true);
}
Then each plugin that made changes will simply call:
userService.loadUsers();
Your Core, plugins, and other classes that would like to get updated will simply do:
userService.usersObservable.subscribe(function (usersFreshValue) {
    this.users = usersFreshValue;
});
Note that I used .share() so that all subscribers share a single subscription instead of each one triggering its own request (i.e. to avoid duplication of the data).
Possible improvements:
Instead of the loadUsers method, have each plugin call userService.loadSubject.next(true);
Use backpressure/buffering operators to avoid too frequent calls. For example debounce (CAUTION! when used incorrectly, might lead to starvation) or bufferWithTime + filter (to ensure you don't trigger on empty buffers)
Use behavioral/replay observables (e.g. BehaviorSubject or ReplaySubject) so that new subscribers will quickly get the latest value; a minimal sketch follows below.
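For that last point, here is a minimal sketch of what the service could look like (assuming RxJS 5.x; fetchUsers() is a hypothetical placeholder for your HTTP call that returns a promise):
var Rx = require('rxjs/Rx');

var loadSubject = new Rx.Subject();

// shareReplay(1) keeps the most recent user list and replays it to late
// subscribers, so classes created after the initial load still get the data.
var usersObservable = loadSubject
    .flatMap(function () {
        // fetchUsers() is a placeholder for your HTTP call returning a promise
        return Rx.Observable.fromPromise(fetchUsers());
    })
    .shareReplay(1);

function loadUsers() {
    loadSubject.next(true);
}

// A late subscriber (e.g. a plugin constructed after the first load)
// immediately receives the latest list:
usersObservable.subscribe(function (users) {
    console.log('current users', users);
});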

Related

Is there a fix to highlight subsequent DOM changes using a Chrome Extension that currently only reads source code (+highlights keywords) on page load? [duplicate]

Essentially I want to have a script execute when the contents of a DIV change. Since the scripts are separate (a content script in the Chrome extension and a webpage script), I need a way to simply observe changes in DOM state. I could set up polling, but that seems sloppy.
For a long time, DOM3 mutation events were the best available solution, but they have been deprecated for performance reasons. DOM4 Mutation Observers are the replacement for deprecated DOM3 mutation events. They are currently implemented in modern browsers as MutationObserver (or as the vendor-prefixed WebKitMutationObserver in old versions of Chrome):
var MutationObserver = window.MutationObserver || window.WebKitMutationObserver;

var observer = new MutationObserver(function (mutations, observer) {
    // fired when a mutation occurs
    console.log(mutations, observer);
    // ...
});

// define what element should be observed by the observer
// and what types of mutations trigger the callback
observer.observe(document, {
    subtree: true,
    attributes: true
    //...
});
This example listens for DOM changes on document and its entire subtree, and it will fire on changes to element attributes as well as structural changes. The draft spec has a full list of valid mutation listener properties:
childList: Set to true if mutations to target's children are to be observed.
attributes: Set to true if mutations to target's attributes are to be observed.
characterData: Set to true if mutations to target's data are to be observed.
subtree: Set to true if mutations to not just target, but also target's descendants are to be observed.
attributeOldValue: Set to true if attributes is set to true and target's attribute value before the mutation needs to be recorded.
characterDataOldValue: Set to true if characterData is set to true and target's data before the mutation needs to be recorded.
attributeFilter: Set to a list of attribute local names (without namespace) if not all attribute mutations need to be observed.
(This list is current as of April 2014; you may check the specification for any changes.)
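For instance, a sketch combining several of these options (the element id 'someDiv' is just a placeholder):
// watch only the 'class' and 'style' attributes of one element, recording old
// attribute values, and also watch its direct children being added/removed
var target = document.getElementById('someDiv');
var observer = new MutationObserver(function (mutations) {
    mutations.forEach(function (m) {
        if (m.type === 'attributes') {
            console.log(m.attributeName, 'was', m.oldValue);
        } else if (m.type === 'childList') {
            console.log('added', m.addedNodes.length, 'removed', m.removedNodes.length);
        }
    });
});
observer.observe(target, {
    childList: true,
    attributes: true,
    attributeOldValue: true,
    attributeFilter: ['class', 'style']
});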
Edit
This answer is now deprecated. See the answer by apsillers.
Since this is for a Chrome extension, you might as well use the standard DOM event - DOMSubtreeModified. See the support for this event across browsers. It has been supported in Chrome since 1.0.
$("#someDiv").bind("DOMSubtreeModified", function () {
    alert("tree changed");
});
See a working example here.
Many sites use AJAX/XHR/fetch to add, show, or modify content dynamically, and the window.history API instead of in-site navigation, so the current URL is changed programmatically. Such sites are called SPAs, short for Single Page Applications.
Usual JS methods of detecting page changes
MutationObserver (docs) to literally detect DOM changes.
Info/examples:
How to change the HTML content as it's loading on the page
Performance of MutationObserver to detect nodes in entire DOM.
Lightweight observer to react to a change only if URL also changed:
let lastUrl = location.href;
new MutationObserver(() => {
    const url = location.href;
    if (url !== lastUrl) {
        lastUrl = url;
        onUrlChange();
    }
}).observe(document, {subtree: true, childList: true});

function onUrlChange() {
    console.log('URL changed!', location.href);
}
Event listener for sites that signal content change by sending a DOM event:
pjax:end on document used by many pjax-based sites e.g. GitHub,
see How to run jQuery before and after a pjax load?
message on window used by e.g. Google search in Chrome browser,
see Chrome extension detect Google search refresh
yt-navigate-finish used by YouTube,
see How to detect page navigation on YouTube and modify its appearance seamlessly?
Periodic checking of DOM via setInterval:
Obviously this will work only in cases where you wait for a specific element identified by its id/selector to appear; it won't let you universally detect new dynamically added content unless you invent some kind of fingerprinting of the existing contents.
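A minimal polling sketch for that case (the '.my-widget' selector and the 500 ms interval are arbitrary placeholders):
const POLL_INTERVAL_MS = 500; // how often to check the DOM
const timer = setInterval(() => {
    const el = document.querySelector('.my-widget'); // the element you are waiting for
    if (el) {
        clearInterval(timer); // stop polling once it exists
        console.log('element appeared', el);
    }
}, POLL_INTERVAL_MS);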
Cloaking History API:
let _pushState = History.prototype.pushState;
History.prototype.pushState = function (state, title, url) {
    _pushState.call(this, state, title, url);
    console.log('URL changed', url);
};
Listening to hashchange, popstate events:
window.addEventListener('hashchange', e => {
    console.log('URL hash changed', e);
    doSomething();
});
window.addEventListener('popstate', e => {
    console.log('State changed', e);
    doSomething();
});
P.S. All these methods can be used in a WebExtension's content script, because the case we're looking at is one where the URL was changed via history.pushState or replaceState, so the page itself remained the same, along with its content script environment.
Another approach, depending on how you are changing the div.
If you are using jQuery to change a div's contents with its html() method, you can extend that method and call a registration function each time you put HTML into a div.
(function( $, oldHtmlMethod ){
    // Override the core html method in the jQuery object.
    $.fn.html = function(){
        // Execute the original HTML method using the
        // augmented arguments collection.
        var results = oldHtmlMethod.apply( this, arguments );
        com.invisibility.elements.findAndRegisterElements( this );
        return results;
    };
})( jQuery, jQuery.fn.html );
We just intercept the calls to html(), call a registration function with this, which in that context refers to the target element getting new content, and then pass the call on to the original jQuery .html() function. Remember to return the results of the original html() method, because jQuery expects it for method chaining.
For more info on method overriding and extension, check out http://www.bennadel.com/blog/2009-Using-Self-Executing-Function-Arguments-To-Override-Core-jQuery-Methods.htm, which is where I cribbed the closure function. Also check out the plugins tutorial on jQuery's site.
In addition to the "raw" tools provided by MutationObserver API, there exist "convenience" libraries to work with DOM mutations.
Consider: MutationObserver represents each DOM change in terms of subtrees. So if you're, for instance, waiting for a certain element to be inserted, it may be deep inside the children of mutations[i].addedNodes[j].
Another problem is when your own code, in reaction to mutations, changes the DOM - you often want to filter such changes out.
A good convenience library that solves such problems is mutation-summary (disclaimer: I'm not the author, just a satisfied user), which enables you to specify queries of what you're interested in, and get exactly that.
Basic usage example from the docs:
var observer = new MutationSummary({
    callback: updateWidgets,
    queries: [{
        element: '[data-widget]'
    }]
});

function updateWidgets(summaries) {
    var widgetSummary = summaries[0];
    widgetSummary.added.forEach(buildNewWidget);
    widgetSummary.removed.forEach(cleanupExistingWidget);
}

Add single record to mongo collection with meteor

I am new to JavaScript and the Meteor framework and am trying to understand the basic concepts. First of all, I want to add a single document to a collection without duplicate entries.
this.addRole = function (roleName) {
    console.log(MongoRoles.find({name: roleName}).count());
    if (!MongoRoles.find({name: roleName}).count())
        MongoRoles.insert({name: roleName});
}
This code is called on the server as well as on the client. The log message on the client tells me there are no entries in the collection, even if I refresh the page several times.
On the server, duplicate entries get entered into the collection, and I don't know why. Probably I did not understand a key concept. Could someone point it out to me, please?
Edit-1:
No, autopublish and insecure are not installed anymore. But I already published the MongoRoles collection (server side) and subscribed to it (client side). Furthermore, I created an allow rule for inserts (client side).
Edit-2:
Thanks a lot for showing me the Meteor method way, but I want to understand how to do it without server-side-only methods involved. Let's say for academic purposes. ;-)
Just wrote a small example:
Client:
Posts = new Mongo.Collection("posts");
Posts.insert({title: "title-1"});
console.log(Posts.find().count());
Server:
Posts = new Mongo.Collection("posts");

Meteor.publish(null, function () {
    return Posts.find();
});

Posts.allow({
    insert: function () { return true; }
});
If I check the server database via 'meteor mongo', it tells me that every insert from my client code is saved there.
The log on the client tells me a count of 1 every time I refresh the page. But I expected both to be the same. What am I doing wrong?
Edit-3:
I am back on my original role example (sorry for that). I just thought I got the point, but I am still clueless. If I check the variable 'roleCount', 0 is returned all the time. How can I load the correct value into my variable? What is the best way to check whether a document exists before inserting into a collection? I guess .find() is asynchronous as well? If so, how can I do it synchronously? If I got it right, I have to wait for the value (synchronously) because I really rely on it.
Shared environment (client and server):
Roles = new Mongo.Collection("jaqua_roles");

Roles.allow({
    insert: function () { return true; }
});

var Role = function () {
    this.addRole = function (roleName) {
        var roleCount = Roles.find({name: roleName}).count();
        console.log(roleCount);
        if (roleCount === 0) {
            Roles.insert({name: roleName}, function (error, result) {
                try {
                    console.log("Success: " + result);
                    var roleCount = Roles.find({name: roleName}).count();
                    console.log(roleCount);
                } catch (error) {
                }
            });
        }
    };

    this.deleteRole = function () {
    };
}

role = new Role();
role.addRole('test-role');
Server only:
Meteor.publish(null, function () {
    return Roles.find();
});
Meteor's insert/update/remove methods (client-side) are not a great idea to use. Too many potential security pitfalls, and it takes a lot of thought and time to really patch up any holes. Further reading here.
I'm also wondering where you're calling addRole from. Assuming it's being triggered from client-side only, I would do this:
Client-side Code:
this.addRole = function (roleName) {
    var roleCount = MongoRoles.find({name: roleName}).count();
    console.log(roleCount);
    if (roleCount === 0) {
        Meteor.call('insertRole', roleName, function (error, result) {
            if (error) {
                // check error.error and error.reason (if I'm remembering right)
            } else {
                // Success!
            }
        });
    }
}
How I've modified this code and why:
I made a roleCount variable so that you can avoid calling MongoRoles.find() twice like that, which is inefficient and consumes unneeded resources (CPU, disk I/O, etc). Store it once, then reference the variable instead, much better.
When checking numbers, try to avoid doing things like if (!count). Using if (count === 0) is clearer, and shows that you're referencing a number. Statements like if (!xyz) would make one think this is a boolean (true/false) value.
Always use === in JavaScript, unless you want to intentionally do a loose equality operation. Read more on this.
Always use opening/closing curly braces for if and other blocks, even if the block contains just a single line of code. This is just good practice: if you decide to add another line later, you don't have to remember to wrap it in braces then.
Changed your database insert into a Meteor method (see below).
Side note: I've used JavaScript (ES5), but since you're new to JavaScript, I think you should jump right into ES6. ES is short for ECMAScript (which is what JS is based on). ES6 (or ECMAScript 2015) is the most recent stable version which includes all kinds of new awesomeness that JavaScript didn't previously have.
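Purely for illustration, the client-side call above might look roughly like this in ES6 (arrow functions and const; it still assumes the insertRole method defined below):
const addRole = (roleName) => {
    const roleCount = MongoRoles.find({name: roleName}).count();
    if (roleCount === 0) {
        Meteor.call('insertRole', roleName, (error, result) => {
            if (error) {
                console.error(error.error, error.reason);
            } else {
                // Success!
            }
        });
    }
};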
Server-side Code:
Meteor.methods({
    insertRole: function (roleName) {
        check(roleName, String);
        try {
            // Any security checks, such as logged-in user, validating roleName, etc
            MongoRoles.insert({name: roleName});
        } catch (error) {
            // error handling. just throw an error from here and handle it on the client
            if (badThing) { // badThing is a placeholder for whatever condition you check
                throw new Meteor.Error('bad-thing', 'A bad thing happened.');
            }
        }
    }
});
Hope this helps. This is all off the top of my head with no testing at all. But it should give you a better idea of an improved structure when it comes to database operations.
Addressing your edits
Your code looks good, except for a couple of issues:
You're defining Posts twice; don't do that. Make a file, for example /lib/collections/posts.js, and put the declaration and instantiation of the Mongo.Collection in there. Then it will be executed on both client and server (see the sketch at the end of this answer).
Your console.log would probably log an error, or zero, because Posts.insert is asynchronous on the client side. Try the below instead:
Posts.insert({title: "title-1"}, function (error, result) {
    console.log(Posts.find().count());
});
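And for the first point, a minimal sketch of that shared file (the path /lib/collections/posts.js is just the example mentioned above; any file loaded on both client and server works):
// /lib/collections/posts.js - loaded on both client and server,
// so the collection is declared exactly once
Posts = new Mongo.Collection("posts");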

Set execution order of event-handlers of click event in jQuery?

I am using jQuery 1.9.1.
Suppose I have a button with id="clickMe".
My jQuery code is:
$('#clickMe').click(function (event) {
    eventHandler1(); // do something
    eventHandler2(); // use output from eventHandler1() and do something
});
Now, I want eventHandler2 to be executed last so that I can use the output of eventHandler1. Is there any way to do this manually, and not just the way I have put the handlers inside the click event?
One more thing: eventHandler1() and eventHandler2() are present in different .js files, hence the requirement.
jQuery.when() provides a way to execute callback functions based on one or more objects, usually Deferred objects that represent asynchronous events.
For example, when the Deferreds are jQuery.ajax() requests, the arguments will be the jqXHR objects for the requests, in the order they were given in the argument list.
$.when(eventHandler1()).then(eventHandler2).done(function () {
    alert('done.');
});
You can even use a global variable to store the output of eventHandler1 and access it inside eventHandler2.
Example
var someVar;

function eventHandler1() {
    // process
    someVar = some value from process
    return someVar;
}

function eventHandler2() {
    alert(someVar);
}
Response to OP comment
As you have asked about executing the handlers in a queue, you can use Jai's answer.
You can use .when, .then, and .done as below:
$.when(eventHandler1()).then(eventHandler2).done(function () {
    // process code
});
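Note that .then only actually waits for asynchronous work if eventHandler1 returns a Deferred/promise. A minimal sketch of that pattern (the '/api/data' URL and the handler bodies are placeholders):
function eventHandler1() {
    // return the jqXHR (a promise) so that $.when can wait for it;
    // '/api/data' is a placeholder URL
    return $.ajax('/api/data');
}

function eventHandler2(result) {
    // receives the resolved value of eventHandler1's promise
    console.log('got', result);
}

$('#clickMe').click(function () {
    $.when(eventHandler1()).then(eventHandler2).done(function () {
        console.log('done.');
    });
});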

Navigation Property Filter

My question is this: How can you implement a default server-side "filter" for a navigation property?
In our application we seldom actually delete anything from the database. Instead, we implement "soft deletes" where each table has a Deleted bit column. If this column is true the record has been "deleted". If it is false, it has not.
This allows us to easily "undelete" records accidentally deleted by the client.
Our current ASP.NET Web API returns only "undeleted" records by default, unless a deleted argument is sent as true from the client. The idea is that the consumer of the service doesn't have to worry about specifying that they only want undeleted items.
Implementing this same functionality in Breeze is quite simple, at least for base entities. For example, here is the implementation of the classic Todos example, adding a "Deleted" bit field:
// Note: Will show only undeleted items by default unless you explicitly pass deleted = true.
[HttpGet]
public IQueryable<BreezeSampleTodoItem> Todos(bool deleted = false) {
    return _contextProvider.Context.Todos.Where(td => td.Deleted == deleted);
}
On the client, all we need to do is...
var query = breeze.EntityQuery.from("Todos");
...to get all undeleted Todos, or...
var query = breeze.EntityQuery.from("Todos").withParameters({deleted: true})
...to get all deleted Todos.
But let's say that a BreezeSampleTodoItem has a child collection for the tools that are needed to complete that Todo. We'll call this "Tools". Tools also implements soft deletes. When we perform a query that uses expand to get a Todo with its Tools, it will return all Tools - "deleted" or not.
But how can I filter out these records by default when Todo.Tools is expanded?
It has occurred to me to have separate Web API methods for each item that may need expanded, for example:
[HttpGet]
public IQueryable<Todo> TodoAndTools(bool deletedTodos = false, bool deletedTools = false)
{
return // ...Code to get filtered Todos with filtered Tools
}
I found some example code of how to do this in another SO post, but it requires hand-coding each property of Todo. The code from the above-mentioned post also returns a List, not an IQueryable. Furthermore this requires methods to be added for every possible expansion which isn't cool.
Essentially what I'm looking for is some way to define a piece of code that gets called whenever Todos is queried, and another for whenever Tools is queried - preferably being able to pass an argument that defines if it should return Deleted items. This could be anywhere on the server-side stack - be it in the Web API method, itself, or maybe part of Entity Framework (note that filtering Include extensions is not supported in EF.)
Breeze cannot do exactly what you are asking for right now, although we have discussed the idea of allowing the filtering of "expands", but we really need more feedback as to whether the community would find this useful. Please add this to the breeze User Voice and vote for it. We take these suggestions very seriously.
Moreover, as you point out, EF does not support this.
But... what you can do is use a projection instead of an expand to do something very similar:
public IQueryable<Object> TodoAndTools(bool deleted = false,
                                       bool deletedTools = false) {
    var baseQuery = _contextProvider.Context.Todos.Where(td => td.Deleted == deleted);
    return baseQuery.Select(t => new {
        Todo = t,
        Tools = t.Tools.Where(tool => tool.Deleted == deletedTools)
    });
}
Several things to note here:
1) We are returning an IQueryable of Object instead of an IQueryable of Todo.
2) Breeze will inspect the returned payload and automatically create breeze entities for any 'entityTypes' returned (even within a projection). So the result of this query will be an array of JavaScript objects, each with two properties: 'Todo' and 'Tools', where Tools is an array of 'Tool' entities. The nice thing is that both the Todo and Tool entities returned within the projection will be 'full' breeze entities.
3) You can still pass client side filters based on the projected property names. i.e.
var query = EntityQuery.from("TodoAndTools")
.where("Todo.Description", "startsWith", "A")
.using(em);
4) EF does support this kind of projection.

GXT (Ext-GWT) + Pagination + HTTP GET

I'm trying to populate a GXT Grid using data retrieved from an online API (for instance, going to www.example.com/documents returns a JSON array of documents). In addition, I need to paginate the result.
I've read all the various blogs and tutorials, but most of them populate the pagination proxy using something like TestData.GetDocuments(). However, I want to get that info using HTTP GET.
I've managed to populate a grid, but without pagination, using a RequestBuilder + proxy + reader + loader. But it seems as though the actual loading of the data is "put off" until some hidden stage deep inside the GXT code. Pagination requires that data from the start, so I'm not sure what to do.
Can someone provide a simple code example which does what I need?
Thank you.
I managed to get this going; here is what I did:
First I defined the proxy and loader for my data, along with the paging toolbar:
private PagingModelMemoryProxy proxy;
private PagingLoader<PagingLoadResult<ModelData>> loader;
private PagingToolBar toolBar;
Next is the creation of each one, initializing with an empty ArrayList.
proxy = new PagingModelMemoryProxy(new ArrayList<EquipmentModel>());
loader = new BasePagingLoader<PagingLoadResult<ModelData>>(proxy);
loader.setRemoteSort(true);
toolBar = new PagingToolBar(100);
toolBar.bind(loader);
loader.load(0, 100);
Last, I have a set method in my view that gets called when the AJAX call is complete, but you could trigger it anywhere. Here is my entire set method; Equipment and EquipmentModel are my database and view models, respectively.
public void setEquipmentData(List<Equipment> data) {
    Collections.sort(data);

    // build a list of models to be loaded
    List<EquipmentModel> models = new ArrayList<EquipmentModel>();
    for (Equipment equipment : data) {
        EquipmentModel model = new EquipmentModel(equipment);
        models.add(model);
    }

    // load the list of models into the proxy and reconfigure the grid to
    // refresh display.
    proxy.setData(models);
    ListStore<EquipmentModel> equipmentStore = new ListStore<EquipmentModel>(loader);
    equipmentGrid.reconfigure(equipmentStore, equipmentColumnModel);
    loader.load(0, 100);
}
The key here for me was re-creating the store with the same loader, the column model was pre-created and gets reused.