Example of using a service singleton as a data source in Angular 2

I need a good example of using a service singleton as data source for my Angular 2 application.
The scenario is as follows:
I have an application that is loading prices of some items from the local database (in my case MongoDB).
A few of the components need to use a service which will be the universal source of truth for item prices throughout the application. These prices can be acted upon externally: the user can change the currency, so they have to be recalculated, or change the date range for which price averages are calculated.
So I need a singleton service which loads upon app initialization, and components need to load prices only after the service's data store has been initialized with prices. Components also need to refresh data (I guess using the Observable pattern) when, say, the currency or date range changes. Perhaps the best way is to inject the service in the app component, so it gets initialized first?
Is there a recipe or proposed architecture for this kind of app?
I can't just call some init function from each component's ngOnInit(), because I want the data available in multiple components. Each component needs to know when to initialize itself with data from the service's data store, i.e. when the data is ready.
The way I did it in Angular 1.x was to instantiate the service, initialize the data in its constructor, and then, once the data was initialized, emit a $rootScope event to tell all components that the data was ready.
I can't find a proper recipe to do the same thing in Angular 2.

You need to create a service and define it when bootstrapping your application:
bootstrap(App, [ SingletonService ]);
This way you will have a single instance for the whole application.
If you want to initialize things, you can use its constructor. To notify other elements that use the service, you can use one or several EventEmitter properties. This way you will be able to emit events when the data is there or when something changes. Components can subscribe to these EventEmitters to be notified.
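As a minimal sketch (the ItemPrice shape and the /api/prices endpoint are made-up placeholders for your own backend in front of MongoDB), the service could look like this:

import { Injectable, EventEmitter } from '@angular/core';

// Hypothetical price record; adjust to your schema.
export interface ItemPrice { itemId: string; price: number; currency: string; }

@Injectable()
export class PriceService {
  prices: ItemPrice[] = [];
  pricesChanged = new EventEmitter<ItemPrice[]>();

  constructor() {
    this.load('USD'); // initialize as soon as the service is created at bootstrap
  }

  setCurrency(currency: string): void {
    this.load(currency); // re-fetch and re-emit when the currency changes
  }

  private load(currency: string): void {
    // Stand-in for your real HTTP call to the backend that reads MongoDB.
    fetch('/api/prices?currency=' + currency)
      .then(res => res.json())
      .then((data: ItemPrice[]) => {
        this.prices = data;
        this.pricesChanged.emit(this.prices); // tell every subscriber the data is ready
      });
  }
}

A component then subscribes instead of pulling: in ngOnInit, call something like priceService.pricesChanged.subscribe(prices => this.prices = prices), and it will receive the data whenever it becomes ready or changes, no matter which component triggered the reload.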


Fiware-Orion: Geolocation

Hi, I'm a student and I'm working with the broker for the first time. I understand how entities are created and how they are updated through "update" queries. My question is: can you create an entity whose attributes (e.g. a geolocation) are defined with a "null" or "zero" value and then initialize them later with the values I am interested in, so as to have dynamic rather than static attributes (i.e. ones that keep being updated)?
Or do we need interaction with the CEP to do this?
From what I have read in the fiware-orion guide, when I create an entity (e.g. a car with a speed attribute and a position attribute of type geo:point), the values of these two attributes must be set statically (e.g. speed 100 and position coordinates 40.257, 2.187). If I understand correctly, I can only update these values by making an update query. So my question is:
Is it possible to update the attributes holding the position or the speed of the car dynamically, i.e. without having to type the values in by hand? Or does this require the use of Orion's CEP?
In case I haven't explained myself well: more generally, I would like to know whether it is possible to follow the progress of a moving car without me having to enter the values from the keyboard.
Thanks.
Orion Context Broker exposes a REST-based API that (among other things) allows you to create, update and query entities. From Orion's point of view, it doesn't matter who invokes the API: it can be done manually (for instance, using Postman or curl) or by an automated system developed by you or a third party (for instance, software running on a sensor in the car that measures the speed and periodically sends an update over a wireless communication network).
From a client-server point of view (in case you are familiar with these concepts), Orion plays the role of API server, and whoever updates the speed (whether manually or automatically) plays the role of API client.
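For instance, here is a minimal sketch of such an automated client, updating a car entity through Orion's NGSI v2 API every few seconds (the entity id Car1, the host and the sensor stubs are made up for the example):

// Stand-ins for real sensor reads on the car.
const readSpeed = (): number => 100 + Math.random() * 20;
const readPosition = () => ({ lat: 40.257, lon: 2.187 });

// PATCH /v2/entities/{id}/attrs updates existing attributes of an entity.
async function reportCarState(): Promise<void> {
  const { lat, lon } = readPosition();
  await fetch('http://orion.example.com:1026/v2/entities/Car1/attrs', {
    method: 'PATCH',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      speed: { value: readSpeed(), type: 'Number' },
      position: { value: lat + ', ' + lon, type: 'geo:point' },
    }),
  });
}

// Send an update every 5 seconds, with no keyboard involved.
setInterval(reportCarState, 5000);

This is exactly the "automated system" case: Orion just sees a client of its API, whether the values come from a person or from code.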

Storing custom temporary data in Sitecore xDB

I am using Sitecore 8.1 with xDB enabled (MongoDB). I would like to store the user-roles of the visiting users in the xDB, so I can aggregate on these data in my reports. These roles can change over time, so one user could have one set of roles at some point in time and another set of roles at a later time.
I could store these user-roles as custom facets on the Contact entity, but as they may change for a user from visit to visit, I would lose historical data if I updated the facet every time the user logs in (e.g. I would not be able to tell which roles a given user had at some given visit).
Instead, I could create a custom IElement for my facet data, and store the roles along with a timestamp saying when the given roles were registered, but this model may be hard to handle during the reporting phase, where I would need to connect the interaction data with the role-data based on timestamps every time I generate a report.
Is it possible to store these custom data in the xDB in something else than the Contact collection? Can I store custom data in the Interactions collection? There is a property called Tracker.Current.Session.Interaction.CustomValues which sounds like what I need, but if I store data here, will I be able to perform proper aggregation/reporting on the data? Any other approaches I haven't thought about?
CustomValues
Yes, the CustomValues dictionary is what I would use in your case. This dictionary will get serialized to MongoDB as a nested document of every interaction (unless the dictionary is empty).
Also note that, since CustomValues is a member of the base class Sitecore.Analytics.Model.Entity, this dictionary is available in many other data classes of xDB. For example, you can store custom values in PageData and PageEventData objects.
Since CustomValues takes an object of any class, your custom data class needs some extra things for it to be successfully saved to and subsequently loaded from MongoDB:
It has to be marked as [Serializable].
It needs to be registered in the MongoDB driver like this:
using Sitecore.Analytics.Data.DataAccess.MongoDb;
// [...]
MongoDbObjectMapper.Instance.RegisterModelExtension<YourCustomClassName>();
This needs to be done only once per application lifetime - for example, in an initialize pipeline processor.
Your own storage
Of course, you don't have to use Sitecore's API to store your custom data. The alternative would be to manually save data to a custom MongoDB collection or an SQL table. You can then read that data in your aggregation processor, finding it by the ID of the currently processed interaction.
The benefit of this approach is that you can decide where and how your data is stored. The downside is extra work of implementing and maintaining this data storage.
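As a sketch of the idea only (the collection and database names are invented, and I'm using the Node.js MongoDB driver purely for illustration; in a Sitecore solution this would typically be C# against your own storage):

import { MongoClient } from 'mongodb';

// Hypothetical custom collection holding role snapshots keyed by xDB interaction ID.
const client = new MongoClient('mongodb://localhost:27017');
const roles = client.db('customAnalytics').collection('interactionRoles');

// At visit time: snapshot the user's roles for this interaction.
async function saveRoles(interactionId: string, userRoles: string[]): Promise<void> {
  await client.connect(); // idempotent; connects on first call
  await roles.insertOne({ interactionId, roles: userRoles, recordedAt: new Date() });
}

// At aggregation time: find the snapshot by the currently processed interaction's ID.
async function loadRoles(interactionId: string): Promise<string[] | undefined> {
  await client.connect();
  const doc = await roles.findOne({ interactionId });
  return doc?.roles;
}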

Dependency between data stores

TL;DR
What's the best way to handle dependencies between types of data that are loaded asynchronously from different backend endpoints?
Problem
My app fetches data from a backend; for each entity type I have an endpoint that fetches all instances.
For example api.myserver.com/v1/users for User model and api.myserver.com/v1/things for Thing model.
This data is parsed and placed into data store objects (e.g. UserDataStore and ThingDataStore) that serve these models to the rest of the app.
Question
What should I do if the data that comes from /things depends on data that comes from /users and the fetch operations are async? In my case, /things returns the id of the user that created them. This means that if /things returns before /users, I won't have enough data to create the Thing model.
Options
Have /things also return the relevant /users data, nested.
This is bad because:
I'll then have multiple User model instances for the same actual user: one that came from /users and one that came nested in /things.
Increases the total payload size transferred.
In a system with a permission policy, the user data returned by /users can differ from the user data nested in /things, which would allow partially populated models into the app.
Create an operational dependency between the two data stores, so that ThingsDataStore will have to wait for UserDataStore to be populated before it attempts to load its own data.
This is also bad because:
Design-wise this dependency is not welcome.
Operations-wise, it will very quickly become complicated once you throw in more data stores (dependency cycles, etc.).
What is the best solution for my problem and in general?
This is obviously not platform / language dependent.
I see two possible solutions:
Late initialization of UserDataStore in ThingDataStore. You will have to allow for the creation of an object that is not fully valid, and you will also need to add a method that tells you whether the UserDataStore is initialized or not. Not perfect, because for some time there will exist an invalid instance.
Create some kind of proxy, or maybe a builder object, for ThingDataStore that will hold all the information about a particular thing and will create the ThingDataStore object as soon as the UserDataStore data related to this instance has been received; see the sketch below.
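A minimal sketch of that second idea (all names are illustrative; the point is that the thing store buffers raw payloads until the referenced user exists, so neither fetch order breaks it):

interface User { id: string; name: string; }
interface RawThing { id: string; userId: string; } // what /things returns
interface Thing { id: string; creator: User; }     // the fully built model

class UserDataStore {
  private users = new Map<string, User>();
  private waiters = new Map<string, Array<(user: User) => void>>();

  add(user: User): void {
    this.users.set(user.id, user);
    (this.waiters.get(user.id) ?? []).forEach(resolve => resolve(user));
    this.waiters.delete(user.id);
  }

  // Resolves immediately if the user is known, otherwise once it arrives.
  whenAvailable(id: string): Promise<User> {
    const user = this.users.get(id);
    if (user) return Promise.resolve(user);
    return new Promise(resolve => {
      this.waiters.set(id, [...(this.waiters.get(id) ?? []), resolve]);
    });
  }
}

class ThingDataStore {
  things: Thing[] = [];
  constructor(private userStore: UserDataStore) {}

  // Accepts the raw payload whenever /things returns; builds the model lazily.
  async addRaw(raw: RawThing): Promise<void> {
    const creator = await this.userStore.whenAvailable(raw.userId);
    this.things.push({ id: raw.id, creator });
  }
}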
Maybe it will help you. Good luck!

Using visjs manipulation to create workflow dependencies

We are currently using visjs version 3 to map the dependencies of our custom built workflow engine. This has been WONDERFUL because it helps us to visualize the flow and find invalid or missing dependencies. What we want to do next is simplify the process of building the dependencies using the visjs manipulation feature. The idea would be that we would display a large group of nodes and allow the user to order them correctly. We then want to be able to submit that json structure back to the server for processing.
Would this be possible?
Yes, this is possible.
Vis.js dispatches various events that relate to user interactions with the graph (e.g. manipulations or position changes), for which you can add handlers that modify or store the data on change. If you use DataSets to store the nodes and edges of your network, you can always use the DataSets' get() function to retrieve all elements in your handler in JSON format. Then, in your handler, just use an ajax request to transmit the JSON to your server, storing the entire graph in your DB or saving the JSON as a file.
The opposite applies for loading the graph: simply query the JSON from your server and inject it into the node and edge DataSets using the set method.
You can also store the network's current options using the network's getOptions method, which returns all applied options as JSON.
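For example, here is a sketch of the saving side (the /api/workflow endpoint is made up; nodes and edges are the DataSet instances backing your network, typed loosely here):

// Serialize the DataSets backing the network and POST the JSON to the server.
function saveWorkflow(nodes: any, edges: any): Promise<Response> {
  const payload = {
    nodes: nodes.get(), // all nodes as plain objects
    edges: edges.get(), // all edges as plain objects
  };
  return fetch('/api/workflow', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
}

You would call such a function from the manipulation or change event handlers mentioned above, so the server always has the latest ordering.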

XPages Dojo Data Grid and Custom REST Service

Can a custom REST service be used as a data source for a Dojo Data Grid? I need to combine data from three different databases into one data grid, and the column data will need to be sortable. The response from the REST service looks correct, but I am having trouble binding the JSON data to the Dojo grid columns.
Very interesting -- I tested and saw the same thing with a custom REST service -- it doesn't work when referenced as the storeComponentId of the grid.
I got it to work with the following steps:
Include two dojo modules in the page resources to set up the data store
A pass-thru script tag with code to set up a JSON data store for the grid (uses the dojo modules that the resources specify)
The grid’s store property is set to the variable set up for the data source in the tag. (storeComponentId needs an XPages component name)
Here are some snippets that show the changes:
<xp:this.resources>
<xp:dojoModule name="dojo.store.JsonRest"></xp:dojoModule>
<xp:dojoModule name="dojo.data.ObjectStore"></xp:dojoModule>
</xp:this.resources>
...
<xe:restService id="restService1" pathInfo="gridData">
...
<script>
var jsonStore = new dojo.store.JsonRest(
{target:"CURRENT_PAGE_NAME_HERE.xsp/gridData"}
);
var dataStore = new dojo.data.ObjectStore({objectStore: jsonStore});
</script>
...
<xe:djxDataGrid id="djxDataGrid1" store="dataStore">
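<!-- Sketch: bind each grid column to a key in the JSON the REST service returns; "title" and "status" are placeholder field names. -->
<xe:djxDataGridColumn id="col1" field="title" label="Title"></xe:djxDataGridColumn>
<xe:djxDataGridColumn id="col2" field="status" label="Status"></xe:djxDataGridColumn>
</xe:djxDataGrid>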
There's more information and a full sample here:
http://xcellerant.net/dojo-data-grid-33-reading-custom-rest-service/
The easiest way is to start with the Extension Library. There's a sample for a custom JSON-REST service. While it pulls data from one source, it is easy to extend to pull data from more than one. I strongly suggest you watch out for overall performance.
What I would do:
create a bean that spits out the JSON to the grid
test it with one database
learn about threads in XPages
use one thread per database; this cuts down your load time
use a ConcurrentSkipListMap with a comparator so you have the initial JSON in the sort order most useful to the user (or the one from the preferences or the last run)
Nota bene: the Java Collections Framework is your friend (a sometimes difficult one).
Let us know how it goes!