How to fetch data from FIWARE Cygnus?

I have set up Cygnus with the MySQL agent, and data is now being stored in the MySQL server.
As a next step, we need to show reports of the historical data (saved in Cygnus_MySQL) in a GUI. The reports may be based on filters such as data for today, for the last week/month, per area, etc.
My questions are:
Do we need to create our own APIs for fetching data from MySQL and returning it to the GUI application?
Or is there any GUI component (WireCloud or any other) that can fetch the Cygnus data directly? Or do we need to create a custom GUI ourselves?
I found the component STH-Comet (https://github.com/telefonicaid/fiware-sth-comet); will it fit our requirement of fetching the Cygnus_MySQL data?
Do let me know if FIWARE has any other components matching our requirements.
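For illustration, if no ready-made component fits, the first option (a small custom API over the Cygnus tables) need not be much code. A minimal sketch in Node.js, assuming Cygnus's row-mode MySQL schema (one row per attribute update, with recvTime, entityId, attrName and attrValue columns); the table name, credentials and endpoint are hypothetical:

    const express = require('express');
    const mysql = require('mysql2/promise');

    const app = express();
    const pool = mysql.createPool({
        host: 'localhost', user: 'cygnus', password: 'secret', database: 'cygnus_db'
    });

    // GET /history?attr=temperature&from=2016-01-01&to=2016-01-31
    app.get('/history', async (req, res) => {
        const { attr, from, to } = req.query;
        // One row per attribute update, as persisted by the Cygnus MySQL sink;
        // the table name depends on your Cygnus naming configuration.
        const [rows] = await pool.query(
            'SELECT recvTime, entityId, attrValue FROM room1_room ' +
            'WHERE attrName = ? AND recvTime BETWEEN ? AND ? ORDER BY recvTime',
            [attr, from, to]
        );
        res.json(rows);
    });

    app.listen(3000);

The GUI (custom or WireCloud-based) would then call such endpoints with the chosen filters.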

Related

Talend - Insert data into DB using Django REST API

I am trying to utilise Django REST APIs to insert data into the database, instead of writing to it directly. I've been able to read JSON data using the tRESTClient component, but I am not too sure about the insertion/POST. Could someone point me to the components (and relations) that I should use?
The current job that I have is mostly:
Read data from raw file -> tMap -> DB
and I wish to do something like:
Read data from raw file -> tMap -> (pass on data to REST endpoint via POST)
I used the tRESTClient component after my tMap and could see the records getting inserted into the DB, but all of them are without any data. Strangely, nowhere was I asked to specify the JSON tree. The number of records inserted equals the number of rows read from the raw file, so at least something is right. But I couldn't locate the menu/options to specify which data element read from the raw file should map to which JSON element.
How do I specify the data-to-JSON mapping?
PS: I realise that this might not be the most efficient way to ingest data, but that's what the business wants, since it brings in an additional layer of control.
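For context, the REST client just sends whatever string it is given as the request body, so the row-to-JSON mapping generally has to be built explicitly inside the job (for example with a component such as tWriteJSONField placed between tMap and the REST call) rather than inferred from the schema. A hypothetical request, one JSON document per row (endpoint and field names are illustrative):

    POST /api/records/
    Content-Type: application/json

    {"id": 42, "name": "widget", "qty": 7}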

Storing custom temporary data in Sitecore xDB

I am using Sitecore 8.1 with xDB enabled (MongoDB). I would like to store the user-roles of the visiting users in the xDB, so I can aggregate on these data in my reports. These roles can change over time, so one user could have one set of roles at some point in time and another set of roles at a later time.
I could go and store these user roles as custom facets on the Contact entity, but as they may change for a user from visit to visit, I will lose historical data if I update the facet every time the user logs in (e.g. I will not be able to tell which roles a given user had at some given visit).
Instead, I could create a custom IElement for my facet data, and store the roles along with a timestamp saying when the given roles were registered, but this model may be hard to handle during the reporting phase, where I would need to connect the interaction data with the role-data based on timestamps every time I generate a report.
Is it possible to store these custom data in the xDB in something else than the Contact collection? Can I store custom data in the Interactions collection? There is a property called Tracker.Current.Session.Interaction.CustomValues which sounds like what I need, but if I store data here, will I be able to perform proper aggregation/reporting on the data? Any other approaches I haven't thought about?
CustomValues
Yes, the CustomValues dictionary is what I would use in your case. This dictionary will get serialized to MongoDB as a nested document of every interaction (unless the dictionary is empty).
Also note that, since CustomValues is a member of the base class Sitecore.Analytics.Model.Entity, this dictionary is available in many other data classes of xDB. For example, you can store custom values in PageData and PageEventData objects.
Since CustomValues takes an object of any class, your custom data class needs some extra things for it to be successfully saved to and subsequently loaded from MongoDB:
It has to be marked as [Serializable].
It needs to be registered in the MongoDB driver like this:
using Sitecore.Analytics.Data.DataAccess.MongoDb;
// [...]
// Register the custom class so the driver can map it to and from MongoDB:
MongoDbObjectMapper.Instance.RegisterModelExtension<YourCustomClassName>();
This needs to be done only once per application lifetime - for example, in an initialize pipeline processor.
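Putting both requirements together, a minimal sketch (the class, its members and the dictionary key are illustrative, not a Sitecore convention):

    using System;
    using Sitecore.Analytics;

    // Must be serializable so it can be persisted with the interaction.
    [Serializable]
    public class VisitRoles
    {
        public DateTime RegisteredAt { get; set; }
        public string[] Roles { get; set; }
    }

    // After login, attach the current roles to the current interaction:
    Tracker.Current.Session.Interaction.CustomValues["VisitRoles"] =
        new VisitRoles
        {
            RegisteredAt = DateTime.UtcNow,
            Roles = new[] { "Editor", "Reviewer" }
        };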
Your own storage
Of course, you don't have to use Sitecore's API to store your custom data. The alternative would be to manually save the data to a custom MongoDB collection or an SQL table. You can then read that data in your aggregation processor, finding it by the ID of the currently processed interaction.
The benefit of this approach is that you can decide where and how your data is stored. The downside is extra work of implementing and maintaining this data storage.
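A sketch of the manual route with the MongoDB C# driver (2.x API; the connection string, database and collection names are hypothetical, and interactionId stands for the ID of the current interaction):

    using System;
    using MongoDB.Bson;
    using MongoDB.Driver;

    var client = new MongoClient("mongodb://localhost:27017");
    var collection = client.GetDatabase("custom_analytics")
                           .GetCollection<BsonDocument>("VisitRoles");

    // Key the document by the interaction ID so the aggregation
    // processor can look it up later.
    collection.InsertOne(new BsonDocument
    {
        { "InteractionId", interactionId.ToString() },
        { "Roles", new BsonArray(new[] { "Editor", "Reviewer" }) },
        { "RegisteredAt", DateTime.UtcNow }
    });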

Using visjs manipulation to create workflow dependencies

We are currently using visjs version 3 to map the dependencies of our custom built workflow engine. This has been WONDERFUL because it helps us to visualize the flow and find invalid or missing dependencies. What we want to do next is simplify the process of building the dependencies using the visjs manipulation feature. The idea would be that we would display a large group of nodes and allow the user to order them correctly. We then want to be able to submit that json structure back to the server for processing.
Would this be possible?
Yes, this is possible.
Vis.js dispatches various events relating to user interactions with the graph (e.g. manipulations or position changes), for which you can add handlers that modify or store the data on change. If you use DataSets to store the nodes and edges of your network, you can always use the DataSets' get() function in your handler to retrieve all elements in JSON format. Then, in your handler, just use an ajax request to transmit the JSON to your server, storing the entire graph in your DB or saving the JSON as a file.
The opposite applies for loading the graph: simply query the JSON from your server and inject it into the node and edge DataSets using the set method.
You can also store the network's current options using the network's getOptions method, which returns all applied options as JSON.
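A minimal sketch of the save/load round trip, assuming the graph lives in two DataSets and the server exposes a hypothetical /api/workflow endpoint (the load side uses clear()/add() to replace the DataSet contents):

    var nodes = new vis.DataSet([]);
    var edges = new vis.DataSet([]);

    function saveGraph() {
        // get() returns the current elements as plain objects.
        var payload = JSON.stringify({ nodes: nodes.get(), edges: edges.get() });
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/api/workflow');
        xhr.setRequestHeader('Content-Type', 'application/json');
        xhr.send(payload);
    }

    function loadGraph(graph) {
        // Replace the current contents with what the server returned.
        nodes.clear();
        edges.clear();
        nodes.add(graph.nodes);
        edges.add(graph.edges);
    }

Wiring saveGraph into the manipulation and dragEnd event handlers keeps the server copy in sync as the user reorders the nodes.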

Make Orion fetch data from Cosmos and publish

I have set up a subscription between Orion ContextBroker and Cosmos BigData using Cygnus, and data is properly persisted in Cosmos when an update is made to Orion.
But I want to analyze the data in Cosmos and return the results to Orion, and finally access the result data in Orion from "outside".
How would one do this? Of course, I would like the solution I build to be as "automated" as possible, but mostly I just want to solve this problem.
Any advice is much appreciated!
As a general response (as the question is also very general ;), what you need is a process that accesses the information stored in Cosmos (either using the HDFS APIs -such as WebHDFS or HttpFs-, Hive queries, general MapReduce jobs on top of Hadoop, etc.) and then implements the client side of the NGSI API that Orion exposes, in order to inject context elements into Orion based on the information retrieved from Cosmos. The key operation for this in the Orion API is updateContext.
The automation degree would depend on how you implement that process. It can be as automated as you want.
EDIT: considering the comments on this answer, I will try to add more detail.
What I mean is to develop a piece of software (let's call it APOS, A Piece Of Software) implementing the following behaviour:
APOS will grab data from Cosmos using any of the interfaces provided by Cosmos, i.e. WebHDFS/HttpFs, Hive, MapReduce jobs, etc.
APOS will process the data to produce some result
APOS will inject that result into Orion, using the Orion REST API described in the Orion user manual; the updateContext operation is particularly useful for that task. From a client-server point of view, Orion is a server exposing a REST API and APOS is the client interacting with that server.
It is completely up to you how to implement this APOS and how to orchestrate the flow from 1 to 3 (e.g. it can run in batch mode every midnight, be triggered by user interaction on a web portal, etc.).
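For step 3, a sketch of what the updateContext request could look like (NGSIv1; the host, entity and attribute names are illustrative):

    POST http://orion-host:1026/v1/updateContext
    Content-Type: application/json

    {
        "contextElements": [
            {
                "type": "AnalysisResult",
                "isPattern": "false",
                "id": "Result1",
                "attributes": [
                    { "name": "averageTemperature", "type": "float", "value": "22.5" }
                ]
            }
        ],
        "updateAction": "APPEND"
    }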
At the present moment, FI-WARE doesn't provide any generic enabler to convert Cosmos data to NGSI, given that each particular realization of steps 1 to 3 above is different and depends on the use case. However, note that there is a software component named Cygnus which implements the opposite path: from NGSI to Cosmos.

XPages Dojo Data Grid and Custom REST Service

Can a custom REST service be used as a data source for a dojo data grid? I need to combine data from three different databases into one data grid. The column data will need to be sortable. The response from the REST service looks to be correct, but I am having trouble binding the JSON data to the dojo grid columns.
Very interesting -- I tested and saw the same thing with a custom REST service -- it doesn't work when referenced as the storeComponentId of the grid.
I got it to work with the following steps:
Include two dojo modules in the page resources to set up the data store.
Add a pass-thru script tag with code that sets up a JSON data store for the grid (it uses the dojo modules that the resources specify).
Set the grid's store property to the variable set up for the data store in the script tag (storeComponentId needs an XPages component name).
Here are some snippets that show the changes:
<xp:this.resources>
    <xp:dojoModule name="dojo.store.JsonRest"></xp:dojoModule>
    <xp:dojoModule name="dojo.data.ObjectStore"></xp:dojoModule>
</xp:this.resources>
...
<xe:restService id="restService1" pathInfo="gridData">
...
<script>
    var jsonStore = new dojo.store.JsonRest(
        {target:"CURRENT_PAGE_NAME_HERE.xsp/gridData"}
    );
    var dataStore = new dojo.data.ObjectStore({objectStore: jsonStore});
</script>
...
<xe:djxDataGrid id="djxDataGrid1" store="dataStore">
There's more information and a full sample here:
http://xcellerant.net/dojo-data-grid-33-reading-custom-rest-service/
The easiest way is to start with the extension library; there's a sample for a custom JSON REST service. While it pulls data from one source, it is easy to extend to pull data from more than one. I strongly suggest you watch out for overall performance.
What I would do:
create a bean that spits out the JSON to the grid
test it with one database
learn about threads in XPages
use one thread for each database; this cuts down your load time
use a ConcurrentSkipListMap with a comparator so you have the initial JSON in the sort order most useful to the user (or the one from the preferences or the last run), as sketched below
Nota bene: the Java Collections Framework is your friend (a sometimes difficult one).
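A minimal sketch of that map idea (the key and payload choices are hypothetical; each database thread would call put() as its rows arrive):

    import java.util.concurrent.ConcurrentSkipListMap;

    // Sorted, thread-safe map: the comparator fixes the order the grid
    // receives, and threads from the three databases can insert concurrently.
    ConcurrentSkipListMap<String, String> rows =
        new ConcurrentSkipListMap<String, String>(String.CASE_INSENSITIVE_ORDER);

    rows.put("Alice", "{\"name\":\"Alice\",\"source\":\"db1\"}");
    rows.put("bob", "{\"name\":\"bob\",\"source\":\"db2\"}");

    // Emit the values in sorted order as one JSON array for the grid.
    StringBuilder json = new StringBuilder("[");
    for (String row : rows.values()) {
        if (json.length() > 1) json.append(',');
        json.append(row);
    }
    json.append("]");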
Let us know how it goes!