How to call an OrientDB function through the sails-orientdb adapter?

How do I run an OrientDB function through the sails-orientdb adapter?
There is an extension to the Waterline methods with the signature
.runFunction('FunctionName', args...), but I cannot get it to work for my use case.
My OrientDB function returns the friends of a given User vertex by its id:
var db = orient.getGraph();
return db.command("sql", "select expand(unionall(outE('IsFriendsWith').inV(), inE('IsFriendsWith').outV())) from " + id);
The action in my UserController calls the OrientDB function:
findFriends: function (req, res, next) {
  console.log(req.param('id'));
  User.runFunction('findFriends', '#' + req.param('id'))
    .from('OUser')
    .limit(20)
    .one()
    .then(function (result) {
      res.json(result);
    });
}
Am I missing something?
The console.log(req.param('id')) prints 33:288786.
I get the following logs in the OrientDB console:
Cannot serialize record: #-2:0{findFriends:[3]} v0 [ONetworkProtocolBinary]{db=database} Error on unmarshalling record #-2:0 (java.lang.ClassCastException: com.tinkerpop.blueprints.impls.orient.OrientVertex cannot be cast to com.orientechnologies.orient.core.record.ORecord)
java.lang.ClassCastException: com.tinkerpop.blueprints.impls.orient.OrientVertex cannot be cast to com.orientechnologies.orient.core.record.ORecord
at com.orientechnologies.orient.core.serialization.serializer.record.string.ORecordSerializerStringAbstract.fieldTypeToString(ORecordSerializerStringAbstract.java:197)
at com.orientechnologies.orient.core.serialization.serializer.record.string.ORecordSerializerCSVAbstract.embeddedCollectionToStream(ORecordSerializerCSVAbstract.java:843)
at com.orientechnologies.orient.core.serialization.serializer.record.string.ORecordSerializerCSVAbstract.fieldToStream(ORecordSerializerCSVAbstract.java:534)
at com.orientechnologies.orient.core.serialization.serializer.record.string.ORecordSerializerSchemaAware2CSV.toString(ORecordSerializerSchemaAware2CSV.java:506)
at com.orientechnologies.orient.core.serialization.serializer.record.string.ORecordSerializerStringAbstract.toStream(ORecordSerializerStringAbstract.java:688)
at com.orientechnologies.orient.core.serialization.serializer.record.string.ORecordSerializerSchemaAware2CSV.toStream(ORecordSerializerSchemaAware2CSV.java:268)
at com.orientechnologies.orient.core.record.impl.ODocument.toStream(ODocument.java:2102)
at com.orientechnologies.orient.core.record.impl.ODocument.toStream(ODocument.java:714)
at com.orientechnologies.orient.server.network.protocol.binary.OBinaryNetworkProtocolAbstract.getRecordBytes(OBinaryNetworkProtocolAbstract.java:417)
at com.orientechnologies.orient.server.network.protocol.binary.OBinaryNetworkProtocolAbstract.writeRecord(OBinaryNetworkProtocolAbstract.java:432)
at com.orientechnologies.orient.server.network.protocol.binary.OBinaryNetworkProtocolAbstract.writeIdentifiable(OBinaryNetworkProtocolAbstract.java:136)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.command(ONetworkProtocolBinary.java:1226)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.executeRequest(ONetworkProtocolBinary.java:386)
at com.orientechnologies.orient.server.network.protocol.binary.OBinaryNetworkProtocolAbstract.execute(OBinaryNetworkProtocolAbstract.java:217)
at com.orientechnologies.common.thread.OSoftThread.run(OSoftThread.java:69)
The error suggests that the result needs to be cast to a record?
Running the same query in OrientDB Studio with the id #33:288786 gives me this result:
[
  {
    "#type": "d",
    "#rid": "#33:288787",
    "#version": 16,
    "#class": "User",
    "gender": false,
    "mail": "1",
    "no": "1",
    "moto": "1",
    "rel": "1",
    "ori": "1",
    "pass": "1",
    "birthDate": null,
    "online": false,
    "name": "1",
    "in_IsFriendsWith": [
      "#23:4"
    ],
    "#fieldTypes": "in_IsFriendsWith=g,"
  }
]
I am really at a loss about what to do, and I have only just started using Sails and Waterline. Thanks in advance!
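For reference, one way to sidestep runFunction while debugging is to send the same SQL through the adapter's raw query support. This is only a sketch: the availability of Model.query in sails-orientdb and the buildFriendsQuery helper below are assumptions, not verified parts of the adapter's API.

```javascript
// Hypothetical helper: builds the same friends query the OrientDB function
// runs, after a light sanity check of the record id format (#<cluster>:<pos>).
function buildFriendsQuery(rid) {
  if (!/^#\d+:\d+$/.test(rid)) {
    throw new Error('invalid record id: ' + rid);
  }
  return "select expand(unionall(outE('IsFriendsWith').inV(), " +
         "inE('IsFriendsWith').outV())) from " + rid;
}

// Possible usage inside the controller action (assumes the adapter
// exposes a raw .query method, which should be confirmed first):
// User.query(buildFriendsQuery('#' + req.param('id')), function (err, friends) {
//   if (err) { return res.serverError(err); }
//   res.json(friends);
// });
```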

Related

How to get value from query localhost:8000/example?q={"user":"admin"} in golang

I have this data from the database:
[
  {
    "name": "joseph",
    "user": "admin"
  },
  {
    "name": "george",
    "user": "visitor"
  },
  {
    "name": "thomas",
    "user": "admin"
  }
]
I want to filter by user via the URL, for example ../testing?q={"user":"admin"}, so that the result contains only the admin users.
First of all, your query string should follow the standard format described at https://en.wikipedia.org/wiki/Query_string — for example ../testing?user=admin rather than a JSON blob. Then you can read the parameter directly:
// r is *http.Request
r.URL.Query().Get("user") // returns the value of the "user" query parameter
This should be sufficient for your use case, but for more details you can refer to https://golangbyexample.com/net-http-package-get-query-params-golang
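Once the parameter has been read, the remaining work is just filtering the rows. A sketch of that step follows, in JavaScript for brevity (the filterByUser helper is illustrative; the Go version would do the same with a loop or slice filter):

```javascript
// Sample rows, mirroring the data in the question.
const rows = [
  { name: 'joseph', user: 'admin' },
  { name: 'george', user: 'visitor' },
  { name: 'thomas', user: 'admin' }
];

// Keep only the rows whose "user" field matches the query parameter value.
function filterByUser(data, user) {
  return data.filter(function (row) {
    return row.user === user;
  });
}

console.log(filterByUser(rows, 'admin').map(function (r) { return r.name; }));
// → [ 'joseph', 'thomas' ]
```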

Post aggregation example query for druid in json

I am trying to use post aggregation. I have used aggregation to count the number of rows which match the given filter. Following is the post aggregation query:
{
  "queryType": "groupBy",
  "dataSource": "datasrc1",
  "intervals": ["2020-09-16T21:15/2020-09-16T22:30"],
  "pagingSpec": { "threshold": 100 },
  "dimensions": ["city", "zip_code", "country"],
  "filter": {
    "fields": [
      {
        "type": "selector",
        "dimension": "bankId",
        "value": "<bank id>"
      }
    ]
  },
  "granularity": "all",
  "aggregations": [
    { "type": "count", "name": "row" }
  ],
  "postAggregations": [
    {
      "type": "arithmetic",
      "name": "sum_rows",
      "fn": "+",
      "fields": [
        { "type": "fieldAccess", "fieldName": "row" }
      ]
    }
  ]
}
If I remove the post aggregation part, it returns me result like:
[
  {
    "version": "v1",
    "timestamp": "2020-09-16T21:15:00.000Z",
    "event": {
      "city": "Sunnyvale",
      "zip_code": "94085",
      "country": "US",
      "row": 1
    }
  },
  {
    "version": "v1",
    "timestamp": "2020-09-16T21:15:00.000Z",
    "event": {
      "city": "Sunnyvale",
      "zip_code": "94080",
      "country": "US",
      "row": 1
    }
  }
]
If I add the post aggregations part, I get parser exception:
{
  "error": "Unknown exception",
  "errorMessage": "Instantiation of [simple type, class io.druid.query.aggregation.post.ArithmeticPostAggregator] value failed: Illegal number of fields[%s], must be > 1 (through reference chain: java.util.ArrayList[0])",
  "errorClass": "com.fasterxml.jackson.databind.JsonMappingException",
  "host": null
}
I want to add all the rows (column 'row') in the response we are getting for aggregation query; and put the output in "sum_rows".
I don't understand what I am missing in post_aggregations. Any help is appreciated.
I confess that I spend most of my time in the SQL API, not in the native API (!!), but I think your issue is that you're only supplying one field to your post aggregator. See these examples:
https://druid.apache.org/docs/latest/querying/post-aggregations.html#example-usage
If you need the sum of rows, perhaps you need a normal aggregator to sum the row count?
The error message says the ArithmeticPostAggregator requires more than one argument; the example code has only one. There's an example of this post aggregator at the bottom of this answer.
However... the example query doesn't have multiple numeric aggregations to perform arithmetic post-aggregation against. Maybe the goal is to "combine" the two output rows into one?
To change the two-row result into a single row with the total row count (for all database records matching the query filter and interval), one way would be to remove zip_code from the dimension list, which would produce a single result like this:
[
  {
    "version": "v1",
    "timestamp": "2020-09-16T21:15:00.000Z",
    "event": {
      "city": "Sunnyvale",
      "country": "US",
      "row": 2
    }
  }
]
As you can see, by submitting a groupBy query with aggregations, Druid will do this aggregation for you dynamically (based on the dimension values in the database at the time the query is run) without needing post aggregations.
Example arithmetic post aggregator:
{
"type": "arithmetic",
"name": "my_output_sum",
"fn": "+",
"fields": [
{"fieldName": "input_addend_1", "type":"fieldAccess"},
{"fieldName": "input_addend_2", "type":"fieldAccess"}
]
}
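The two-operand requirement can be checked before a query is ever submitted. The following JavaScript sanity check is only a sketch (the helper name is illustrative and not part of any Druid client); it encodes the constraint the error message describes:

```javascript
// Sketch: validate an arithmetic post aggregator the way the error message
// implies Druid does -- its "fields" array must contain more than one entry.
function validateArithmeticPostAgg(postAgg) {
  if (postAgg.type !== 'arithmetic') {
    return true; // only the arithmetic type carries this constraint
  }
  return Array.isArray(postAgg.fields) && postAgg.fields.length > 1;
}

// The failing query's post aggregator has one field, so this returns false:
console.log(validateArithmeticPostAgg({
  type: 'arithmetic',
  name: 'sum_rows',
  fn: '+',
  fields: [{ type: 'fieldAccess', fieldName: 'row' }]
}));
// → false
```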

Get all vertices having a labelname

I am using IBM Graph on Bluemix and am new to this.
I created a graph named 'test' using the GUI provided by Bluemix and uploaded the sample 'Music Festival' data provided by IBM into that graph.
Now I am trying to query all the vertices having the label 'attendee' using the query below.
def gt = graph.traversal();
gt.V().hasLabel("attendee");
But I am getting this error:
Error: Error encountered evaluating script def gt = graph.traversal();gt.V().hasLabel("attendee"); with reason com.thinkaurelius.titan.core.TitanException: Could not find a suitable index to answer graph query and graph scans are disabled: [(~label = attendee)]:VERTEX
Not sure what I am doing wrong.
Can somebody tell me where I am going wrong?
How can I get rid of this error and get the expected output?
Thanks
#Radhika, your Gremlin query is a valid Gremlin query. However, some vendors (such as IBM Graph and Titan) chose to only allow users to start their traversals with a step that is indexed. This is to make sure your queries perform well. Calling hasLabel() by itself will give you the Could not find a suitable index... error, as you can't create indexes for labels. What you need to do is follow this step with a step that uses an indexed property, as in this query:
def gt = graph.traversal();
gt.V().hasLabel("band").has("genre","pop");
An index for genre has been created in the schema for the sample music festival data as you can see below
{
  "propertyKeys": [
    { "name": "name", "dataType": "String", "cardinality": "SINGLE" },
    { "name": "gender", "dataType": "String", "cardinality": "SINGLE" },
    { "name": "age", "dataType": "Integer", "cardinality": "SINGLE" },
    { "name": "genre", "dataType": "String", "cardinality": "SINGLE" },
    { "name": "monthly_listeners", "dataType": "String", "cardinality": "SINGLE" },
    { "name": "date", "dataType": "String", "cardinality": "SINGLE" },
    { "name": "time", "dataType": "String", "cardinality": "SINGLE" }
  ],
  "vertexLabels": [
    { "name": "attendee" },
    { "name": "band" },
    { "name": "venue" }
  ],
  "edgeLabels": [
    { "name": "bought_ticket", "multiplicity": "MULTI" },
    { "name": "advertised_to", "multiplicity": "MULTI" },
    { "name": "performing_at", "multiplicity": "MULTI" }
  ],
  "vertexIndexes": [
    { "name": "vByName", "propertyKeys": ["name"], "composite": true, "unique": false },
    { "name": "vByGender", "propertyKeys": ["gender"], "composite": true, "unique": false },
    { "name": "vByGenre", "propertyKeys": ["genre"], "composite": true, "unique": false }
  ],
  "edgeIndexes": [
    { "name": "eByBoughtTicket", "propertyKeys": ["time"], "composite": true, "unique": false }
  ]
}
That's why the above query works and you need to do the same.
1. If you don't have a schema, create one. You can model it after the one above or follow the API doc.
2. Create a (Vertex/Label) index for the properties that you'll start your traversals from. In this example: name, gender and genre for the vertex properties, and time for the edge properties.
3. Call the schema endpoint to add your schema to your graph.
It's recommended to create your schema before adding any data to your graph so that you don't have to reindex later. That'll save you a lot of time.
Once you create your schema, you can't modify what you created already, but you can add new properties/indexes later on.
Look at the following code samples for Java and Nodejs for the exact code to use.
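The index-first rule can also be captured in a small helper that composes the traversal string. This JavaScript sketch is purely illustrative (the helper is hypothetical; the property names come from the sample schema above):

```javascript
// Sketch: always follow hasLabel() with a has() on an indexed property,
// so the traversal starts from an index instead of a full graph scan.
function indexedTraversal(label, indexedProperty, value) {
  return "def gt = graph.traversal(); " +
    "gt.V().hasLabel('" + label + "')" +
    ".has('" + indexedProperty + "', '" + value + "');";
}

// indexedTraversal('band', 'genre', 'pop') reproduces the working query above.
```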
I hope that helps

Meteor MongoDB - can't insert _id with _id from API call

I am trying to call the Twitch API and insert some of the returned data into MongoDB. However, every time I do, I get this error: Error: Meteor requires document _id fields to be non-empty strings or ObjectIDs.
The Twitch API response for a single stream/channel looks like this:
{
  "streams": [
    {
      "_id": 11220687552,
      "game": "League of Legends",
      "viewers": 11661,
      "created_at": "2014-09-30T01:10:36Z",
      "_links": {
        "self": "http://api.twitch.tv/kraken/streams/mushisgosu"
      },
      "preview": {
        "small": "http://static-cdn.jtvnw.net/previews-ttv/live_user_mushisgosu-80x50.jpg",
        "medium": "http://static-cdn.jtvnw.net/previews-ttv/live_user_mushisgosu-320x200.jpg",
        "large": "http://static-cdn.jtvnw.net/previews-ttv/live_user_mushisgosu-640x400.jpg",
        "template": "http://static-cdn.jtvnw.net/previews-ttv/live_user_mushisgosu-{width}x{height}.jpg"
      },
      "channel": {
        "_links": {
          "self": "https://api.twitch.tv/kraken/channels/mushisgosu",
          "follows": "https://api.twitch.tv/kraken/channels/mushisgosu/follows",
          "commercial": "https://api.twitch.tv/kraken/channels/mushisgosu/commercial",
          "stream_key": "https://api.twitch.tv/kraken/channels/mushisgosu/stream_key",
          "chat": "https://api.twitch.tv/kraken/chat/mushisgosu",
          "features": "https://api.twitch.tv/kraken/channels/mushisgosu/features",
          "subscriptions": "https://api.twitch.tv/kraken/channels/mushisgosu/subscriptions",
          "editors": "https://api.twitch.tv/kraken/channels/mushisgosu/editors",
          "videos": "https://api.twitch.tv/kraken/channels/mushisgosu/videos",
          "teams": "https://api.twitch.tv/kraken/channels/mushisgosu/teams"
        },
        "background": null,
        "banner": "http://static-cdn.jtvnw.net/jtv_user_pictures/mushisgosu-channel_header_image-c5c08cce281b7be3-640x125.jpeg",
        "display_name": "MushIsGosu",
        "game": "League of Legends",
        "logo": "http://static-cdn.jtvnw.net/jtv_user_pictures/mushisgosu-profile_image-b1c8bb5fd700025e-300x300.png",
        "mature": false,
        "status": "CLG hi im Gosu - Challenger AD - Smurfing Master!",
        "partner": true,
        "url": "http://www.twitch.tv/mushisgosu",
        "video_banner": "http://static-cdn.jtvnw.net/jtv_user_pictures/mushisgosu-channel_offline_image-7e3401b20cb5d739-640x360.png",
        "_id": 41939266,
        "name": "mushisgosu",
        "created_at": "2013-03-31T21:12:14Z",
        "updated_at": "2014-09-30T03:08:55Z",
        "abuse_reported": null,
        "delay": 60,
        "followers": 318914,
        "profile_banner": null,
        "profile_banner_background_color": null,
        "views": 25963780,
        "language": "en-us"
      }
    }
  ],
  "_total": 8477,
  "_links": {
    "self": "https://api.twitch.tv/kraken/streams?limit=1&offset=0",
    "next": "https://api.twitch.tv/kraken/streams?limit=1&offset=1",
    "featured": "https://api.twitch.tv/kraken/streams/featured",
    "summary": "https://api.twitch.tv/kraken/streams/summary",
    "followed": "https://api.twitch.tv/kraken/streams/followed"
  }
}
This is the part of my server method that tries to insert the data:
Meteor.call('getStreams', function (err, res) {
  var data = res.data.streams;
  console.log(data);
  data.forEach(function (item) {
    console.log(item._id);
    Streams.insert({
      _id: item._id,
      title: item.channel.status,
      author: item.channel.display_name,
      url: item.url
    });
  });
});
getStreams simply defines the URL to call and sets some variables. As you can see, I am console-logging the expected _id, so I know the call returns a valid value, yet I still get the error. Currently each call returns 100 streams, and I iterate through them to save the four fields above. Ideally I would like to save each full stream object as its own entry in the DB, but all my attempts to do that have resulted in the same error. I also read somewhere that the version of "minimongo" bundled with Meteor does not support inserting an array of objects in bulk, and that it does not support Collection.save(), so I think it will be harder later to update the contents of each _id with the latest API data, since I can't use .save() to update and insert in the same statement.
I am not sure if it has any impact but I did try setting autoIndexId to false when creating the collection and it doesn't seem to matter:
Streams = new Meteor.Collection('streams', {autoIndexId: false});
Any insight is appreciated.
The problem is that the Twitch _id is NOT a string; it appears to be a number (I can tell from your JSON output: the number is not surrounded by quotes).
What I'd do is let Meteor generate its own internal Mongo IDs and store the twitch _id as a separate property instead.
Streams.insert({
  twitchId: item._id,
  title: item.channel.status,
  author: item.channel.display_name,
  url: item.url
});
You will have to retrieve the streams by twitchId instead of _id, but that's hardly a problem, right?
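Putting that together, the insert loop only needs a small mapping step. Here is a sketch (toStreamDoc is an illustrative helper, not part of Meteor; note that in the sample response the url actually lives under channel, not on the stream itself):

```javascript
// Sketch: map one Twitch stream object to a document Meteor will accept,
// keeping the numeric Twitch id in its own field so Meteor generates _id.
function toStreamDoc(item) {
  return {
    twitchId: item._id, // numeric Twitch id, stored under its own key
    title: item.channel.status,
    author: item.channel.display_name,
    url: item.channel.url // in the sample response, url lives on channel
  };
}

// Trimmed-down stream object from the sample response:
const doc = toStreamDoc({
  _id: 11220687552,
  channel: {
    _id: 41939266,
    status: 'CLG hi im Gosu - Challenger AD - Smurfing Master!',
    display_name: 'MushIsGosu',
    url: 'http://www.twitch.tv/mushisgosu'
  }
});
console.log(doc.twitchId); // → 11220687552
```

The resulting document has no _id field at all, so each Streams.insert(toStreamDoc(item)) lets Meteor supply its own string _id.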

Record saves, promise rejects with custom REST adapter

I'm writing an ember-data adapter for the DreamFactory services platform and am running into an issue I think is related to my adapter.
When updating an existing record the promise resulting from model.save() is ALWAYS rejected with an error of
Assertion Failed: An adapter cannot assign a new id to a record that already has an id. <App.Event311:1> had id: 1 and you tried to update it with null. This likely happened because your server returned data in response to a find or update that had a different id than the one you sent
Thing is - the request to the REST API and the response back from the REST API have the same ID!
Request (PUT)
{
  "record": {
    "id": "1",
    "title": "Sample Event",
    "date": "7/20/2013",
    "type": "success",
    "desc": "My first sample event."
  }
}
Response
{
  "record": [
    {
      "id": 1,
      "title": "Sample Event",
      "date": "7/20/2013",
      "type": "success",
      "desc": "My first sample event."
    }
  ]
}
The really weird thing is the record still updates properly both in the store AND in the database!
I have a working JSBin at http://emberjs.jsbin.com/mosek/1/edit that illustrates the problem. My custom adapter is on GitHub at https://github.com/ultimatemonty/ember-data-dreamfactory-adapter. The JSBin as well as my app are using Ember 1.7.0 and ED 1.0.0-beta.9
EDIT
The JSBin is attached to my personal hosted instance of DreamFactory - I haven't done anything with it outside of allowing access from JSBin but please be gentle :)
EDIT #2
The updateRecord code is accessible on GitHub at https://github.com/ultimatemonty/ember-data-dreamfactory-adapter/blob/master/lib/ember-data-dreamfactory-adapter.js#L106 but here is the full method for reference:
updateRecord: function(store, type, record) {
  var data = {};
  var serializer = store.serializerFor(type.typeKey);
  serializer.serializeIntoHash(data, type, record);
  var adapter = this;
  return new Ember.RSVP.Promise(function(resolve, reject) {
    // hack to make DSP send back the full object
    adapter.ajax(adapter.buildURL(type.typeKey) + '?fields=*', "PUT", { data: data }).then(function(json) {
      // if the request is a success we'll return the same data we passed in
      resolve(json);
    }, function(reason) {
      reject(reason.responseJSON);
    });
  });
}
The adapter/serializer you're using is expecting you to return a response without the type in it:
{
  "id": 1,
  "title": "Sample Event",
  "date": "7/20/2013",
  "type": "success",
  "desc": "My first sample event."
}
Example: http://emberjs.jsbin.com/tigiza/1/edit
You can see it here in extractSingle, where it tries to wrap the payload in another object with the type specified:
EmberDreamFactoryAdapter.Serializer = DS.RESTSerializer.extend({
  extractArray: function(store, primaryType, payload) {
    var namespacedPayload = {};
    namespacedPayload[Ember.String.pluralize(primaryType.typeKey)] = payload.record;
    return this._super(store, primaryType, namespacedPayload);
  },
  extractSingle: function(store, primaryType, payload, recordId) {
    var namespacedPayload = {};
    namespacedPayload[primaryType.typeKey] = payload;
    return this._super(store, primaryType, namespacedPayload, recordId);
  },
Your response looks like this:
{
  "record": [
    {
      "id": 1,
      "title": "Sample Event",
      "date": "7/20/2013",
      "type": "success",
      "desc": "My first sample event."
    }
  ]
}
Then the serializer kicks in, and it looks like this:
{
  event: {
    "record": [
      {
        "id": 1,
        "title": "Sample Event",
        "date": "7/20/2013",
        "type": "success",
        "desc": "My first sample event."
      }
    ]
  }
}
When really, the serializer should have it looking like this:
{
  event: {
    "id": 1,
    "title": "Sample Event",
    "date": "7/20/2013",
    "type": "success",
    "desc": "My first sample event."
  }
}
As you can see from the second example, the serializer wraps the payload in the type; Ember Data then says, hey, give me the id, and looks at event.id, which is undefined because it actually lives under event.record[0].id.
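The fix in extractSingle amounts to a plain unwrap step applied before the serializer namespaces the payload. Here is a minimal JavaScript sketch of that transformation (unwrapRecord is illustrative, not part of the adapter):

```javascript
// Sketch: strip the {"record": [...]} envelope the API returns, so that
// after the serializer namespaces the result by type, the id sits at
// event.id instead of event.record[0].id.
function unwrapRecord(payload) {
  return Array.isArray(payload.record) ? payload.record[0] : payload.record;
}

const response = { record: [{ id: 1, title: 'Sample Event' }] };
console.log(unwrapRecord(response).id); // → 1
```

Passing unwrapRecord(payload) (instead of the raw payload) into the namespacing step would produce the third shape shown above, which is what Ember Data expects.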