Syntax issue? Passing objects as parameters in a Neo4j Cypher statement (REST)

I am running a query like this using the REST API to the transaction endpoint:
{
  "statements": [{
    "statement": "MATCH (n)-[r]-(m) WHERE id(n) IN {diagramnodes} return [type(r),labels(m)]",
    "parameters": {
      "diagramnodes": [28]
    }
  }]
}
which returns the expected result:
{
  "commit": "http://myserver:7474/db/data/transaction/542/commit",
  "results": [
    {
      "columns": [
        "[type(r),labels(m)]"
      ],
      "data": [
        { "row": [ [ "CONTAINS", [ "Sentence" ] ] ] },
        { "row": [ [ "CONTAINS", [ "Prologram", "Diagram" ] ] ] },
        .......
      ]
    }
  ],
  "transaction": {
    "expires": "Sun, 07 Sep 2014 17:50:11 +0000"
  },
  "errors": []
}
When adding another parameter and a filter to limit the types of rels that are returned:
{"statements": [{
"statement": "MATCH (n)-[r]-(m) WHERE id(n) IN {diagramnodes} AND [type(r),labels(m)] IN {includerels} return r ",
"parameters": {
"diagramnodes": [28],
"includerels": [
[
"CONTAINS",
[
"Prologram",
"Diagram"
]
],
[
"HAS_TARGET",
["Term"]
]
]
}
}]}
it does not return any results. Why?

I found a workaround: concatenating the reltype and labels and comparing the result to a collection of primitive values. This is the Cypher (I added some line breaks to make it easier to read):
{
  "statements": [{
    "statement": "
      MATCH (n)-[r]-(m)
      WHERE id(n) IN {diagramnodes}
      WITH type(r) AS rtype, REDUCE(acc = '', p IN labels(m) | acc + ' ' + p) AS mlabels, m
      WITH rtype + mlabels AS rtypemlabels, m
      WHERE rtypemlabels IN {includerels}
      RETURN rtypemlabels, id(m)",
    "parameters": {
      "diagramnodes": [28],
      "includerels": ["HAS_TARGET Term", "CONTAINS Sentence", "CONTAINS Prologram Diagram"]
    }
  }]
}
Note 1: type(r) + REDUCE(acc = '', p IN labels(m) | acc + ' ' + p) does not work; you have to insert an additional WITH.
Note 2: comparing a collection of nested objects with an IN clause should be possible and remains on my wish list. ;)

IN operations very probably only work for collections of primitive values.
What you can try is to rewrite it into an ANY(x IN coll WHERE expr(x)) predicate.
for an input like:
[["CONTAINS",["Prologram","Diagram"]],
["HAS_TARGET",["Term"]]]
you can try:
ANY(entry IN {includerels} WHERE type(r) = entry[0] AND ALL(l IN labels(m) WHERE l IN entry[1]))
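Plugged back into the transactional payload from the question, that would look something like this (a sketch, untested; it assumes the ANY/ALL rewrite above captures the intended membership test):
{
  "statements": [{
    "statement": "MATCH (n)-[r]-(m) WHERE id(n) IN {diagramnodes} AND ANY(entry IN {includerels} WHERE type(r) = entry[0] AND ALL(l IN labels(m) WHERE l IN entry[1])) RETURN r",
    "parameters": {
      "diagramnodes": [28],
      "includerels": [
        [ "CONTAINS", [ "Prologram", "Diagram" ] ],
        [ "HAS_TARGET", [ "Term" ] ]
      ]
    }
  }]
}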

You can use UNWIND on your parameters array instead of using IN. Depending on your data, you might have to use DISTINCT as well, but UNWIND works well for me.
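For example, a minimal sketch of the UNWIND variant (untested; note that = on lists is order-sensitive, so this assumes labels(m) comes back in the same order as the label lists in the parameter):
{
  "statements": [{
    "statement": "UNWIND {includerels} AS entry MATCH (n)-[r]-(m) WHERE id(n) IN {diagramnodes} AND type(r) = entry[0] AND labels(m) = entry[1] RETURN DISTINCT r",
    "parameters": {
      "diagramnodes": [28],
      "includerels": [
        [ "CONTAINS", [ "Prologram", "Diagram" ] ],
        [ "HAS_TARGET", [ "Term" ] ]
      ]
    }
  }]
}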

Related

PostgreSQL: get keys from array of objects in JSONB field

Here's some dummy data for the jsonb column:
[ { "name": [ "sun11", "sun12" ], "alignment": "center", "more": "fields" }, { "name": [ "sun12", "sun13" ], "alignment": "center" }, { "name": [ "sun14", "sun15" ] }]
I want to fetch the value of every name key from the jsonb array of objects. Expected output:
[ [ "sun11", "sun12" ], [ "sun12", "sun13" ], [ "sun14", "sun15" ] ]
The problem is that I'm only able to fetch a name key's value by giving its index, like 0, 1, etc.:
SELECT data->0->'name' FROM public."user";
[ "sun11", "sun12" ]
But I'm not able to get all the name values from the same array of objects. I just want to get all the name values from the array of JSON objects. Any help will be helpful. Thanks.
demo: db<>fiddle (final query first, intermediate steps below)
WITH data AS (
SELECT '[ { "name": [ "sun11", "sun12" ], "alignment": "center", "more": "fields" }, { "name": [ "sun12", "sun13" ], "alignment": "center" }, { "name": [ "sun14", "sun15" ] }]'::jsonb AS jsondata
)
SELECT
jsonb_agg(elems.value -> 'name') -- 2
FROM
data,
jsonb_array_elements(jsondata) AS elems -- 1
jsonb_array_elements() expands every array element into one row.
The -> operator gives the array for the attribute name; after that, jsonb_agg() aggregates all extracted arrays back into a single array.
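Applied to the actual table instead of the CTE, the final query might look like this (a sketch, assuming the jsonb column is named data as in the question's SELECT data->0->'name' example):
SELECT jsonb_agg(elems.value -> 'name')
FROM public."user",
     jsonb_array_elements(data) AS elems;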
My example:
SELECT DISTINCT sub.name FROM (
    SELECT
        jsonb_build_object('name', u.data->'name') AS name
    FROM public."user" AS u
    WHERE u.data IS NOT NULL
) sub
WHERE sub.name != '{"name": null}';

Update nested arrays from a flat file

Seems this question is a very popular one, but I haven't found an answer (or at least not one I was able to understand).
I have a flat file that I would like to store in Mongo using some nesting. Though that is relatively easy to achieve with an insert and unique content, I need to update the content on a regular basis, so I also need to be able to use the update commands.
My flat file looks as follows:
Model,Category,Organisation,CountryCode,CountryWarranty,PeriodCode,PeriodQty
Model1,Category1,Org1,Code1,2Y,201707,1
Model1,Category1,Org1,Code1,2Y,201708,2
Model1,Category1,Org1,Code1,1Y,201709,3
Model1,Category1,Org1,Code2,2Y,201707,7
Model1,Category1,Org1,Code2,2Y,201708,8
Model1,Category1,Org1,Code2,5Y,201709,7
Model1,Category1,Org2,Code3,2Y,201707,5
Model1,Category1,Org2,Code3,4Y,201708,6
Model1,Category1,Org2,Code3,2Y,201709,7
...
Model_n,Category_n,Org_n,Code_n,3Y,201802,20
and what I would like to achieve is the following:
{
"_id": "Model1",
"Model_category": "Category1",
"Product_Sales": [
{
"Organisation": "Org1",
"Country": [
{
"Code": "Code1",
"Guarantee_Years": "2Y",
"Period": [
{"Code": 201707,"Qty": 1},
{"Code": 201708,"Qty": 2},
{"Code": 201709,"Qty": 3}
]
}, {
"Code": "Code2",
"Guarantee_Years": "2Y",
"Period": [
{"Code": 201707,"Qty": 7},
{"Code": 201708,"Qty": 8},
{"Code": 201709,"Qty": 7}
]
}
]
}, {
"Organisation": "Org2",
"Country": [
{
"Code": "Code3",
"Guarantee_Years": "2Y",
"Period": [
{"Code": 201707,"Qty": 5},
{"Code": 201708,"Qty": 6},
{"Code": 201709,"Qty": 7}
]
}
]
}
]
}
Below is a snippet of what I tried. Note that the syntax is specific to my development environment, so I know it is not workable or proper Mongo, but it's about the basic idea. Any example using the console will do fine for me.
concat("{update: "master_Sales",
updates: [
{
q:{"_id":", %{_id},""},
u:{$addToSet: {
"Product_Sales.Organisation": "", %{org}, "",
"Product_Sales.Organisation.Country": [
-- more here but have no clue --
]
}}
, upsert: true}
]}"
)
Adding my organisations works fine, but as soon as I want to add a second level (nested within an org) it goes wrong.
So in essence I want to be able to add this flat content to my Mongo in a nested array structure, and each time one of the values changes in the future (say the quantity is updated, or a new country is added), that line should be added or updated, so I am not forced to do a full refresh and insert each time a line is modified.
What would be the best approach to deal with this?
say the quantity is updated, or a new country is added
You can try the update queries below in MongoDB 3.6.
To update Qty for Organisation/Country Code/Period Code Org1/Code1/201707:
db.collection.update(
  { },
  { "$set": { "Product_Sales.$[org].Country.$[country].Period.$[period].Qty": 2 } },
  { arrayFilters: [ { "org.Organisation": "Org1" }, { "country.Code": "Code1" }, { "period.Code": 201707 } ] }
)
To add a new Country to Organisation Org2:
db.collection.update(
  { },
  { "$push": { "Product_Sales.$[org].Country": { "Code": "Code4", "Guarantee_Years": "2Y" } } },
  { arrayFilters: [ { "org.Organisation": "Org2" } ] }
)
This can be simplified to:
db.collection.update(
  { "Product_Sales.Organisation": "Org2" },
  { "$push": { "Product_Sales.$.Country": { "Code": "Code4", "Guarantee_Years": "2Y" } } }
)
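For the remaining case from the question, adding an organisation that does not exist yet, a minimal sketch (field names taken from the desired document above; "Org3" is a hypothetical new value, and the $ne guard keeps the push from creating duplicates):
db.collection.update(
  { "_id": "Model1", "Product_Sales.Organisation": { "$ne": "Org3" } },  // "Org3" is hypothetical
  { "$push": { "Product_Sales": { "Organisation": "Org3", "Country": [] } } }
)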

MongoDB: check regex on fields from one collection against all fields in another collection

After digging through Google and SO for a week, I've ended up asking the question here. Suppose there are two collections:
UsersCollection:
[
{...
name:"James"
userregex: "a|regex|str|here"
},
{...
name:"James"
userregex: "another|regex|string|there"
},
...
]
PostCollection:
[
{...
title:"a string here ..."
},
{...
title: "another string here ..."
},
...
]
I need to get all users whose userregex matches any post.title (I need user_id, post_id groups or something similar).
What I've tried so far:
1. Get all users in the collection and run each regex against all posts. It works, but it's too dirty: it has to execute a query for each user.
2. Same as above, but using a forEach in the Mongo query; it's the same thing, only at the database layer instead of the application layer.
I searched a lot for available methods such as aggregation, $unwind, etc., with no luck.
So is it possible to do this in Mongo? Should I change my database type? If yes, what type would be good? Performance is my first priority. Thanks.
It is not possible to reference a regex stored in the document from the regex operator inside a match expression, so it can't be done on the Mongo side with the current structure.
$lookup works well with equality conditions. So one alternative (similar to what Nic suggested) would be to update your post collection to include an extra field called keywords (an array of the keyword values it can be searched on) for each title.
db.users.aggregate([
{$lookup: {
from: "posts",
localField: "userregex",
foreignField: "keywords",
as: "posts"
}
}
])
The above query will do something like this (works from 3.4).
keywords: { $in: [ userregex.elem1, userregex.elem2, ... ] }.
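Spelled out for James's userregex from the sample input below, that condition behaves roughly like this find() query (a sketch):
db.posts.find({ keywords: { $in: ["another", "here"] } })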
From the docs
If the field holds an array, then the $in operator selects the documents whose field holds an array that contains at least one element that matches a value in the specified array (e.g. <value1>, <value2>, etc.)
It looks like earlier versions (tested on 3.2) will only match if the arrays have the same order, values, and length.
Sample Input:
Users
db.users.insertMany([
{
"name": "James",
"userregex": [
"another",
"here"
]
},
{
"name": "John",
"userregex": [
"another",
"string"
]
}
])
Posts
db.posts.insertMany([
{
"title": "a string here",
"keyword": [
"here"
]
},
{
"title": "another string here",
"keywords": [
"another",
"here"
]
},
{
"title": "one string here",
"keywords": [
"string"
]
}
])
Sample Output:
[
{
"name": "James",
"userregex": [
"another",
"here"
],
"posts": [
{
"title": "another string here",
"keywords": [
"another",
"here"
]
},
{
"title": "a string here",
"keywords": [
"here"
]
}
]
},
{
"name": "John",
"userregex": [
"another",
"string"
],
"posts": [
{
"title": "another string here",
"keywords": [
"another",
"here"
]
},
{
"title": "one string here",
"keywords": [
"string"
]
}
]
}
]
MongoDB is good for your use case, but you need an approach different from your current one. Since you are only concerned about any title matching any post, you can store the last result of such a match. Below is example code:
db.users.find({last_post_id: {$exists: 0}}).forEach(
function(row) {
var regex = new RegExp(row['userregex']);
var found = db.post_collection.findOne({title: regex});
if (found) {
post_id = found["post_id"];
db.users.updateOne({
user_id: row["user_id"]
}, {
$set :{ last_post_id: post_id}
});
}
}
)
What it does is filter users which don't have last_post_id set, search the post records for each of them, and set last_post_id if a record is found. After running this, you can return the results like:
db.users.find({last_post_id: {$exists: 1}}, {user_id:1, last_post_id:1, _id:0})
The only thing you need to be concerned about is an edit/delete to an existing post. After every edit/delete, you should just run the code below, so that all matches for that post id are evaluated again.
post_id_changed = 1
db.users.updateMany({last_post_id: post_id_changed}, {$unset: {last_post_id: 1}})
This will make sure that these users are processed again the next time you run the update. The approach does have one drawback: for every user without a matching title, the query for such users will run again and again, though you can work around that by using timestamps or a post-count check.
Also, you should make sure to put an index on post_collection.title.
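A minimal sketch of that index (collection and field names as used in the code above); note that an unanchored regex like the one built above can still scan every index key rather than seek, so the index helps but does not make the match cheap:
db.post_collection.createIndex({ title: 1 })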
I was thinking that if you pre-tokenized your post titles like this:
{
"_id": ...
"title": "Another string there",
"keywords": [
"another",
"string",
"there"
]
}
but unfortunately $lookup requires that foreignField is a single element, so my idea of something like this will not work :( But maybe it will give you another idea?
db.Post.aggregate([
{$lookup: {
from: "Users",
localField: "keywords",
foreignField: "keywords",
as: "users"
}
},
])

Does MongoDB support GIS geofencing with $geoWithin?

I would like to have a MongoDB collection and each document contains a geospatial polygon defined by latitude/longitude points (in GeoJSON). Then, I would like to take any given longitude/latitude point and check if it resides within any of the MongoDB polygons defined in the documents. Hypothetically, this is what the documents would look like.
{
"type" : "congressional",
"points" : [
{ "coords" : [
-141.0205,
70.0187 ]
},
...
{ "coords" : [
-141.0205,
70.0187 ]
}
]
}
Or maybe like so:
{ loc :
{ type : "Polygon" ,
coordinates : [ [ [ 0 , 0 ] , [ 3 , 6 ] , [ 6 , 1 ] , [ 0 , 0 ] ] ]
} }
And then I would query it, hypothetically, like so (most likely with Mongo's $geoWithin):
db.places.find( { loc : { $geoWithin : { $geometry : "EACH DOCUMENT IN COLLECTION"} } } )
Is geofencing, or something similar, possible to do with the current MongoDB feature-set? If so, how would it be done?
I believe you would have to first find every document in the collection, and then make a $geoWithin query for every document, passing in the polygon to test against in each case.
Depending on the number of documents in your collection, that may or may not provide sufficient performance.
MongoDB has full support for geofencing, i.e. finding documents whose geometry intersects with a given geometry (point or polygon). The query below is an example; geometry is the field of Collection containing the geometry.
Document Example:
{ "geometry": {
"type": "Polygon",
"coordinates": [
[
[
-74.001487,
40.692346
],
[
-74.001755,
40.692057
],
[
-74.000977,
40.691951
],
[
-74.000827,
40.692297
],
[
-74.001487,
40.692346
]
]
]
}
}
JS Query:
// find quest bots that match the user's location
await Collection.find({
  geometry: {
    $geoIntersects: {
      $geometry: {
        type: "Point",
        coordinates: [
          -73.99460599999999,
          40.7347229
        ]
      }
    }
  }
});
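For this to perform well on a non-trivial collection you would normally also create a 2dsphere index on the geometry field; a minimal sketch in shell syntax (the collection name is whatever backs the Collection model above):
db.collection.createIndex({ "geometry": "2dsphere" })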

Get a list of all unique tags in MongoDB

I am beginning with MongoDB and have a collection with documents that look like the following:
{
"type": 1,
"tags": ["tag1", "tag2", "tag3"]
}
{
"type": 2,
"tags": ["tag2", "tag3"]
}
{
"type": 3,
"tags": ["tag1", "tag3"]
}
{
"type": 1,
"tags": ["tag1", "tag4"]
}
With this, I want a set of all the tags for a particular type. For example, for type 1, I want the set of tag1, tag2, tag3, tag4 (any order).
All I could think of is to get the tags and add them to a set in Python, but I wanted to know if there is a way to do it with MongoDB's mapReduce or something else. Please advise.
If you just want a (distinct) list of the tags, then using distinct will be best. Map/Reduce will be slower and can't use an index for the JavaScript part.
http://docs.mongodb.org/manual/reference/method/db.collection.distinct/
db.coll.distinct("tags", { type: 1 })
will return the set of tags for type=1.
You are right, a Map/Reduce might work for what you are trying to accomplish, but a Set might be faster and less code.
> m = function() {
... for (var tag in this.tags) {
... emit(this.tags[tag], 1);
... }
... }
> r = function(key, values) {
... return 1;
... }
> db.tags.mapReduce(m, r).find()
{ "_id" : "tag1", "value" : 1 }
{ "_id" : "tag2", "value" : 1 }
{ "_id" : "tag3", "value" : 1 }