Pagination of jsonb elements in postgres - postgresql

I have a table with the following format
id: uuid
versionid: uuid
data: jsonb
the sample data can be ('some uuid', 'some other uuid', '{"files": [{"name": "file1"}], "users": [{"age": 18}]}');
How can I query the data to return just a single row of
{
  id: "some_uuid",
  versionid: "some_uuid",
  data: {
    "files": [{"name": "file1"}],
    "users": [{"age": 18}]
  }
}
and be able to limit the number of elements returned in each of the above arrays (files and users) — essentially paginating them: return some now, then more later, via some implementation of limit and offset or some form of array index access.
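One way to do this (a sketch, not a definitive implementation — `your_table` and the page size/offset values are placeholders) is to unpack each array with `jsonb_array_elements`, apply `LIMIT`/`OFFSET`, and re-aggregate with `jsonb_agg`:

```sql
-- Paginate the "files" and "users" arrays independently;
-- LIMIT 10 OFFSET 0 stands in for your page size and start.
SELECT id,
       versionid,
       jsonb_build_object(
         'files',
         (SELECT COALESCE(jsonb_agg(e), '[]'::jsonb)
          FROM (SELECT e
                FROM jsonb_array_elements(data->'files') WITH ORDINALITY AS t(e, n)
                ORDER BY n
                LIMIT 10 OFFSET 0) s(e)),
         'users',
         (SELECT COALESCE(jsonb_agg(e), '[]'::jsonb)
          FROM (SELECT e
                FROM jsonb_array_elements(data->'users') WITH ORDINALITY AS t(e, n)
                ORDER BY n
                LIMIT 10 OFFSET 0) s(e))
       ) AS data
FROM your_table
WHERE id = 'some uuid';
```

Each array is paginated independently; `WITH ORDINALITY` preserves the original element order, and `COALESCE` returns an empty array once you page past the end.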

Related

Sort by json element for jsonb data is not using index (Btree/GIN) - postgresql

I have the table below in PostgreSQL, which stores JSON data in a jsonb column:
CREATE TABLE "Trial" (
  id SERIAL PRIMARY KEY,
  data jsonb
);
Below is the sample json structure
{
  "id": "000000007001593061",
  "core": {
    "groupCode": "DVL",
    "productType": "ZDPS",
    "productGroup": "005001000"
  },
  "plants": [
    {
      "core": {
        "mrpGroup": "ZMTS",
        "mrpTypeDesc": "MRP",
        "supLeadTime": 777
      },
      "storageLocation": [
        { "core": { "storageLocation": "H050" } },
        { "core": { "storageLocation": "H990" } },
        { "core": { "storageLocation": "HM35" } }
      ]
    }
  ],
  "discriminator": "Material"
}
There are around 8 million records with similar JSON data.
I created a GIN index:
CREATE INDEX idx_trial_data_jsonpath ON "Trial" USING GIN (data jsonb_path_ops);
I also tried a B-Tree index on the specific JSON element I want to use in the ORDER BY:
CREATE INDEX idx_trial_data_discriminator ON "Trial" USING btree ((data ->> 'discriminator'));
But ORDER BY seems to ignore indexes on the jsonb column. Below are the query and its execution plan, where I can clearly see a sequential scan behind the query execution instead of any of the indexes I created. Does anybody know why ORDER BY is not using the GIN or B-Tree index created for the jsonb column?
EXPLAIN ANALYZE
SELECT id, data
FROM "Trial"
ORDER BY data->'discriminator' DESC
LIMIT 100;
Execution Plan of order by query
Need assistance on order by query to use index for jsonb column
Your index does not match your query. ->> and -> are different operators. Make the index match the query or vice versa and it can be used.
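Concretely, either rewrite the query to use the same `->>` expression the B-Tree index was built on, or create a second index on the `->` expression used in the query (the second index name below is illustrative):

```sql
-- Option 1: make the query match the existing ->> expression index
SELECT id, data
FROM "Trial"
ORDER BY data->>'discriminator' DESC
LIMIT 100;

-- Option 2: make an index match the query's -> expression
CREATE INDEX idx_trial_data_discriminator_j
  ON "Trial" USING btree ((data -> 'discriminator'));
```

Note that `->>` returns text while `->` returns jsonb, so the two sort orders are not necessarily identical; pick whichever semantics you actually want, then keep the index and the ORDER BY expression identical.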

Fetch second max date json object using the SQL query

I'm trying to fetch the JSON object with the second-maximum date from a jsonb column.
Here is the jsonb column:
--------
value
--------
{
  "id": "90909",
  "records": [
    { "name": "john",  "date": "2016-06-16" },
    { "name": "kiran", "date": "2017-06-16" },
    { "name": "koiy",  "date": "2018-06-16" }
  ]
}
How can I select the JSON object with the second-maximum date?
Expected output:
{
"name":"kiran",
"date": "2017-06-16"
}
And if we have only one object inside records, that one should count as the second max date.
Any suggestions would be helpful.
My main suggestion would be this: If your data is structured, do not store it in a JSON. It will be much easier to work with it if you structure it as relational tables.
But anyhow, here's one way to get the second-latest-date object. First unpack the array, then sort by the date and take the second to last:
SELECT obj.*
FROM your_table, jsonb_array_elements(value->'records') obj
ORDER BY obj->'date' DESC
LIMIT 1 OFFSET 1;
value
-----------------------------------------
{"date": "2017-06-16", "name": "kiran"}
(1 row)

Is there a way to implement relay-style cursor based pagination in PostgreSQL?

I'm trying to find a way to implement relay-style cursor-based pagination using PostgreSQL. In this scheme, I would order my results based on certain criteria over a number of columns. After I get the results, I would use the column values of each retrieved record to build a cursor for that record.
My question is:
If I have a record with column values such as this:
[ { lastName: "Turing", age: "50", id: "100" } ]
and it was retrieved with an order such as this:
[ { lastName: "asc", age: "desc", id: "asc" } ]
If I save the criteria and ordering by which a record was retrieved, is it possible to get all the results that come after that record in this ordering, even if the record itself has since been deleted?
I was thinking of using rank(), but I think that breaks pagination when records are deleted.
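Yes — keyset (cursor) pagination does not depend on the anchor row still existing, because the WHERE clause only compares against the saved column values. Since the directions here are mixed (asc, desc, asc), a single row-value comparison like `(a, b, c) > (x, y, z)` cannot be used; the condition has to be spelled out per column. A sketch, with hypothetical table and column names, assuming the sort columns are NOT NULL:

```sql
-- Everything strictly after the cursor
-- (lastName = 'Turing', age = 50, id = 100)
-- under the ordering lastName ASC, age DESC, id ASC.
SELECT last_name, age, id
FROM people
WHERE last_name > 'Turing'
   OR (last_name = 'Turing' AND age < 50)
   OR (last_name = 'Turing' AND age = 50 AND id > 100)
ORDER BY last_name ASC, age DESC, id ASC
LIMIT 20;
```

The deleted anchor row simply matches none of the branches, so the page boundary stays stable. A composite index on `(last_name ASC, age DESC, id ASC)` lets this run as an index range scan.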

Find & Update partial nested collection

Assume I have a Mongo collection as such:
The general schema: There are Categories, each Category has an array of Topics, and each Topic has a Rating.
[
{CategoryName: "Cat1", ..., Topics: [{TopicName: "T1", rating: 9999, ...},
{TopicName: "T2", rating: 42, ....}]},
{CategoryName: "Cat2", ... , Topics: [...]},
...
]
In my client-side meteor code, I have two operations I'd like to execute smoothly, without any added filtering to be done: Finding, and updating.
I'm imagining the find query as follows:
.find({CategoryName: "Cat1", Topics: [{TopicName: "T1"}]}).fetch()
This will, however, return the whole document - The result I want is only partial:
[{CategoryName: "Cat1", ..., Topics: [{TopicName: "T1", rating: 9999, ...}]}]
Similarly, with updating, I'd like a query somewhat as such:
.update({CategoryName: "Cat1", Topics: [{TopicName: "T1"}]}, {$set: {Topics: [{rating: infinityyy}]}})
To only update the rating of the topic T1, and not all topics of category Cat1.
Again, I'd like to avoid any filtering, as the rest of the data should not even be sent to the client in the first place.
Thanks!
You need to amend your query to the following:
Categories.find(
{ CategoryName: 'Cat1', 'Topics.TopicName': 'T1' },
{ fields: { 'Topics.$': 1 }}, // make sure you put any other fields you want in here too
).fetch()
What this query does is search for a Category whose name matches Cat1 and which contains an object with TopicName equal to T1 inside the Topics array.
In the fields projection we are using the $ symbol to tell MongoDB to return the object that was found as part of the query, and not all the objects in the Topics array.
To update this nested object you need to use the same $ symbol:
Categories.update(
  { CategoryName: 'Cat1', 'Topics.TopicName': 'T1' },
  { $set: { 'Topics.$.rating': 100 } },
);
Hope that helps

Find holes in sequential MongoDB field

I have a MongoDB collection with a sequential integer field.
I need to find the best approach to locate "holes" in that sequence, which appear due to record deletion.
For example if I have collection with these documents:
{ _id: "aab", seq: 1 ... }, { _id: "aac", seq: 2 ... }, { _id: "aad", seq: 4 ... }
The next insert I do, needs to be:
{ _id: "aae", seq: 3 ... }
Maybe you can create a MongoDB stored function to achieve this.
You can create a JavaScript stored function that records the sequence numbers of deleted documents in a collection such as "StoreDeletedSequenceNumCollection".
Whenever you perform a delete operation on the collection, make sure you call the stored function, which saves the sequence number of the document being deleted into "StoreDeletedSequenceNumCollection".
db.system.js.save({
  _id: "DeletedSeqNum",
  value: function (seqNum) {
    db.StoreDeletedSequenceNumCollection.insert({ _id: seqNum });
    return 1;
  }
});
When inserting a document, check whether any sequence numbers are present in "StoreDeletedSequenceNumCollection". If so, take the MINIMUM sequence number from it and use that for the insertion. Otherwise, if "StoreDeletedSequenceNumCollection" is empty, take the MAXIMUM sequence number from your actual collection and use MAXIMUM + 1 as the sequence number for the insertion.
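Putting that insert path together, here is a mongo-shell sketch; the data collection name `items` is an assumption for illustration:

```javascript
// Sketch of the insert logic described above (mongo shell syntax).
function nextSeq() {
  // Reuse the smallest recorded hole, if any exists
  var hole = db.StoreDeletedSequenceNumCollection
               .find().sort({ _id: 1 }).limit(1).toArray()[0];
  if (hole) {
    // Claim the hole so it is not reused by a later insert
    db.StoreDeletedSequenceNumCollection.remove({ _id: hole._id });
    return hole._id;
  }
  // No holes: continue from the current maximum
  var max = db.items.find().sort({ seq: -1 }).limit(1).toArray()[0];
  return max ? max.seq + 1 : 1;
}

db.items.insert({ _id: "aae", seq: nextSeq() });
```

Note this find-then-remove sequence is not atomic, so two concurrent inserts could claim the same hole; if that matters, a `findAndModify` with `remove: true` on the holes collection would be the safer variant.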