I have a table called user, and inside it there is a field called friends. This field is of json type and has a value like the following example:
{"blockList": {"199": {"date": 1453197190, "status": 1}, "215": {"date": 1459325611, "status": 1}, "219": {"date": 1454244074, "status": 1}, "225": {"date": 1453981312, "status": 1}, "229": {"date": 1459327685, "status": 1}}, "followers": {"211": {"date": 1452503369}, "219": {"date": 1452764627}, "334": {"date": 1456396375}}, "following": {"215": {"date": 1459325619}, "219": {"date": 1453622322}, "226": {"date": 1454244887}, "229": {"date": 1459327691}}, "friendList": {"213": {"date": 1453622410, "type": 2, "status": 1}, "214": {"date": 1452763643, "status": 1}, "215": {"date": 1455606872, "type": 2, "status": 2}, "218": {"date": 1453280047, "status": 1}, "219": {"date": 1453291227, "status": 2}, "221": {"date": 1453622410, "type": 2, "status": 1}, "224": {"date": 1453380152, "type": 2, "status": 1}, "225": {"date": 1453709357, "type": 2, "status": 2}, "226": {"date": 1454244088, "type": 2, "status": 1}, "229": {"date": 1454326745, "type": 2, "status": 2}}}
This record has a blockList object that contains an object for each blocked user.
What I need is to return an array of all the blockList keys, like this:
["199", "215", "219", "225", "229"]
Any help with how I can write a PL/pgSQL function to do that (return all the object keys in an array)?
I'm a beginner in PostgreSQL and would appreciate any help.
Use json_object_keys to get a set containing the outermost keys of a json object (so you'll need to select the object under the blockList key, which you can do with friends->'blockList'), and use array_agg to aggregate them into an array:
SELECT ARRAY_AGG(f)
FROM (
SELECT json_object_keys(friends->'blockList') f
FROM users
) u;
┌───────────────────────┐
│ array_agg │
├───────────────────────┤
│ {199,215,219,225,229} │
└───────────────────────┘
(1 row)
Note:
If you're using the jsonb type (and not the json one) you'll need to use the jsonb_object_keys function.
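For example, a jsonb version of the same query would look like this (a sketch assuming the same users table as above):
SELECT array_agg(f)
FROM (
  SELECT jsonb_object_keys(friends->'blockList') AS f
  FROM users
) u;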
SELECT array_agg(ks) FROM (
SELECT json_object_keys(friends->'blockList') AS ks
FROM users
) x
I have created a SQL fiddle here to demonstrate.
Note: user is a reserved word, so I have called the table users.
I'm late to the party, but I would like to propose the following way, borrowed from Erwin here:
SELECT ARRAY(SELECT json_object_keys(friends->'blockList')) FROM users;
I'm using PostgreSQL 13 and I have a table that has a JSONB column and GIN index.
My question is whether there is a better way to write the following query: I have multiple IDs, and sometimes I need to look several of them up at once.
My Schema:
id timestamp data
404599 2022-01-01 00:15:02.566 {"env": "worker", "id":1, "name": "foo", "lang": "en"}
404600 2022-01-01 00:15:02.685 {"env": "worker", "id":2, "name": "foo", "lang": "fr"}
404601 2022-01-01 00:15:02.808 {"env": "worker", "id":3, "name": "foo", "lang": "ru"}
404602 2022-01-01 00:15:03.023 {"env": "worker", "id":3, "name": "foo", "lang": "de"}
404603 2022-01-01 00:15:03.170 {"env": "worker", "id":4, "name": "foo", "lang": "tr"}
My Query:
select * from foo where data @> '{"id": 1}' or data @> '{"id": 2}' or data @> '{"id": 3}';
I want to use the GIN index on the "data" column, but I couldn't find a better way to write this, something that works like the IN operator.
That is the best way to query with a GIN index on the JSON column.
If all your queries look like that, using a b-tree index would be more efficient:
CREATE INDEX ON foo ((data ->> 'id'));
SELECT * FROM foo WHERE data ->> 'id' IN ('1', '2', '3');
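Since ->> returns text, the literals above have to be quoted. If you would rather compare numerically, a possible variation (my assumption, not something the original setup requires) is to index and compare the value as an integer:
-- hypothetical variant: compare the id as an integer instead of text
CREATE INDEX ON foo (((data ->> 'id')::int));
SELECT * FROM foo WHERE (data ->> 'id')::int IN (1, 2, 3);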
I have a hypothetical table with the following information about the cost of vehicles, and I am trying to model the data for storage in an Expenses collection in MongoDB:
Category   Item             Cost
Land       Car              1000
Land       Motorbike        500
Air        Plane            2000
Air        Others: Rocket   5000
One assumption for this use case is that the Categories and Items are fixed fields in the table, while users will fill in the Cost for each specific Item. Should there be other vehicles in a category, users will fill them in under "Others".
Currently, I am considering 2 options for storing the document:
Option 1 - as a nested object:
[
  {
    "category": "land",
    "items": [
      {"name": "Car", "cost": 1000},
      {"name": "Motorbike", "cost": 500}
    ]
  },
  {
    "category": "air",
    "items": [
      {"name": "Plane", "cost": 2000},
      {"name": "Others", "remarks": "Rocket", "cost": 5000}
    ]
  }
]
Option 2 - as a flattened array, where the React application will map the array to render the data in the table:
[
{"category": "land", "item": "car", "cost": 1000},
{"category": "land", "item": "motorbike", "cost": 500},
{"category": "air", "item": "plane", "cost": 2000},
{"category": "air", "item": "others", "remarks": "rocket", "cost": 5000},
]
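For reference, on the React side I would map the Option 2 array into table rows roughly like this (the component and type names below are just placeholders):
import React from "react";

// shape of one row in the flattened array from Option 2
type Expense = { category: string; item: string; cost: number; remarks?: string };

function ExpenseTable({ rows }: { rows: Expense[] }) {
  return (
    <table>
      <tbody>
        {rows.map((r, i) => (
          <tr key={i}>
            <td>{r.category}</td>
            {/* show the remarks next to "Others" entries */}
            <td>{r.remarks ? `${r.item}: ${r.remarks}` : r.item}</td>
            <td>{r.cost}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}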
I was hoping to get suggestions on which is the better approach, or whether there is a better approach that you have in mind.
Thanks in advance! :)
I have a postgres 9.6 table which has a json field config. I want to fetch records from this table where the json has a particular key value pair.
My table is as follows
CREATE TABLE features(
id integer NOT NULL,
plan character,
config json NOT NULL
)
In the json field, I am storing JSON of the form:
[
{ "name": "A", "state": "active"},
{ "name": "B", "state": "inactive"},
{ "name": "C", "state": "active"}
]
Now, I am querying the database to fetch all the records for which the json field contains the key-value pair { "name": "B", "state": "inactive"}.
My query is as follows
select * from features where config @> '[{ "name": "B", "state": "inactive"}]';
However, I get an error
ERROR: operator does not exist: json @> unknown
Any idea where I am going wrong here? Pointers will be highly appreciated. TIA!
The @> containment operator is only available for the jsonb data type:
CREATE TABLE features(
id integer NOT NULL,
plan character,
config jsonb NOT NULL
);
CREATE
insert into features values(1,'a',' [ { "name": "A", "state": "active"}, { "name": "B", "state": "inactive"}, { "name": "C", "state": "active"} ]');
INSERT 0 1
select * from features where config @> '[{ "name": "B", "state": "inactive"}]';
id | plan | config
----+------+----------------------------------------------------------------------------------------------------------
1 | a | [{"name": "A", "state": "active"}, {"name": "B", "state": "inactive"}, {"name": "C", "state": "active"}]
(1 row)
With json data type in the table, you can use:
select * from
(select json_array_elements(config)::jsonb as item from features) as setofjsonb
where item = '{"name": "B", "state": "inactive"}'::jsonb;
item
------------------------------------
{"name": "B", "state": "inactive"}
(1 row)
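For completeness, another option that keeps the containment syntax without changing the column type is to cast the json value to jsonb on the fly (this should work, but it cannot use an index, so the whole table is scanned):
select * from features
where config::jsonb @> '[{ "name": "B", "state": "inactive"}]';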
There are a lot of questions/answers that sound exactly like what I'm looking for, but I couldn't find a single one that actually worked for me.
Sample data:
{
"_id": "5daeb61790183fd4d4361d6c",
"orderMessageId": "7563_21",
"orderId": "OS00154",
"orderEntryDate": "2019-06-17T00:00:00.000Z",
"typeOfOrder": "ORD",
"express": false,
"name1": "xxx",
"name2": "xxx",
"name3": " ",
"contact": "IN KOMMISSION",
"street": "xxx",
"city": "xxx",
"zipcode": "1235",
"country": "xx",
"customerId": "51515",
"lnatMarketCode": "Corporate - Regulatory",
"shipmentCarrier": "ABH",
"typeOfShipment": "83",
"typeOfShipmentDescr": "xxx",
"orderTextfield": " ",
"orderTextfield02": " ",
"text2": " ",
"LinVw": [
{
"orderLineMessageId": "OS05451",
"orderLineId": 5,
"articleId": "19200",
"articleDescription": "xxx",
"productId": "OS1902",
"productDescription": "xxx",
"baseQuantityUnit": "EA",
"quantityOrdered": 2,
"isbn": "978357468",
"issn": " ",
"orderSubmissionDate": "2019-06-06T00:00:00.000Z",
"customerPurchaseOrderId": "728188175",
"lnatCustomerIdAtSupplier": " ",
"supplierDeliveryNoteId": " ",
"fulfillmentContactName": "xxxx",
"customerVatRegistrationCode": "AT4151511900",
"listPriceInclVat": 21.4955,
"text": " ",
"orderResponses": [
{
"orderMessageId": "7718677_1",
"orderLineMessageId": "OS0000015451",
"orderId": "OS000154",
"orderLineId": 5,
"articleId": "1911200",
"quantity": 2,
"quantityNotShipped": 0,
"reasonForNotShippedCode": null,
"reasonForNotShipped": null,
"shipmentDate": "2019-10-04T00:00:00.000Z",
"deliveryNoteId": null,
"trackingIds": [
{
"trackingId": null,
"quantityRefToTracking": "2",
"weightRefToTracking": "0.0"
}
],
"type": "orderresponse",
"filepath": "xxxORDERRESP_20191004131209.xml",
"_id": "OS005451"
},
{
"orderMessageId": "753_21",
"orderLineMessageId": "OS015451",
"orderId": "O00154",
"orderLineId": 5,
"articleId": "100200",
"quantity": 0,
"quantityNotShipped": 2,
"reasonForNotShippedCode": "01",
"reasonForNotShipped": "Out of Stock",
"shipmentDate": null,
"deliveryNoteId": null,
"trackingIds": [
{
"trackingId": null,
"quantityRefToTracking": "0",
"weightRefToTracking": "0.0"
}
],
"type": "orderresponse",
"filepath": "xxxxORDERRESP_20190618161529.xml",
"_id": "OS0000015451"
}
]
}
],
"filepath": "xxxxxORDER_7539563_20190618_071522.xml"
}
I want to match all documents where all documents in the array LinVw match the following condition:
{'$or': [{'LinVw.orderResponses': {'$exists': False}}, {'LinVw.orderResponses.shipmentDate': {'$type': 10}}]}
To put it in words: I want to match documents if the array LinVw.orderResponses doesn't exist, or if it contains only documents that don't have a valid shipmentDate.
Currently I have this (using pymongo):
result = order_collection.aggregate([
{"$unwind": "$LinVw"},
{"$match": {'$or': [{'LinVw.orderResponses': {'$exists': False}}, {'LinVw.orderResponses.shipmentDate': {'$type': 10}}]}}
])
But of course this doesn't consider that all documents inside LinVw.orderResponses should match the condition.
Most examples out there don't deal with this kind of nesting, and I was unable to rewrite them accordingly.
I would appreciate any help.
You can achieve this by adding a $redact stage.
Inside the $redact stage, you write down the query matching the documents that you want to ignore (those having a valid shipmentDate). That's it.
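For example, a rough sketch of that idea in pymongo (the stage below only prunes the responses that do have a shipment date; you would still need a follow-up $match to keep only the order lines where nothing was shipped):
result = order_collection.aggregate([
    {"$unwind": "$LinVw"},
    {"$redact": {
        "$cond": {
            # a real date compares greater than null, so any embedded document
            # carrying a non-null shipmentDate gets pruned
            "if": {"$gt": [{"$ifNull": ["$shipmentDate", None]}, None]},
            "then": "$$PRUNE",
            "else": "$$DESCEND"
        }
    }}
])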
I think I did it:
result = order_collection.aggregate([
{"$unwind": "$LinVw"},
{"$match": {'$or': [{'LinVw.orderResponses': {'$exists': False}}, {'LinVw.orderResponses.shipmentDate': {'$type': 10}}]}},
{"$match": {'LinVw.orderResponses.shipmentDate': {"$not":{'$type': 9}}}},
{"$project":{"_id":0, "LinVw.orderLineMessageId":1, "LinVw.orderResponses":1}}
])
I have the following code and I am trying to get all the hits from Elasticsearch. If I write it without the query part, it only gives me 10 results when I call .getHits.
val resultFuture = client.execute {
search in "reports/reportOutput" query{ termQuery("mainReportID", reportId.toString)}
}.await
Another issue is that the query part does not actually work and I get nothing back. Here is the structure from my Elasticsearch index:
"hits": {
"total": 266,
"max_score": 1,
"hits":[
{
"_index": "reports",
"_type": "reportOutput",
"_id": "AUwjbAuKTetnUx12_a97",
"_score": 1,
"_source":
{
"displayName": "Classic BMW / MINI",
"model": "Cooper Clubman",
"dayInStock": "10",
"stockNumber": "Q323A",
"miles": "81093",
"interiorColorGeneric": "Black",
"year": "2009",
"trimLevel": "",
"mainReportID": "4d9e4fd3-7fdf-41c8-8c29-45c5acaf78b1",
"modelNumber": "",
"exteriorColorGeneric": "White",
"exteriorColor": "Pepper White",
"vin": "WMWML33509TX35944",
"make": "MINI",
"transmission": "A",
"exteriorColorCode": "850",
"interiorColor": "Gray/Carbon Black",
"interiorColorCode": "K8E1"
}
},
You can increase how many results are returned by setting a limit on the request, for example:
search in "index" limit 100
But the default limit of 10 is not set by elastic4s but by Elasticsearch itself, and you cannot change it to return all results by default.
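If you really do need every hit for a query, one rough workaround using only the calls already shown above is to run the search once to read the total hit count, then repeat it with that total as the limit (for very large result sets, Elasticsearch's scroll API is the intended tool):
val first = client.execute {
  search in "reports/reportOutput" query { termQuery("mainReportID", reportId.toString) }
}.await

// total number of matching documents reported by Elasticsearch
val total = first.getHits.getTotalHits.toInt

val all = client.execute {
  search in "reports/reportOutput" query { termQuery("mainReportID", reportId.toString) } limit total
}.await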