I have a structure like this:
select to_json('[{
"11111":
{
"Number":{"11111"},
"createdTime":"2018-06-25 10:30:11.047 +0530",
"errorMessage":"invalid"
}
}]')
When I try to convert it to a JSON structure, I get the following error:
ERROR: could not determine polymorphic type because input has type unknown
I need to get a valid JSON format.
Thanks.
to_json is used to convert values that are not JSON values (e.g. a record) into a proper JSON value.
You apparently want to use a string value as JSON.
In order to do that, you need to supply valid JSON. The part "Number":{"11111"} is, however, invalid JSON; you need to remove the curly braces.
select '[{
"11111":
{
"Number": "11111",
"createdTime":"2018-06-25 10:30:11.047 +0530",
"errorMessage":"invalid"
}
}]'::json
But why are you using a JSON array if you only have a single value? From what you have shown, a single JSON value would make more sense:
select '{
"11111":
{
"Number": "11111",
"createdTime":"2018-06-25 10:30:11.047 +0530",
"errorMessage":"invalid"
}
}'::json
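As an aside, to_json comes into play when you build JSON from SQL values, for example from a row. A minimal sketch (the subquery and its column names merely mirror the data above):
select to_json(t)
from (
  select '11111' as "Number",
         '2018-06-25 10:30:11.047 +0530' as "createdTime",
         'invalid' as "errorMessage"
) t;
-- returns {"Number":"11111","createdTime":"2018-06-25 10:30:11.047 +0530","errorMessage":"invalid"}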
Well, I have been using MongoDB for a while, but I don't know how to handle this situation.
The scenario is: I have data inserted in MongoDB. I have exported the data in JSON format, and it is something like:
[
{
"_id": { "$oid": "60ff324f41c4d5b96054390d" },
"field": {
"due_date": { "$date": "2021-11-03T00:09:18.271Z" }
}
}
]
You can see that:
_id is { "$oid": "60ff324f41c4d5b96054390d" } and
due_date is { "$date": "2021-11-03T00:09:18.271Z" }.
So, the problem comes when trying to insert into another DB using Mongoose. I want to test some functionality in isolation, so I want these values in another environment; to do that, I have used insertMany() with the previously exported JSON.
(Maybe there is a more elegant way to import a JSON file, but this is only for testing purposes.)
await model.insertMany(JSON.parse(fs.readFileSync('./data.json').toString()))
But the problem is here: it throws an error because my schema says that _id is an ObjectId and due_date is a Date object, but they are actually read as plain objects: {$oid: ""} and {$date: ""}.
ValidationError: model validation failed: _id: Cast to ObjectId failed for value "{ '$oid': '60ff324f41c4d5b96054390d' }" (type Object) at path "_id", field.due_date: Cast to date failed for value "{ '$date': '2021-11-03T00:09:18.271Z' }" (type Object) at path "field.due_date"
So the question is: Is there any way to cast the $oid and $date objects using mongoose while inserting?
Also, I have managed to insert the values using a very ugly script that iterates over each value, checks whether the object is $oid or $date, and modifies the values... but it works.
So the question is not "Is there a workaround to do this?" but "Is there a way to do it directly with Mongoose functions such as insertMany, with the values cast automatically?"
Thanks.
The {"$oid": ...} and {"$date":...} constructs are MongoDB extended JSON notation, since JSON does not have any type for Date or ObjectId.
In order to insert those values, you will need to use a JSON parser that knows about this extended format.
One possibility is the json_util module that is included with Python's bson package.
Or you can use mongoimport to read the JSON file and do the inserts for you.
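In a Node/Mongoose setup like yours, the equivalent of json_util is the EJSON helper shipped with the bson npm package. A minimal sketch under that assumption, reusing the model and data.json from your snippet:
const fs = require('fs');
const { EJSON } = require('bson');

// EJSON.parse understands MongoDB extended JSON, so {"$oid": ...} comes back
// as an ObjectId and {"$date": ...} as a Date before Mongoose sees them.
const docs = EJSON.parse(fs.readFileSync('./data.json').toString());
await model.insertMany(docs);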
I am trying to use a Lookup activity to return a row count. I am able to do this, but once I do, I would like to run an If Condition against it: if the count returns more than 20 million rows, I want to execute an additional pipeline for further table manipulation. The issue, however, is that I cannot compare the returned value to a static integer. Below is the current dynamic expression I have for this If Condition:
@greater(int(activity('COUNT_RL_WK_GRBY_LOOKUP').output),20000000)
and when fired, the following error is returned:
{
"errorCode": "InvalidTemplate",
"message": "The function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type",
"failureType": "UserError",
"target": "If Condition1",
"details": ""
}
Is it possible to convert this returned value to an integer in order to make the comparison? If not, is there a possible workaround to achieve my desired result?
Looks like the issue is with your dynamic expression. Please correct your dynamic expression as shown below and retry.
If firstRowOnly is set to true: @greater(int(activity('COUNT_RL_WK_GRBY_LOOKUP').output.firstRow.propertyname),20000000)
If firstRowOnly is set to false: @greater(int(activity('COUNT_RL_WK_GRBY_LOOKUP').output.value[zero based index].propertyname),20000000)
The lookup result is returned in the output section of the activity run result.
When firstRowOnly is set to true (the default), the output format is as shown in the following code. The lookup result is under a fixed firstRow key. To use the result in a subsequent activity, use the pattern @{activity('MyLookupActivity').output.firstRow.TableName}.
Sample Output JSON code is as follows:
{
"firstRow":
{
"Id": "1",
"TableName" : "Table1"
}
}
When firstRowOnly is set to false, the output format is as shown in the following code. A count field indicates how many records are returned, and the detailed values are displayed under a fixed value array. In such a case, the Lookup activity is typically followed by a ForEach activity. You pass the value array to the ForEach activity's items field by using the pattern @activity('MyLookupActivity').output.value. To access elements in the value array, use the following syntax: @{activity('lookupActivity').output.value[zero based index].propertyname}. An example is @{activity('lookupActivity').output.value[0].tablename}.
Sample Output JSON code is as follows:
{
"count": "2",
"value": [
{
"Id": "1",
"TableName" : "Table1"
},
{
"Id": "2",
"TableName" : "Table2"
}
]
}
Hope this helps.
Do this: when you run the debugger, look at the output from your Lookup. It will give a JSON string including the alias for the result of your query. If firstRowOnly is not set, you get a table; with firstRowOnly you'll see output, then firstRow, and then your alias. That is what you specify.
For example, if you alias your count as Row_Cnt, then:
@greater(activity('COUNT_RL_WK_GRBY_LOOKUP').output.firstRow.Row_Cnt,20000000)
You don't need the int function. You were trying to use it (just like I was!) because the comparison complained about the datatype. That happened because you were returning a bunch of JSON text as the output instead of the value you were after. It makes sense once you realize how it works, but it is not intuitively obvious: the expression does come back with data, but it is string content from the JSON, not the value you are after. Functions like equals are perfectly happy with that; it's not until you try something like greater, which expects a numeric value, that it chokes.
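Putting it together, a minimal sketch (the source table name below is hypothetical): alias the count in the Lookup's query, then read that alias from firstRow in the If Condition, with no int() needed:
SELECT COUNT(*) AS Row_Cnt FROM source_table

@greater(activity('COUNT_RL_WK_GRBY_LOOKUP').output.firstRow.Row_Cnt, 20000000)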
We configured Grafana to use a table input data source, and it works very well with the fields already defined (like time, status, values, etc.).
But now a new field has been added to the table that is a serialized JSON object, returned from a process we cannot modify.
We need to use a value (a timestamp) that is a property of this serialized object stored in that string field.
One serialized field value example is this:
{"timestamp":"2020-02-23T18:25:44.012Z","status":"fail","errors":[{"timestamp":"2020-02-23T18:25:43.511Z","message":"invalid key: key is shorter than minimum 16 bytes"},{"timestamp":"2020-02-23T18:25:43.851Z","message":"unauthorized: authorization not possible"}]}
The pretty print is:
{
"timestamp": "2020-02-23T18:25:44.012Z",
"status": "fail",
"errors": [
{
"timestamp": "2020-02-23T18:25:43.511Z",
"message": "invalid key: key is shorter than minimum 16 bytes"
},
{
"timestamp": "2020-02-23T18:25:43.851Z",
"message": "unauthorized: authorization not possible"
}
]
}
Is there any way to use a value like field.timestamp or field.errors[0].timestamp?
Is there a plugin that allows it, or is it not possible at all?
Use PostgreSQL JSON column select in your Grafana query, e.g.:
SELECT
field->'timestamp',
...
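A fuller sketch of the JSON operators involved (the table and alias names are hypothetical; if the column is stored as text rather than json/jsonb, cast it first, e.g. field::json):
SELECT
  field->>'timestamp' AS status_timestamp,
  field->'errors'->0->>'timestamp' AS first_error_timestamp
FROM my_metrics_table;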
I am pretty new to programming, so I count on your kindness.
Is it possible to send types other than strings in the FCM data message payload, like a bool, a List, or even a DocumentSnapshot?
If you look at the reference documentation for message.data, it is defined as:
data
map (key: string, value: string)
Input only. Arbitrary key/value payload. The key should not be a reserved word ("from", "message_type", or any word starting with "google" or "gcm").
An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
So it is a flat list of key/value pairs, where both key and value are strings. Of course, you can store any data you want in that value, as long as you encode/decode it as a string.
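A minimal sketch of that encode/decode round trip, assuming the firebase-admin Node SDK on the sending side and a hypothetical registration token:
const admin = require('firebase-admin');
admin.initializeApp(); // uses default credentials

// hypothetical device token for illustration
const registrationToken = 'DEVICE_REGISTRATION_TOKEN';

await admin.messaging().send({
  token: registrationToken,
  data: {
    // every value must be a string, so encode non-strings as JSON text
    isUrgent: JSON.stringify(true),   // -> "true"
    ids: JSON.stringify([1, 2, 3]),   // -> "[1,2,3]"
  },
});

// On the receiving client, decode the strings back, e.g.:
// const ids = JSON.parse(message.data.ids);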
The following data is in the table dashboard_data; the column name is mr_data.
{"priority_id": "123", "urgent_problem_id": "111", "important_problem_id": "222"}
{"priority_id": "456", "urgent_problem_id": "", "important_problem_id": "333"}
{"priority_id": "789", "urgent_problem_id": "444", "important_problem_id": ""}
Query:
UPDATE
dashboard_data
SET
mr_data = replace(dashboard_data.mr_data,'urgent_problem_id','urgent_problem_ids')
WHERE
mr_data->>'urgent_problem_id' IS NOT NULL;
Expected result:
{"priority_id": "123", "urgent_problem_ids": {"111"}, "important_problem_ids": {"222"}}
{"priority_id": "456", "urgent_problem_ids": {""}, "important_problem_ids": {"333"}}
{"priority_id": "789", "urgent_problem_ids": {"444"}, "important_problem_ids": {""}}
Is there any way that, during the replace, we get the {} representation of the data as shown in the expected result?
Assuming Postgres 9.5 or newer.
You can use jsonb_set to add a proper JSON array under a new key, then remove the old key from the JSON (which is the only way to rename a key):
update dashboard_data
set mr_data = jsonb_set(mr_data, '{urgent_problem_ids}',
jsonb_build_array(mr_data -> 'urgent_problem_id'), true)
- 'urgent_problem_id'
where mr_data ? 'urgent_problem_id';
jsonb_build_array(mr_data -> 'urgent_problem_id') creates a proper JSON array with the (single) value from urgent_problem_id. That value is then stored under the new key urgent_problem_ids, and finally the old key urgent_problem_id is removed using the - operator.
Online example: http://rextester.com/POG52716
If your column is not jsonb (which it should be), then you need to cast the column inside jsonb_set() and cast the result back to json.
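A minimal sketch of that cast variant, assuming mr_data is of type json rather than jsonb:
update dashboard_data
set mr_data = (jsonb_set(mr_data::jsonb, '{urgent_problem_ids}',
                         jsonb_build_array(mr_data::jsonb -> 'urgent_problem_id'), true)
               - 'urgent_problem_id')::json
where mr_data::jsonb ? 'urgent_problem_id';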