I have the following String input:
val json= """[{"first": 1, "name": "abc", "timestamp": "2018/06/28"},
{"first": 2, "name": "mtm", "timestamp": "2018/06/28"}]"""
I need to remove the "timestamp" key and its value from each object.
Expected output:
val result = """[{"first": 1, "name": "abc"},{"first": 2, "name": "mtm"}]"""
Please kindly help.
A simple regex will do it:
json.replaceAll(""",\s*"timestamp"[^,}]*""", "")
Or with a JSON parser (though it's quite hard to be specific without knowing which JSON parser you're using), perhaps:
1. parse it, with one of the libraries from What JSON library to use in Scala?
2. remove the "timestamp" entries with e.g. list.map(m => m - "timestamp") (the exact call depends on which library you're using)
3. serialize the JSON back to a String
See the sketch below.
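For instance, a minimal sketch assuming circe as the parser (the question doesn't say which library is in use, so the exact calls will differ with json4s, play-json, etc.):

import io.circe.parser.parse

val json = """[{"first": 1, "name": "abc", "timestamp": "2018/06/28"},
{"first": 2, "name": "mtm", "timestamp": "2018/06/28"}]"""

// Parse, drop "timestamp" from every object in the array, and render back to a String.
val result: String = parse(json)
  .map(_.mapArray(_.map(_.mapObject(_.remove("timestamp")))))
  .map(_.noSpaces)
  .getOrElse(sys.error("not valid JSON"))
// result: [{"first":1,"name":"abc"},{"first":2,"name":"mtm"}]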
Related
I have the following message structure:
{
  "payload": {
    "a": 1,
    "b": {"X": 1, "Y": 2}
  },
  "timestamp": 1659692671
}
I want to use SMTs to get the following structure:
{
  "a": 1,
  "b": {"X": 1, "Y": 2},
  "timestamp": 1659692671
}
When I use ExtractField$Value for payload, I cannot preserve timestamp.
I cannot use Flatten because "b"'s structure should not be flattened.
Any ideas?
Thanks
Unfortunately, moving a field into another Struct isn't possible with the built-in transforms. You'd have to write your own, or use a stream-processing library before the data reaches the connector.
Note that Kafka records themselves have a timestamp, so do you really need the timestamp as a field in the value? One option is to extract the timestamp, then flatten, then add it back. However, this will override the record timestamp that the producer had set with that field's value.
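For the stream-processing route, a minimal sketch with the Kafka Streams Scala DSL and circe (topic names and bootstrap servers are placeholders, not anything from the question): it lifts the fields of "payload" to the top level and keeps "timestamp" next to them before the connector reads the topic.

import java.util.Properties
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.serialization.Serdes._
import io.circe.parser.parse

val props = new Properties()
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "restructure-payload")
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")

val builder = new StreamsBuilder
builder
  .stream[String, String]("source-topic")
  .mapValues { value =>
    // Move the payload fields to the top level and append the original timestamp field.
    val restructured = for {
      json    <- parse(value).toOption
      payload <- json.hcursor.downField("payload").focus
      ts      <- json.hcursor.downField("timestamp").focus
    } yield payload.mapObject(_.add("timestamp", ts)).noSpaces
    restructured.getOrElse(value) // leave records that don't match the expected shape untouched
  }
  .to("connector-input-topic")

new KafkaStreams(builder.build(), props).start()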
I have used mongoimport to import data into MongoDB from CSV files. I am trying to retrieve data from a MongoDB Realm service. The returned data for the entry is as follows:
{
  "_id": "6124edd04543fb222e",
  "Field1": "some string",
  "Field2": {
    "$numberDouble": "145.81"
  },
  "Field3": {
    "$numberInt": "0"
  },
  "Field4": {
    "$numberInt": "15"
  },
  "Field5": {
    "$numberInt": "0"
  }
}
How do I convert this into normal JSON, removing the $numberInt and $numberDouble wrappers, like:
{
  "_id": "6124edd04543fb222e",
  "Field1": "some string",
  "Field2": 145.81,
  "Field3": 0,
  "Field4": 15,
  "Field5": 0
}
The fields are also different for different documents, so I cannot use Mongoose directly. Are there any solutions to this?
It would also help to know why the numbers are being stored as $numberInt: "".
Edit:
For anyone with the same problem, this is how I solved it.
The array of documents is in EJSON format instead of plain JSON, as the upvoted answer says. To convert it back into normal JSON, I used JSON.stringify to first turn each document from the map function into a string, and then parsed that string with EJSON.parse using the {strict: false} option (this option is important).
{restaurants.map((restaurant) => {
  restaurant = EJSON.parse(JSON.stringify(restaurant), { strict: false });
  // ...render/use the converted document here
})}
EJSON.parse documentation here. The module to be installed and imported is mongodb-extjson.
The format with $numberInt etc. is called (MongoDB) Extended JSON.
You are getting it on the output side either because this is how you inserted your data (meaning your inserted data was incorrect and you need to fix the ingestion side) or because you requested extended JSON serialization.
If the data in the database is correct, and you want non-extended JSON output, you generally need to write your own serializers to JSON since there are multiple possibilities of how to format the data. MongoDB's JSON output format is the Extended JSON you're seeing in your first quote.
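As one illustration of such re-serialization (a JVM-side sketch using the org.mongodb:bson library, which may not be your stack): parsing the canonical Extended JSON and writing it back in relaxed mode turns the $numberInt/$numberDouble wrappers into plain JSON numbers.

import org.bson.Document
import org.bson.json.{JsonMode, JsonWriterSettings}

val ejson =
  """{"_id": "6124edd04543fb222e", "Field2": {"$numberDouble": "145.81"}, "Field3": {"$numberInt": "0"}}"""

// Parse canonical Extended JSON, then re-serialize in relaxed mode,
// which renders int32/int64/double values as plain JSON numbers.
val relaxed: String = Document
  .parse(ejson)
  .toJson(JsonWriterSettings.builder().outputMode(JsonMode.RELAXED).build())
// relaxed: {"_id": "6124edd04543fb222e", "Field2": 145.81, "Field3": 0}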
I have a JSON response with a list of strings. I want to check whether it contains the same elements as some other list (the order of the elements isn't important). How do I do that?
I tried this:
jsonPath("$.country_codes[*]").findAll.sorted.is(List("DE", "CH", "FR", "IE", "IT", "NL", "RS", "UK", "IN").sorted)
but I'm getting the error "Cannot resolve symbol sorted". If I don't use 'sorted', it works, but I can't rely on getting the same order of elements from the server each time.
Use transform to turn your Seq[String] into a Set[String] or sort it.
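For example, sorting inside transform so the comparison no longer depends on the order the server returns (expected values taken from the check in the question):

// Sort the extracted Seq[String] before comparing it to the sorted expected list.
jsonPath("$.country_codes[*]").findAll
  .transform(_.sorted)
  .is(List("DE", "CH", "FR", "IE", "IT", "NL", "RS", "UK", "IN").sorted)

Alternatively, .transform(_.toSet) compared against a Set of the expected values ignores both order and duplicates.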
I have a source which is a JSON array, and the sink is SQL Server. When I use column mapping and look at the generated code, I can see the mapping is done against the first element of the array, so each run produces a single record despite the fact that the source has multiple records. How do I use the copy activity to import ALL the rows?
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"schemaMapping": {
"['#odata.context']": "BuyerFinancing",
"['#odata.nextLink']": "PropertyCondition",
"value[0].AssociationFee": "AssociationFee",
"value[0].AssociationFeeFrequency": "AssociationFeeFrequency",
"value[0].AssociationName": "AssociationName",
Use * as the source field to map the whole element as JSON. For example, with this JSON:
{
  "results": [
    {"field1": "valuea", "field2": "valueb"},
    {"field1": "valuex", "field2": "valuey"}
  ]
}
and a database table with a column result to store the JSON, a mapping with results as the collection reference and * as the sub-element will create two records:
{"field1": "valuea", "field2": "valueb"}
{"field1": "valuex", "field2": "valuey"}
in the result field.
Copy Data Field Mapping
ADF supports cross apply for JSON arrays. Please check the example in this doc: https://learn.microsoft.com/en-us/azure/data-factory/supported-file-formats-and-compression-codecs#jsonformat-example
For schema mapping: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping#schema-mapping
I use RestHeart to get data from MongoDB. When I request a document from a collection that contains a field of type int64 (in this example "INT64_NUMBER"), the response contains:
"FLAG_A": "Y",
"FLAG_B": "N",
"INT64_NUMBER" {
"$numberLong": "34"
},
"NUM_D": 123
Is there any option to obtain the same information without the type "$numberLong"? I mean, something like the following:
"FLAG_A": "Y",
"FLAG_B": "N",
"INT64_NUMBER": "34",
"NUM_D": 123
I thought that I should use some kind of aggregation with a $project stage, but I can't find a solution by myself, nor an example on the web. Can anybody guide me to a proper solution? Thanks in advance.
No, you can't. That's the strict JSON representation of BSON: https://docs.mongodb.com/manual/reference/mongodb-extended-json/