I have a lookup activity (Get_ID) that returns:
{
"count": 2,
"value": [
{
"TRGT_VAL": "10000"
},
{
"TRGT_VAL": "52000"
}
],
(...)
I want to use these 2 values from TRGT_VAL in a WHERE clause of a query in another activity. I'm using
@concat('SELECT * FROM table WHERE column in ',activity('Get_ID').output.value[0].TRGT_VAL)
But only the first value of 10000 is being taken into account. How to get the whole list?
I solved it by using a lot of replaces:
@concat('(',replace(replace(replace(replace(replace(replace(replace(string(activity('Get_ID').output.value),'{',''),' ',''),'"',''),'TRGT_VAL:',''),'[',''),'}',''),']',''),')')
Output
{
"name": "AptitudeCF",
"value": "(10000,52000)"
}
Instead of using a big expression with a lot of replace functions, you can use string interpolation syntax to frame your query. Below is a query you can consider.
SELECT * FROM table WHERE column in (@{activity('Get_ID').output.value[0].TRGT_VAL},@{activity('Get_ID').output.value[1].TRGT_VAL})
It seems Mongo does not allow insertion of keys with a dot (.) or dollar sign ($). However, when I imported a JSON file that contained a dot using the mongoimport tool, it worked fine. The driver is complaining about trying to insert that element.
This is what the document looks like in the database:
{
"_id": {
"$oid": "..."
},
"make": "saab",
"models": {
"9.7x": [
2007,
2008,
2009,
2010
]
}
}
Am I doing this all wrong and should not be using hash maps like that with external data (i.e. the models), or can I escape the dot somehow? Maybe I am thinking too much in JavaScript terms.
MongoDB doesn't support keys with a dot in them so you're going to have to preprocess your JSON file to remove/replace them before importing it or you'll be setting yourself up for all sorts of problems.
There isn't a standard workaround to this issue, the best approach is too dependent upon the specifics of the situation. But I'd avoid any key encoder/decoder approach if possible as you'll continue to pay the inconvenience of that in perpetuity, where a JSON restructure would presumably be a one-time cost.
As mentioned in other answers, MongoDB does not allow $ or . characters as map keys due to restrictions on field names. However, as mentioned in Dollar Sign Operator Escaping, this restriction does not prevent you from inserting documents with such keys; it just prevents you from updating or querying them.
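For illustration, a quick PyMongo sketch of that behaviour (the collection name and values are made up, and the exact behaviour varies with server/driver versions):

from pymongo import MongoClient

client = MongoClient()            # assumes a local mongod
coll = client.test.dotted_keys    # hypothetical collection

# Recent servers/drivers accept the insert (older drivers reject it client-side).
coll.insert_one({"make": "saab", "models": {"9.7x": [2007, 2008]}})

# But the dotted key in a query filter is treated as a nested path
# (models -> "9" -> "7x"), so this does not match the stored document:
print(coll.find_one({"models.9.7x": [2007, 2008]}))  # -> None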
The problem of simply replacing . with [dot] or U+FF0E (as mentioned elsewhere on this page) is, what happens when the user legitimately wants to store the key [dot] or U+FF0E?
An approach that Fantom's afMorphia driver takes is to use Unicode escape sequences similar to those of Java, but ensuring the escape character is escaped first. In essence, the following string replacements are made (*):
\ --> \\
$ --> \u0024
. --> \u002e
A reverse replacement is made when map keys are subsequently read from MongoDB.
Or in Fantom code:
Str encodeKey(Str key) {
return key.replace("\\", "\\\\").replace("\$", "\\u0024").replace(".", "\\u002e")
}
Str decodeKey(Str key) {
return key.replace("\\u002e", ".").replace("\\u0024", "\$").replace("\\\\", "\\")
}
The only time a user needs to be aware of such conversions is when constructing queries for such keys.
Given it is common to store dotted.property.names in databases for configuration purposes I believe this approach is preferable to simply banning all such map keys.
(*) afMorphia actually performs full and proper Unicode escaping, as mentioned in Unicode escape syntax in Java, but the replacement sequence described above works just as well.
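For comparison, here is a rough Python port of the same replacement scheme (just a sketch of the idea, not part of afMorphia):

def encode_key(key):
    # Escape the escape character first, then the characters MongoDB restricts.
    return (key.replace("\\", "\\\\")
               .replace("$", "\\u0024")
               .replace(".", "\\u002e"))

def decode_key(key):
    # Reverse the replacements in the opposite order.
    return (key.replace("\\u002e", ".")
               .replace("\\u0024", "$")
               .replace("\\\\", "\\"))

# Round trip: the original key survives encoding and decoding.
assert decode_key(encode_key("price.usd$")) == "price.usd$"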
The latest stable version (v3.6.1) of MongoDB now supports dots (.) in keys or field names.
Field names can contain dots (.) and dollar ($) characters now
The Mongo docs suggest replacing illegal characters such as $ and . with their unicode equivalents.
In these situations, keys will need to substitute the reserved $ and . characters. Any character is sufficient, but consider using the Unicode full width equivalents: U+FF04 (i.e. "＄") and U+FF0E (i.e. "．").
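A minimal Python sketch of that substitution (the helper names here are my own):

def encode_field_name(name):
    # Swap the reserved characters for their Unicode full-width equivalents.
    return name.replace("$", "\uff04").replace(".", "\uff0e")

def decode_field_name(name):
    return name.replace("\uff04", "$").replace("\uff0e", ".")

print(encode_field_name("9.7x"))                     # '9．7x'
print(decode_field_name(encode_field_name("9.7x")))  # '9.7x'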
A solution I just implemented that I'm really happy with involves splitting the key name and value into two separate fields. This way, I can keep the characters exactly the same, and not worry about any of those parsing nightmares. The doc would look like:
{
...
keyName: "domain.com",
keyValue: "unregistered",
...
}
You can still query this easily enough, just by doing a find on the keyName and keyValue fields.
So instead of:
db.collection.find({"domain.com":"unregistered"})
which wouldn't actually work as expected, you would run:
db.collection.find({keyName:"domain.com", keyValue:"unregistered"})
and it will return the expected document.
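For what it's worth, the same idea through PyMongo might look like this (the database and collection names are just examples):

from pymongo import MongoClient

coll = MongoClient().test.domains  # hypothetical collection

# Store the problematic key as data rather than as a field name.
coll.insert_one({"keyName": "domain.com", "keyValue": "unregistered"})

# Query on both fields to get the document back.
print(coll.find_one({"keyName": "domain.com", "keyValue": "unregistered"}))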
You can try using a hash of the key in place of the key itself, and store the original key inside the JSON value.
var crypto = require("crypto");
function md5(value) {
return crypto.createHash('md5').update( String(value) ).digest('hex');
}
var data = {
"_id": {
"$oid": "..."
},
"make": "saab",
"models": {}
}
var version = "9.7x";
data.models[ md5(version) ] = {
"version": version,
"years" : [
2007,
2008,
2009,
2010
]
}
You would then access the models using the hash later.
var version = "9.7x";
collection.findOne( { _id : ...}, function(e, data) {
    var models = data.models[ md5(version) ];
});
It is supported now.
MongoDB 3.6 onwards supports both dots and dollars in field names.
See the JIRA ticket below: https://jira.mongodb.org/browse/JAVA-2810
Upgrading your MongoDB to 3.6+ sounds like the best way to go.
You'll need to escape the keys. Since it seems most people don't know how to properly escape strings, here are the steps:
Choose an escape character (best to choose a character that's rarely used), e.g. '~'.
To escape, first replace all instances of the escape character with the escape character followed by some sequence (e.g. '~' -> '~s'), then replace whatever character or sequence you need to escape with the escape character followed by a different sequence (e.g. '.' -> '~p').
To unescape, first turn every instance of your second escape sequence back into the original character (e.g. '~p' -> '.'), then turn your escape-character sequence back into a single escape character (e.g. '~s' -> '~').
Also, remember that mongo doesn't allow keys to start with '$' either, so you have to do something similar there.
Here's some code that does it:
// returns an escaped mongo key
exports.escape = function(key) {
return key.replace(/~/g, '~s')
.replace(/\./g, '~p')
.replace(/^\$/g, '~d')
}
// returns an unescaped mongo key
exports.unescape = function(escapedKey) {
return escapedKey.replace(/^~d/g, '$')
.replace(/~p/g, '.')
.replace(/~s/g, '~')
}
From the MongoDB docs: "the '.' character must not appear anywhere in the key name". It looks like you'll have to come up with an encoding scheme or do without.
A late answer, but if you use Spring and Mongo, Spring can manage the conversion for you with MappingMongoConverter. It's the solution by JohnnyHK but handled by Spring.
@Autowired
private MappingMongoConverter converter;

@PostConstruct
public void configureMongo() {
    converter.setMapKeyDotReplacement("xxx");
}
If your stored JSON is:
{ "axxxb" : "value" }
through Spring (MongoClient) it will be read as:
{ "a.b" : "value" }
As another user mentioned, encoding/decoding this can become problematic in the future, so it's probably just easier to replace all keys that have a dot. Here's a recursive function I made to replace keys with '.' occurrences:
import json
from pprint import pprint

def mongo_jsonify(dictionary):
    new_dict = {}
    if type(dictionary) is dict:
        for k, v in dictionary.items():
            new_k = k.replace('.', '-')
            if type(v) is dict:
                new_dict[new_k] = mongo_jsonify(v)
            elif type(v) is list:
                new_dict[new_k] = [mongo_jsonify(i) for i in v]
            else:
                new_dict[new_k] = dictionary[k]
        return new_dict
    else:
        return dictionary

if __name__ == '__main__':
    with open('path_to_json', "r") as input_file:
        d = json.load(input_file)
    d = mongo_jsonify(d)
    pprint(d)
You can modify this code to replace '$' too, as that is another character that mongo won't allow in a key.
I use the following escaping in JavaScript for each object key:
key.replace(/\\/g, '\\\\').replace(/^\$/, '\\$').replace(/\./g, '\\_')
What I like about it is that it replaces only a $ at the beginning, and that it does not use Unicode characters, which can be tricky to use in the console. _ is, to me, much more readable than a Unicode character. It also does not replace one set of special characters ($, .) with another (Unicode); instead it escapes properly with a traditional \.
Not perfect, but it will work in most situations: replace the prohibited characters with something else. Since they appear in keys, these new characters should be fairly rare.
/** This will replace \ with ⍀, a leading $ with ₴ and dots with ⋅ to make the object compatible for MongoDB insert.
Caveats:
1. If you have any of ⍀, ₴ or ⋅ in your original documents, they will be converted back to \, $ or . upon decoding.
2. Recursive structures are always an issue. A cheap way to prevent a stack overflow is by limiting the number of levels. The default max level is 10.
*/
encodeMongoObj = function(o, level = 10) {
var build = {}, key, newKey, value
//if (typeof level === "undefined") level = 20 // default level if not provided
for (key in o) {
value = o[key]
if (typeof value === "object") value = (level > 0) ? encodeMongoObj(value, level - 1) : null // If this is an object, recurse if we can
newKey = key.replace(/\\/g, '⍀').replace(/^\$/, '₴').replace(/\./g, '⋅') // replace special chars prohibited in mongo keys
build[newKey] = value
}
return build
}
/** This will decode an object encoded with the above function. We assume the structure is not recursive since it should come from Mongodb */
decodeMongoObj = function(o) {
var build = {}, key, newKey, value
for (key in o) {
value = o[key]
if (typeof value === "object") value = decodeMongoObj(value) // If this is an object, recurse
newKey = key.replace(/⍀/g, '\\').replace(/^₴/, '$').replace(/⋅/g, '.') // replace special chars prohibited in mongo keys
build[newKey] = value
}
return build
}
Here is a test:
var nastyObj = {
"sub.obj" : {"$dollar\\backslash": "$\\.end$"}
}
nastyObj["$you.must.be.kidding"] = nastyObj // make it recursive
var encoded = encodeMongoObj(nastyObj, 1)
console.log(encoded)
console.log( decodeMongoObj( encoded) )
and the results - note that the values are not modified:
{
sub⋅obj: {
₴dollar⍀backslash: "$\\.end$"
},
₴you⋅must⋅be⋅kidding: {
sub⋅obj: null,
₴you⋅must⋅be⋅kidding: null
}
}
[12:02:47.691] {
"sub.obj": {
$dollar\\backslash: "$\\.end$"
},
"$you.must.be.kidding": {
"sub.obj": {},
"$you.must.be.kidding": {}
}
}
There is an ugly way to query it. It is not recommended for use in an application, only for debugging purposes (and it works only on embedded objects):
db.getCollection('mycollection').aggregate([
{$match: {mymapfield: {$type: "object" }}}, //filter objects with right field type
{$project: {mymapfield: { $objectToArray: "$mymapfield" }}}, //"unwind" map to array of {k: key, v: value} objects
{$match: {mymapfield: {k: "my.key.with.dot", v: "myvalue"}}} //query
])
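The same pipeline through PyMongo would look roughly like this (the connection and collection names are assumed):

from pymongo import MongoClient

coll = MongoClient().test.mycollection  # assumed local connection

cursor = coll.aggregate([
    # filter documents with the right field type
    {"$match": {"mymapfield": {"$type": "object"}}},
    # "unwind" the map into an array of {k: key, v: value} documents
    {"$project": {"mymapfield": {"$objectToArray": "$mymapfield"}}},
    # match the dotted key as data, not as a field path
    {"$match": {"mymapfield": {"k": "my.key.with.dot", "v": "myvalue"}}},
])
for doc in cursor:
    print(doc)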
For PHP I substitute the HTML entity for the period. That's &#46;.
It stores in MongoDB like this:
"validations" : {
"4e25adbb1b0a55400e030000" : {
"associate" : "true"
},
"4e25adb11b0a55400e010000" : {
"associate" : "true"
}
}
and the PHP code...
$entry = array('associate' => $associate);
$data = array( '$set' => array( 'validations.' . str_replace(".", "&#46;", $validation) => $entry ));
$newstatus = $collection->update($key, $data, $options);
Lodash pairs will allow you to change
{ 'connect.sid': 's:hyeIzKRdD9aucCc5NceYw5zhHN5vpFOp.0OUaA6' }
into
[ [ 'connect.sid',
's:hyeIzKRdD9aucCc5NceYw5zhHN5vpFOp.0OUaA6' ] ]
using
var newObj = _.pairs(oldObj);
You can store the object as it is and convert it to a flattened ("pretty") form afterwards.
I wrote this example in LiveScript. You can use the livescript.net website to evaluate it.
test =
  field:
    field1: 1
    field2: 2
    field3: 5
    nested:
      more: 1
      moresdafasdf: 23423
  field3: 3

get-plain = (json, parent)->
  | typeof! json is \Object => json |> obj-to-pairs |> map -> get-plain it.1, [parent,it.0].filter(-> it?).join(\.)
  | _ => key: parent, value: json

test |> get-plain |> flatten |> map (-> [it.key, it.value]) |> pairs-to-obj
It will produce
{"field.field1":1,
"field.field2":2,
"field.field3":5,
"field.nested.more":1,
"field.nested.moresdafasdf":23423,
"field3":3}
My tip: you can use JSON.stringify to save an object or array whose key names contain dots, then parse the string back with JSON.parse when you get the data from the database.
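A small Python sketch of that tip, assuming PyMongo and a field name (models_json) of my own choosing:

import json
from pymongo import MongoClient

coll = MongoClient().test.cars  # hypothetical collection

models = {"9.7x": [2007, 2008, 2009, 2010]}

# Store the dotted-key object as an opaque JSON string...
coll.insert_one({"make": "saab", "models_json": json.dumps(models)})

# ...and parse it back when reading.
doc = coll.find_one({"make": "saab"})
models = json.loads(doc["models_json"])
print(models["9.7x"])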
Another workaround:
Restructure your schema like:
key : {
    "keyName": "a.b",
    "value": [Array]
}
The latest MongoDB does support keys with a dot, but the Java MongoDB driver does not. So to make it work in Java, I pulled the code from the GitHub repo of the Java Mongo driver, made the corresponding changes in their isValid key function, created a new jar out of it, and am using that now.
Replace the dot (.) or dollar ($) with other characters that will never be used in the real document, and restore the dot (.) or dollar ($) when retrieving the document. This strategy won't influence the data that users read.
You can pick any character for the substitution.
The strange thing is that, using mongojs, I can create a document with a dot if I set the _id myself; however, I cannot create a document when the _id is generated:
Does work:
db.testcollection.save({"_id": "testdocument", "dot.ted.": "value"}, (err, res) => {
console.log(err, res);
});
Does not work:
db.testcollection.save({"dot.ted": "value"}, (err, res) => {
console.log(err, res);
});
I first thought that updating a document with a dotted key also worked, but it's identifying the dot as a subkey!
Seeing how mongojs handles the dot (subkey), I'm going to make sure my keys don't contain a dot.
As @JohnnyHK mentioned, do remove punctuation such as '.' from your keys, because it will create much larger problems once your data accumulates into a larger dataset. This causes problems especially when you call aggregation operators like $merge, which requires accessing and comparing keys and will throw an error. I learnt this the hard way; please don't repeat it if you're starting out.
In our case, the properties with the period are never queried by users directly; however, they can be created by users.
So we serialize our entire model first and string-replace all instances of the specific fields. Our period fields can show up in many locations, and it is not predictable what the structure of the data is.
var dataJson = serialize(dataObj);  // serialize() is our own helper that produces a JSON string
foreach (var pf in periodFields)
{
    var encodedPF = pf.Replace(".", "ENCODE_DOT");
    dataJson = dataJson.Replace(pf, encodedPF);  // strings are immutable, so reassign the result
}
Then later, after our data is flattened, we replace instances of the encoded field so we can write the decoded version to our files.
Nobody will ever need a field named ENCODE_DOT, so it will not be an issue in our case.
The result is the following:
color.one will be stored in the database as colorENCODE_DOTone.
When we write our files we replace ENCODE_DOT with '.'.
/home/user/anaconda3/lib/python3.6/site-packages/pymongo/collection.py
I found it in the error messages. If you use Anaconda (find the corresponding file if you don't), simply change the value of check_keys = True to False in the file stated above. That'll work!