I have the following schema:
const mySchema = new mongoose.Schema({
  x: String,
  y: String
})
When a user on the front end sends a request whose body is:
req.body = {
  'x': '',
  'y': ''
}
this results in the field being created in MongoDB, but with an empty string as its value.
I need a way to prevent this behavior by somehow setting the empty strings to undefined.
Is there such an option in Mongoose, or do I have to write my own middleware for that?
You could use the set method for Mongoose Schemas:
const mySchema = new mongoose.Schema(
  {
    myAttribute: {
      type: String,
      set: (attribute: string) => attribute === '' ? undefined : attribute,
    },
  },
  { strict: 'throw' },
);
This will unset the field if the string equals ''.
Use this to also treat whitespace-only strings as empty:
set: (a: string) => a?.trim() === '' ? undefined : a
You don't need Mongoose or a middleware to handle this. You can just write a few quick lines to check for empty values and exclude them from the MongoDB write operation.
Ex:
const newEntry = Object.entries(req.body).reduce((obj, [key, value]) => {
  if (value) obj[key] = value
  return obj
}, {})
In this example, I convert req.body into an array of entries using Object.entries and iterate over it with Array.reduce, adding each key:value pair to a new object only if there is a value to add. Since an empty string is falsy, a simple if check on the value is enough. The return value of reduce is assigned to the variable newEntry, which I would then use to create the MongoDB document.
This could be extracted into a helper method and reused in any of your routes that need to remove empty values from an object; a sketch of such a helper follows the links below.
Docs on Array.reduce
Docs on Object.entries
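For reference, here is a minimal sketch of such a helper (the function name omitEmpty and the route/model names are hypothetical):
// Returns a shallow copy of obj without keys whose values are falsy
// (empty strings, but also null, undefined, 0, and false).
const omitEmpty = (obj) =>
  Object.entries(obj).reduce((acc, [key, value]) => {
    if (value) acc[key] = value
    return acc
  }, {})

// Hypothetical usage in an Express route:
app.post('/things', async (req, res) => {
  const doc = await MyModel.create(omitEmpty(req.body))
  res.json(doc)
})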
In my case I have the schema mentioned below. In this schema the values key holds an object with dynamic key:value pairs. When I insert/update using this schema, the order of my key:value pairs is changed to ascending. How can I prevent this?
const mySchema = new Schema({
  formName: String,
  values: Object
})
For example, the data I want to insert/update is shown below, and I want to store it in the same order:
{
  formName: "my Form",
  values: {
    textBox1: "Value1",
    dropdown1: "Value2",
    textBox2: "Value3"
  }
}
but the data is stored in the order given below:
{
  formName: "my Form",
  values: {
    dropdown1: "Value2",
    textBox1: "Value1",
    textBox2: "Value3"
  }
}
You might need to use an array for that.
Instead of inserting/updating the values as an Object, it would be better to store the keys and values in arrays, which preserve their order:
{
  formName: "my Form",
  keys: ["textBox1", "dropdown1", "textBox2"],
  values: ["value1", "value2", "value3"]
}
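A minimal sketch of what the corresponding Mongoose schema could look like (the keys/values field names come from the example above; everything else is an assumption):
// Arrays keep their insertion order, so the form field order is preserved.
const mySchema = new Schema({
  formName: String,
  keys: [String],    // e.g. ["textBox1", "dropdown1", "textBox2"]
  values: [String]   // e.g. ["Value1", "Value2", "Value3"]
})
An alternative is a single array of { key: String, value: String } subdocuments, which keeps each pair together and still preserves order.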
I have a document in Firebase Firestore that is something like the below. The main point here is that I have an array called items with objects inside it:
{
  name: 'Foo',
  items: [
    {
      name: 'Bar',
      meta: {
        image: 'xyz.png',
        description: 'hello world'
      }
    },
    {
      name: 'Rawr',
      meta: {
        image: 'abc.png',
        description: 'hello tom'
      }
    }
  ]
}
I am trying to update a field inside the items array, under the meta object; for example, changing items[0].meta.description from hello world to hello bar.
Initially I attempted to do this:
const key = `items.${this.state.index}.meta.description`;
const property = `hello bar`;

this.design.update({
  [key]: property
})
  .then(() => {
    console.log("done")
  })
  .catch(function(error) {
    message.error(error.message);
  });
This didn't appear to work, though: it removed everything in the item index I wanted to modify and just kept the description under the meta object.
I am now trying the following, which basically rewrites the whole meta object with the new data:
const key = `items.${this.state.index}.meta`;
const property = e.target.value;

let meta = this.state.meta;
meta[e.target.id] = property;

this.design.update({
  [key]: meta
})
  .then(() => {
    this.setState({
      [key]: meta
    })
  })
  .catch(function(error) {
    message.error(error.message);
  });
Unfortunately, though, this seems to turn my whole items array into an object that looks something like this:
{
  name: 'Foo',
  items: {
    0: {
      name: 'Bar',
      meta: {
        image: 'xyz.png',
        description: 'hello world'
      }
    },
    1: {
      name: 'Rawr',
      meta: {
        image: 'abc.png',
        description: 'hello tom'
      }
    }
  }
}
Any ideas how I can just update the content I want to?
Firestore doesn't have the ability to update an existing element in an indexed array. Your only array options for updates are described in the documentation - you can add a new element to the array ("arrayUnion") or remove an element ("arrayRemove").
As an alternative, you can read the entire array out of the document, make modifications to it in memory, then update the modified array field entirely.
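A minimal sketch of that read-modify-write approach, assuming the Firebase v8 (namespaced) Web SDK and that this.design is a DocumentReference, as in the question:
// Read the document, edit the array in memory, then write the array back.
this.design.get()
  .then((snapshot) => {
    const items = snapshot.data().items || [];
    // Hypothetical edit: update the description of the selected item.
    items[this.state.index].meta.description = 'hello bar';
    return this.design.update({ items });
  })
  .then(() => console.log('done'))
  .catch((error) => message.error(error.message));
If concurrent writers are a concern, the same read-modify-write can be wrapped in a transaction (db.runTransaction) so the update is atomic.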
Alternatively, you can make a separate collection for that particular array. In my case I originally had plain fields (no collections) for name, email, and pages, and I wanted to change the data of a specific page that was inside the pages array. For that, I made a separate pages collection, with an individual document per page holding its title, description, and content, each of which can then be updated directly; see the sketch below.
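A minimal sketch of that restructuring, again assuming the Firebase v8 Web SDK; the collection and field names (designs, pages, title, description, content) are illustrative:
// Each page lives in its own document under a subcollection,
// so a single page can be updated without touching the others.
const pageRef = db
  .collection('designs').doc(designId)
  .collection('pages').doc(pageId);

pageRef.update({ description: 'hello bar' })
  .then(() => console.log('page updated'))
  .catch((error) => message.error(error.message));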
This should print out everything in the collection, but it only prints out two elements.
Why won't it print out the entire list?
Is it some kind of race condition?
http://jsfiddle.net/Czenu/1/
class window.Restful
  constructor: ->
    _.each @collection, (action, kind) =>
      $('.actions').append "<div>#{action} #{kind}</div>"

class Material extends Restful
  namespace: 'admin/api'
  table_name: 'materials'

  constructor: (@$rootScope, @$http) ->
    super

  collection:
    get: 'downloaded'
    get: 'incomplete'
    get: 'submitted'
    get: 'marked'
    get: 'reviewed'
    get: 'corrected'
    get: 'completed'
    post: 'sort'
    post: 'sort_recieve'

new Material()
Your collection object consists of entries with just two distinct keys: "get" and "post". Since each key can only map to one value, the later entries overwrite the earlier ones and your object is effectively reduced to:
collection:
  get: 'completed'
  post: 'sort_recieve'
The solution is to use more meaningful objects, for instance an array of custom objects (created using shortcut functions with meaningful names, as in the example below).
class window.Restful
  constructor: ->
    _.each @collection, (obj) =>
      {action, kind} = obj
      $('.actions').append "<div>#{action} #{kind}</div>"

class Material extends Restful
  get = (action) -> {action, kind: 'get'}
  post = (action) -> {action, kind: 'post'}

  ...

  collection: [
    get 'downloaded'
    get 'incomplete'
    get 'submitted'
    get 'marked'
    get 'reviewed'
    get 'corrected'
    get 'completed'
    post 'sort'
    post 'sort_recieve'
  ]
The full result is shown at http://jsfiddle.net/Czenu/2/.
I'm importing from a CSV and getting data roughly in the format
{ 'Field1' : 3000, 'Field2' : 6000, 'RandomField' : 5000 }
The names of the fields are dynamic. (Well, they're dynamic in that there might be more than Field1 and Field2, but I know Field1 and Field2 are always going to be there.)
I'd like to be able to pass this dictionary into my class allMyFields so that I can access the above data as properties.
class allMyFields:
    # I think I need to include these to allow hinting in Komodo. I think.
    self.Field1 = None
    self.Field2 = None

    def __init__(self, dictionary):
        for k, v in dictionary.items():
            self.k = v
            # of course, this doesn't work. I've ended up doing this instead
            # self.data[k] = v
            # but it's not the way I want to access the data.
q = { 'Field1' : 3000, 'Field2' : 6000, 'RandomField' : 5000 }
instance = allMyFields(q)
# Ideally I could do this.
print q.Field1
Any suggestions? As far as why -- I'd like to be able to take advantage of code hinting, and importing the data into a dictionary called data as I've been doing doesn't afford me any of that.
(Since the variable names aren't resolved till runtime, I'm still going to have to throw a bone to Komodo - I think the self.Field1 = None should be enough.)
So - how do I do what I want? Or am I barking up a poorly designed, non-python tree?
You can use setattr (be careful though: not every string is a valid attribute name!):
>>> class AllMyFields:
...     def __init__(self, dictionary):
...         for k, v in dictionary.items():
...             setattr(self, k, v)
...
>>> o = AllMyFields({'a': 1, 'b': 2})
>>> o.a
1
Edit: let me explain the difference between the above code and SilentGhost's answer. The above code snippet creates a class of which instance attributes are based on a given dictionary. SilentGhost's code creates a class whose class attributes are based on a given dictionary.
Depending on your specific situation, either of these solutions may be more suitable. Do you plan to create one or more class instances? If the answer is one, you may as well skip object creation entirely and only construct the type (and thus go with SilentGhost's answer).
>>> q = { 'Field1' : 3000, 'Field2' : 6000, 'RandomField' : 5000 }
>>> q = type('allMyFields', (object,), q)
>>> q.Field1
3000
The docs for type explain well what's going on here (see its use as a constructor).
Edit: in case you need instance variables, the following also works:
>>> a = q() # first instance
>>> a.Field1
3000
>>> a.Field1 = 1
>>> a.Field1
1
>>> q().Field1 # second instance
3000
You can also use dict.update instead of manually looping over items (and if you're looping, iteritems is better).
class allMyFields(object):
    # note: you cannot (and don't have to) use self here
    Field1 = None
    Field2 = None

    def __init__(self, dictionary):
        self.__dict__.update(dictionary)
q = { 'Field1' : 3000, 'Field2' : 6000, 'RandomField' : 5000 }
instance = allMyFields(q)
print instance.Field1 # => 3000
print instance.Field2 # => 6000
print instance.RandomField # => 5000
You could make a subclass of dict which allows attribute lookup for keys:
class AttributeDict(dict):
    def __getattr__(self, name):
        return self[name]
q = AttributeDict({ 'Field1' : 3000, 'Field2' : 6000, 'RandomField' : 5000 })
print q.Field1
print q.Field2
print q.RandomField
If you try to look up an attribute that dict already has (say keys or get), you'll get that dict class attribute (a method). If the key you ask for doesn't exist on the dict class, then the __getattr__ method will get called and will do your key lookup.
Use setattr for the pretty way. The quick-n-dirty way is to update the instance internal dictionary:
>>> class A(object):
...     pass
...
>>> a = A()
>>> a.__dict__.update({"foo": 1, "bar": 2})
>>> a.foo
1
>>> a.bar
2
>>>
Using named tuples (Python 2.6):
>>> from collections import namedtuple
>>> the_dict = {'Field1': 3, 'Field2': 'b', 'foo': 4.9}
>>> fields = ' '.join(the_dict.keys())
>>> AllMyFields = namedtuple('AllMyFields', fields)
>>> instance = AllMyFields(**the_dict)
>>> print instance.Field1, instance.Field2, instance.foo
3 b 4.9
class SomeClass:
    def __init__(self,
                 property1,
                 property2):
        self.property1 = property1
        self.property2 = property2

property_dict = {'property1': 'value1',
                 'property2': 'value2'}

sc = SomeClass(**property_dict)
print(sc.__dict__)
Or you can try this
class AllMyFields:
    def __init__(self, field1, field2, random_field):
        self.field1 = field1
        self.field2 = field2
        self.random_field = random_field

    @classmethod
    def get_instance(cls, d: dict):
        return cls(**d)

a = AllMyFields.get_instance({'field1': 3000, 'field2': 6000, 'random_field': 5000})
print(a.field1)
An enhanced version of the dict subclass above: it works recursively, so nested dicts also support attribute access.
class AttributeDict(dict):
    """https://stackoverflow.com/a/1639632/6494418"""

    def __getattr__(self, name):
        return self[name] if not isinstance(self[name], dict) \
            else AttributeDict(self[name])

if __name__ == '__main__':
    d = {"hello": 1, "world": 2, "cat": {"dog": 5}}
    d = AttributeDict(d)

    print(d.cat)
    print(d.cat.dog)
    print(d.cat.items())

    """
    {'dog': 5}
    5
    dict_items([('dog', 5)])
    """
If you are open to adding a new library, pydantic is a very efficient solution. It uses Python type annotations to construct objects and validate types. Consider the following code:
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int

data = {"name": "ahmed", "age": 36}
p = Person(**data)
pydantic: https://pydantic-docs.helpmanual.io/
A simple solution is
field_dict = { 'Field1' : 3000, 'Field2' : 6000, 'RandomField' : 5000 }
# Using dataclasses
from dataclasses import make_dataclass
field_obj = make_dataclass("FieldData", list(field_dict.keys()))(*field_dict.values())
# Using attrs
from attrs import make_class
field_obj = make_class("FieldData", list(field_dict.keys()))(*field_dict.values())