How to extend Django's default User in MongoDB?

I'm using MongoDB as the database and trying to extend Django's built-in User model.
Here's the error I'm getting:
django.core.exceptions.ValidationError: ['Field "auth.User.id" of model container:"<class \'django.contrib.auth.models.User\'>" cannot be of type "<class \'django.db.models.fields.AutoField\'>"']
Here's my models.py:
from djongo import models
from django.contrib.auth.models import User

class Profile(models.Model):
    user = models.EmbeddedField(model_container=User)
    mobile = models.PositiveIntegerField()
    address = models.CharField(max_length=200)
    pincode = models.PositiveIntegerField()

Using EmbeddedField is not a good idea, because it duplicates user data in the database: you will have users in the Users collection, and the same data embedded in the documents of the Profile collection.
Just keep the user id in the model and query separately:
class Profile(models.Model):
    user_id = models.CharField(max_length=24)  # or models.TextField(); holds the ObjectId hex string
    mobile = models.PositiveIntegerField()
    address = models.CharField(max_length=200)
    pincode = models.PositiveIntegerField()
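With that layout, reading a profile plus its user is two small queries; a minimal sketch (the mobile number is a hypothetical lookup value):

from django.contrib.auth.models import User

profile = Profile.objects.get(mobile=9999999999)  # hypothetical lookup
user = User.objects.get(pk=profile.user_id)       # second, separate query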

It is simple, as explained in the documentation.
First, the model_container must be a djongo model; the User model here is a plain Django model, not a djongo one.
Second, make your model_container model abstract by declaring it in the Meta class, as shown below:
from djongo import models

class Blog(models.Model):
    name = models.CharField(max_length=100)

    class Meta:
        abstract = True

class Entry(models.Model):
    blog = models.EmbeddedField(
        model_container=Blog
    )
    headline = models.CharField(max_length=255)
Ref: https://www.djongomapper.com/get-started/#embeddedfield
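Applied to the question's Profile model, that advice might look like the sketch below; UserInfo is a hypothetical abstract djongo container carrying only the user fields you need, since Django's concrete User (with its AutoField id) cannot serve as a model_container:

from djongo import models

class UserInfo(models.Model):
    # hypothetical stand-in: duplicate only the User fields you need
    username = models.CharField(max_length=150)
    email = models.CharField(max_length=254)

    class Meta:
        abstract = True

class Profile(models.Model):
    user = models.EmbeddedField(model_container=UserInfo)
    mobile = models.PositiveIntegerField()
    address = models.CharField(max_length=200)
    pincode = models.PositiveIntegerField()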

Related

Django Rest Framework - Serializer error when trying to add model from json with 1 attribute as array

I'm trying to create an API with Django REST Framework to save some info about computers.
I have encountered a problem when the JSON has an attribute that is an array of IPv4 addresses.
I generated the following code:
Model
class Computer(models.Model):
    hostname = models.CharField(max_length=32)
    os_system = models.CharField(max_length=60)

class ComputerIPAddress(models.Model):
    computer = models.ForeignKey(Computer, on_delete=models.CASCADE)
    ip_address = models.GenericIPAddressField()
Serializer
class ComputerIPAddressSerializer(serializers.ModelSerializer):
    class Meta:
        model = ComputerIPAddress
        fields = '__all__'

class ComputerSerializer(serializers.ModelSerializer):
    ip_address = ComputerIPAddressSerializer(many=True)

    class Meta:
        model = Computer
        fields = '__all__'
Viewset
class ComputerViewSet(viewsets.ModelViewSet):
    queryset = Computer.objects.all()
    serializer_class = ComputerSerializer

class ComputerIPAddressViewSet(viewsets.ModelViewSet):
    queryset = ComputerIPAddress.objects.all()
    serializer_class = ComputerIPAddressSerializer
The idea is that the IP belongs to the computer (if the computer is deleted, I am not interested in keeping the IP) and a computer can have several IPs assigned to it.
The JSON that is sent is the following:
{"hostname": "PC-01", "os_system": "Windows 10", "ip_address": ["192.168.1.10", "192.168.2.10"]}
I would approach this problem by overriding the create method of the ComputerSerializer:
class ComputerSerializer(serializers.ModelSerializer):
    ...
    def create(self, validated_data):
        ip_address = validated_data.pop("ip_address", None)
        computer = Computer.objects.create(**validated_data)
        if ip_address:
            for ip in ip_address:
                computer.computeripaddress_set.create(ip_address=ip)
        return computer
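Note that with the nested ComputerIPAddressSerializer(many=True), each list item must be a dict, while the question's payload sends plain strings. A sketch of an alternative, assuming no nested dicts are needed: declare the field as a write-only ListField of IP addresses so the posted JSON validates as-is:

class ComputerSerializer(serializers.ModelSerializer):
    # write-only list of plain IP strings, matching the posted JSON
    ip_address = serializers.ListField(
        child=serializers.IPAddressField(), write_only=True
    )

    class Meta:
        model = Computer
        fields = '__all__'

    def create(self, validated_data):
        ip_addresses = validated_data.pop('ip_address', [])
        computer = Computer.objects.create(**validated_data)
        for ip in ip_addresses:
            computer.computeripaddress_set.create(ip_address=ip)
        return computer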

How can I listen for the creation of a specific model and create a new one (on a different table) based on this?

I have a User model with a referral_key attribute. I'd like to create a ReferralKeyRecord upon creation of a user. I've read tons of documentation and StackExchange to no avail.
This answer uses after_insert(), but I am not trying to alter or validate the class which is being inserted; I am trying to add a new object from a completely different model—and session.add() isn't supported.
This answer is closer to what I want, but the accepted answer (ultimately) uses after_flush(), which is far too general. I don't want to listen to events thrown whenever the DB is updated somehow. I want it to fire off when a specific model is created.
And something like the following...
@event.listens_for(User, 'after_flush')
def create_referral_record(mapper, connection, target):
    session.add(ReferralRecord(key=target.referral_key))
    session.commit()
... results in No such event 'after_flush' for target '<class 'models.User'>. I've looked through SQLAlchemy's documentation (the core events and the ORM events) and see no events that indicate a specific model has been created. The closest thing in Mapper events is the after_insert method, and the closest thing in Session events is the after_flush() method. I imagine this is a pretty common thing to need to do, and would thus be surprised if there wasn't an easy event to listen to. I assume it'd be something like:
@event.listens_for(User, 'on_creation')
def create_referral_record(session, instance):
    record = ReferralRecord(key=instance.referral_key)
    session.add(record)
    session.commit()
Does anyone know better than I?
Or why not create the Referral inside the User constructor?
from sqlalchemy.orm import Session, relationship
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, ForeignKey, create_engine

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'

    def __init__(self):
        self.referral = Referral()

    id = Column(Integer(), primary_key=True)
    referral = relationship('Referral', uselist=False)

class Referral(Base):
    __tablename__ = 'referral'

    id = Column(Integer(), primary_key=True)
    user_id = Column(Integer(), ForeignKey('user.id'), nullable=False)

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)

session = Session(bind=engine)
session.add(User())
session.commit()

print(session.query(User).all())
print(session.query(Referral).all())
You can use the after_flush session event, and inside the event handler you can access the session's new objects (using session.new).
Example:
from sqlalchemy.orm import Session
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, ForeignKey, create_engine, event

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'

    id = Column(Integer(), primary_key=True)

class Referral(Base):
    __tablename__ = 'referral'

    id = Column(Integer(), primary_key=True)
    user_id = Column(Integer(), ForeignKey('user.id'), nullable=False)

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)
session = Session(bind=engine)

@event.listens_for(session, 'after_flush')
def session_after_flush(session, flush_context):
    for obj in session.new:
        if isinstance(obj, User):
            session.add(Referral(user_id=obj.id))

session.add(User())
session.commit()

print(session.query(User).all())
print(session.query(Referral).all())
Running this outputs:
[<__main__.User object at 0x00000203ABDF5400>]
[<__main__.Referral object at 0x00000203ABDF5710>]
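For completeness, the Mapper-level after_insert event mentioned in the question can also do this without touching the Session: SQLAlchemy's documentation has mapper-event listeners emit SQL through the Core connection they receive. A minimal sketch against the models above:

# assumes User, Referral and `event` from the example above
@event.listens_for(User, 'after_insert')
def create_referral_record(mapper, connection, target):
    # inside mapper events, use the Core connection rather than session.add()
    connection.execute(
        Referral.__table__.insert().values(user_id=target.id)
    )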

Graphene-Python documentation on clients view

The description property on GraphQL schema elements can be viewed by the client. For example, GraphiQL shows the description value for a field in the type-ahead dropdown that lists the fields available inside a selection set. The same description appears in the documentation section. Can this kind of metadata documentation be added through graphene-gae? My setup:
models.py:
class Article(ndb.Model):
    headline = ndb.StringProperty()
    author_key = ndb.KeyProperty(kind='Author')
    created_at = ndb.DateTimeProperty(auto_now_add=True)
Schema.py:
import graphene
from graphene_gae import NdbObjectType

class ArticleType(NdbObjectType):
    class Meta:
        model = Article

class Query(graphene.ObjectType):
    articles = graphene.List(ArticleType)

    @graphene.resolve_only_args
    def resolve_articles(self):
        return Article.query()

schema = graphene.Schema(query=Query)
I can add descriptions like this:
headline = ndb.StringProperty(description='Add description here!')
Super Easy!
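Descriptions for fields declared directly in graphene (rather than derived from ndb properties) can likewise be passed with the description keyword argument; a minimal sketch (untested with graphene-gae specifically):

class Query(graphene.ObjectType):
    # description shows up in GraphiQL's type-ahead and docs panel
    articles = graphene.List(ArticleType, description='All articles, newest first.')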

Using restframework-mongoengine, how to create queries?

models.py
from mongoengine import Document, fields

class Tool(Document):
    Fruit = fields.StringField(required=True)
    District = fields.StringField(required=True)
    Area = fields.StringField(required=True)
    Farmer = fields.StringField(required=True)
Serializers.py file
from rest_framework import serializers
from rest_framework_mongoengine.serializers import DocumentSerializer
from models import Tool

class ToolSerializer(DocumentSerializer):
    class Meta:
        model = Tool
views.py file
from django.template.response import TemplateResponse
from rest_framework_mongoengine.viewsets import ModelViewSet as MongoModelViewSet
from app.serializers import *

def index_view(request):
    context = {}
    return TemplateResponse(request, 'index.html', context)

class ToolViewSet(MongoModelViewSet):
    lookup_field = 'Fruit'
    serializer_class = ToolSerializer

    def get_queryset(self):
        return Tool.objects.all()
So, I want to create queries like http://127.0.0.1:8000/api/tool/?Fruit=Banana giving me all the data for the fruit Banana only, and http://127.0.0.1:8000/api/tool/?District=Pune giving me the data for the Pune district only.
Unfortunately, I haven't tried this solution myself yet, but AFAIK, in pure DRF with an SQL database you'd use the Django-Filters package for this.
There's an analogue of it for DRF-ME, called drf-mongo-filters, written by Maxim Vasiliev, co-author of DRF-ME. It contains a decent set of tests you could use for inspiration.
Basically, you say something like:
from unittest import mock

from rest_framework.test import APIRequestFactory
from rest_framework.generics import ListAPIView
from mongoengine import Document, fields
from drf_mongo_filters.filtersets import filters, Filterset, ModelFilterset
from drf_mongo_filters.backend import MongoFilterBackend

class TestFilter(Filterset):
    foo = filters.CharFilter()

class TestView(ListAPIView):
    filter_backends = (MongoFilterBackend,)
    filter_class = TestFilter
    serializer_class = mock.Mock()
    queryset = mock.Mock()

TestView.as_view()(APIRequestFactory().get("/?foo=Foo"))
TestView.queryset.filter.assert_called_once_with(foo="Foo")
I haven't tried doing the same with ViewSets, but as they inherit from GenericView, I guess they should respect the filter_class and filter_backends parameters, too.
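Translated to the Tool model from the question, an untested sketch following the same pattern (assuming CharFilter matches on equality) might look like:

from rest_framework_mongoengine.viewsets import ModelViewSet as MongoModelViewSet
from drf_mongo_filters.filtersets import filters, Filterset
from drf_mongo_filters.backend import MongoFilterBackend

class ToolFilter(Filterset):
    # filter names match the query parameters: ?Fruit=Banana, ?District=Pune
    Fruit = filters.CharFilter()
    District = filters.CharFilter()

class ToolViewSet(MongoModelViewSet):
    lookup_field = 'Fruit'
    filter_backends = (MongoFilterBackend,)
    filter_class = ToolFilter
    serializer_class = ToolSerializer

    def get_queryset(self):
        return Tool.objects.all()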

EmbeddedDocumentSerializer runs query for every ReferenceField

I have the following models and serializers; the goal is for only one query to run when the serializer runs.
Models:
class Assignee(EmbeddedDocument):
    id = ObjectIdField(primary_key=True)
    assignee_email = EmailField(required=True)
    assignee_first_name = StringField(required=True)
    assignee_last_name = StringField()
    assignee_time = DateTimeField(required=True, default=datetime.datetime.utcnow)
    user = ReferenceField('MongoUser', required=True)
    user_id = ObjectIdField(required=True)

class MongoUser(Document):
    email = EmailField(required=True, unique=True)
    password = StringField(required=True)
    first_name = StringField(required=True)
    last_name = StringField()
    assignees = EmbeddedDocumentListField(Assignee)
Serializers:
class AssigneeSerializer(EmbeddedDocumentSerializer):
    class Meta:
        model = Assignee
        fields = ('assignee_first_name', 'assignee_last_name', 'user')
        depth = 0

class MongoUserSerializer(DocumentSerializer):
    assignees = AssigneeSerializer(many=True)

    class Meta:
        model = MongoUser
        fields = ('id', 'email', 'first_name', 'last_name', 'assignees')
        depth = 2
When checking the Mongo profiler, I see 2 queries for the MongoUser document. If I remove the assignees field from the MongoUserSerializer, there is only one query.
As a workaround I've tried to use the user_id field to store only the ObjectId, and changed AssigneeSerializer to:
class AssigneeSerializer(EmbeddedDocumentSerializer):
    class Meta:
        model = Assignee
        fields = ('assignee_first_name', 'assignee_last_name', 'user_id')
        depth = 0
But again there are 2 queries. I think EmbeddedDocumentSerializer fetches all the fields, issuing a query for each ReferenceField, and the
fields = ('assignee_first_name', 'assignee_last_name', 'user_id')
restriction is applied only after those queries are made.
How can I use ReferenceField without running a separate query for each reference when serializing?
I ended up with a workaround: not using ReferenceField, but an ObjectIdField instead:
# user = ReferenceField("MongoUser", required=True)  # removed now
user = ObjectIdField(required=True)
And changed the value assignment accordingly:
- if assignee.user == MongoUser:
+ if assignee.user == MongoUser.id:
It is not the best way, since we lose the ReferenceField functionality, but it is better than making 30 queries in the serializer.
It's a very interesting question, and I think it is related to Mongoengine's dereference policy: https://github.com/MongoEngine/mongoengine/blob/master/mongoengine/dereference.py.
Namely, your mongoengine Documents have a method MongoUser.objects.select_related() with a max_depth argument, which should be large enough for Mongoengine to traverse 3 levels of depth (MongoUser->assignees->Assignee->user) and cache all the related MongoUser objects for the current MongoUser instance. Probably we should call this method somewhere in DRF-Mongoengine's DocumentSerializers to prefetch the relations, but currently we don't.
See this post about classical DRF + Django ORM, which explains how to fight the N+1 requests problem by prefetching in classical DRF. Basically, you need to override the get_queryset() method of your ModelViewSet to use the select_related() method:
from rest_framework_mongoengine.viewsets import ModelViewSet

class MongoUserViewSet(ModelViewSet):
    def get_queryset(self):
        queryset = MongoUser.objects.all()
        # Set up eager loading to avoid N+1 selects
        queryset = queryset.select_related(max_depth=3)
        return queryset
Unfortunately, I don't think the current implementation of ReferenceField in DRF-Mongoengine is smart enough to handle these querysets appropriately. Maybe ComboReferenceField will work?
Still, I've never used this feature myself and didn't have enough time to play with these settings, so I'd be grateful if you shared your findings.