Is there any way to create models from a MongoDB database, similar to how LoopBack's model discovery reverse-engineers database schemas into model definitions?
Because the MongoDB data model is flexible, it is not possible, or at least not practical, to generate a data model from it. Let me explain it this way: in SQL we create tables and columns, manually or programmatically, and they do not change over time. In MongoDB, on the other hand, you do not need to create tables and fields; when you invoke an insert on MongoDB, the data is stored in a document structure, which can change over time depending on the data submitted in the next insertion.
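As a minimal shell sketch (the collection and field names here are made up for illustration), two inserts with completely different shapes are both accepted into the same collection, so there is no single schema to discover:
db.users.insert({ name: "Alice", age: 30 });
db.users.insert({ name: "Bob", emails: ["bob@example.com"], address: { city: "Boston" } });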
We have a PostgreSQL database with candidate-, job-, and campaign-related tables, plus mappings of candidate-->job (job_candidate_mapping, let's call it the jcm table) and candidate-->campaign (campaign_candidate_mapping, let's call it the ccm table).
We also have candidate-related tables such as candidate_education_details, candidate_company_details, etc.
We want to send the candidate-job-campaign data to Elasticsearch as one document.
What is the best way to send a candidate's related data from multiple tables to Elasticsearch?
We are planning to create a table that holds, in a single row, all the denormalised candidate data we need during search.
Every time we update any candidate-related data in the above tables, we need to update it in Elasticsearch as well.
So now we have to maintain this denormalised table and write extra code to keep it up to date. Is this the right approach?
What is the standard way to keep a search engine updated, and how do big companies do this?
Please help, any suggestions would be appreciated.
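For illustration only (the field names below are placeholders, not our actual schema), the single denormalised candidate document described above might look something like this when indexed into Elasticsearch:
{
  "candidate_id": 101,
  "name": "Jane Doe",
  "education_details": [ { "degree": "B.Sc.", "institute": "Example University" } ],
  "company_details": [ { "company": "Acme Corp", "role": "Developer" } ],
  "jobs": [ { "job_id": 7, "jcm_status": "applied" } ],
  "campaigns": [ { "campaign_id": 3, "ccm_status": "contacted" } ]
}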
Does MongoDB have an analogue of MySQL's SHOW CREATE TABLE, which shows the create query for a collection?
Or can I create another collection like an existing one, with all of its settings?
There is no analogue of SHOW CREATE TABLE.
But you may find some useful functions here: https://docs.mongodb.com/manual/reference/command/nav-administration/
For example, you can retrieve information about the indexes with the getIndexes() function.
You can create the indexes via the createIndexes() function.
Example:
var keys = db.existing_collection.getIndexes()   // returns full index documents
             .filter(i => i.name !== "_id_")     // skip the default _id index
             .map(i => i.key);                   // createIndexes() expects key patterns only
db.new_collection.createIndexes(keys);
Use MongoDB Compass: visualize, understand, and work with your data through an intuitive GUI.
https://www.mongodb.com/products/compass
There is no good answer to this question because the schema involved when dealing with schema-less databases like MongoDB is dictated by the application, not the database.
The database will store whatever it is given, because nothing enforces a consistent document structure within a given collection, even if all access to the database is controlled through some kind of wrapper. In short, the only place you should look for the schema is your model classes.
I want to clone an existing collection, including data and indexes, to a new collection with another name within the same database, using the MongoDB JSON interface (not the command-line interface).
I've tried:
cloneCollection - didn't work; it is for cloning across databases.
aggregate with an $out operator - that copies just the data but not the indexes.
The aggregate command I've tried:
{"aggregate":"orig_coll", "pipeline":[{"$out":"orig_clone"}]}
There is no way to do this in one JSON query.
So, there are two solutions here:
Using mongodump/mongorestore as proposed in What's the fastest way to copy a collection within the same database?
Using two queries: one to create the destination collection with the indexes, plus the aggregate query that you already have (see the sketch below). I understand that it's not a perfect solution, since you need to keep the index-creation query for the destination collection in sync with the indexes on the source collection, but there is no other way to do this.
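A rough sketch of the second approach, in the same JSON command format as your aggregate example (the index key and name are placeholders for whatever indexes your source collection actually has):
{"createIndexes": "orig_clone", "indexes": [{"key": {"someField": 1}, "name": "someField_1"}]}
{"aggregate": "orig_coll", "pipeline": [{"$out": "orig_clone"}]}
As far as I know, $out keeps any indexes that already exist on the output collection, so creating the indexes first and then running the aggregation leaves the clone with both the data and the indexes.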
What you need to understand is that the JSON interface, as you call it, is not a database command interface but a database query language: you pass queries to it, not commands. In fact, it is not really an interface, just a query DSL. The interface is the mongo shell, any of the mongo drivers (Java, Perl, ...), or any of the mongo admin tools.
I can't find any information about connecting to MongoDB in the SQLAlchemy documentation or via a Google search.
Is it possible to use MongoDB with SQLAlchemy? Thanks.
As per the SQLAlchemy description, you cannot use it:
SQLAlchemy considers the database to be a relational algebra engine,
not just a collection of tables. Rows can be selected from not only
tables but also joins and other select statements; any of these units
can be composed into a larger structure. SQLAlchemy's expression
language builds on this concept from its core.
SQLAlchemy is most famous for its object-relational mapper (ORM), an
optional component that provides the data mapper pattern, where
classes can be mapped to the database in open ended, multiple ways -
allowing the object model and database schema to develop in a cleanly
decoupled way from the beginning.
The main goal of SQLAlchemy is to change the way you think about
databases and SQL!
You may use MongoAlchemy instead.
I have a StrongLoop application using MongoDB. I have a model with several sub-models, using relations. When I try to insert an object into the database, only the top-level data is added; all the related data is ignored. How do I get an entire object, with its sub-objects, into the database?
See the embedded models documentation: http://docs.strongloop.com/display/public/LB/Embedded+models+and+relations
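For example (the model and property names below are illustrative, adapted from those docs rather than from your app), an embedded relation is declared in the model definition JSON like this; with embedsOne/embedsMany the sub-objects are stored inside the parent document instead of being dropped:
{
  "name": "Customer",
  "relations": {
    "address": {
      "type": "embedsOne",
      "model": "Address",
      "property": "billingAddress"
    },
    "emails": {
      "type": "embedsMany",
      "model": "EmailAddress",
      "property": "emailList"
    }
  }
}
The relation then exposes accessor methods on the parent instance, e.g. customer.address.create(...) and customer.emailList.create(...), for writing the embedded data.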