Create data for testing MongoDB and PostgreSQL [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Want to improve this question? Update the question so it focuses on one problem only by editing this post.
Closed 7 years ago.
I need to test the performance of MongoDB and PostgreSQL with a large amount of data, over 5 GB, for a college assignment.
How can I create the data for both databases?
Thanks
EDIT:
I found the webpage http://www.generatedata.com, where you can download a script to generate the data.

First, take a look at How to Generate Test Data on MongoDB. For MongoDB, mongoperf is a tool to measure the database's disk performance. Also, you can see MongoDB Benchmarks. For PostgreSQL you can use pgbench.

Related

NoSQL database (MongoDB) [closed]

Closed 7 years ago.
Hi, I have the following problem: I need a populated NoSQL database. I've searched but still can't find any JSON data to import. I would be very grateful if anyone could provide me with a sample database to test with MongoDB.
MongoDB provides a sample dataset, which can be found here. In addition, there is documentation on generating your own test data.
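If the sample dataset does not fit your needs, generating your own documents is straightforward. The sketch below (field names are invented for illustration) writes newline-delimited JSON, the format mongoimport reads by default:

```python
import json
import random

def make_doc(rng, i):
    # One synthetic document; the nested sub-document and the array field
    # exercise MongoDB's document model a little.
    return {
        "user_id": i,
        "score": rng.randint(0, 100),
        "tags": rng.sample(["red", "green", "blue", "gold"], k=2),
        "address": {
            "city": rng.choice(["Oslo", "Lima", "Pune"]),
            "zip": rng.randint(10000, 99999),
        },
    }

def write_ndjson(path, n_docs, seed=7):
    # One JSON object per line, which can then be loaded with e.g.:
    #   mongoimport --db test --collection users --file users.json
    rng = random.Random(seed)
    with open(path, "w") as f:
        for i in range(n_docs):
            f.write(json.dumps(make_doc(rng, i)) + "\n")

if __name__ == "__main__":
    write_ndjson("users.json", 1000)
```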

Scaling a database in the cloud and on local servers [closed]

Closed 8 years ago.
I am considering using MongoDB (it could be PostgreSQL or any other) as a data warehouse. My concern is that twenty or more users could be running queries at a time, and this could have serious implications for performance.
My question is: what is the best approach to handle this in a cloud-based and a non-cloud-based environment? Do cloud-based databases handle this automatically? If so, would the data be consistent across all instances after a refresh of the data? In a non-cloud-based environment, would the best approach be to load-balance all instances? Again, how would you ensure data integrity across all instances?
Thanks in advance
I think auto sharding is what I am looking for
http://docs.mongodb.org/v2.6/MongoDB-sharding-guide.pdf

What are NoSQL databases and how are they different from relational databases [closed]

Closed 9 years ago.
What are NoSQL databases?
NoSQL is a buzzword nowadays. I searched and found there are lots of NoSQL databases, like MongoDB, Hadoop, Cassandra...
But I still don't understand how they are different from relational databases.
NoSQL databases are key-value stores that provide features such as replication, fault tolerance, and eventual consistency. A relational database is transactionally consistent, with primary and secondary indexes and well-designed cost/rule-based optimizers.
However, NoSQL scales horizontally, while it is hard to scale a relational database.
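A small illustration of the contrast, using Python's built-in sqlite3 as the relational side and a plain JSON document store as the key-value side (the schema and keys here are invented for the example):

```python
import json
import sqlite3

# Relational side: a fixed schema, declared up front and enforced by the
# database; every row has the same columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)")
conn.execute("INSERT INTO users VALUES (1, 'ada', 36)")

# Key-value / document side: the "schema" lives in the application; each
# value is an arbitrary JSON document, and two documents need not share
# the same fields.
store = {}
store["user:1"] = json.dumps({"name": "ada", "age": 36, "tags": ["math"]})
store["user:2"] = json.dumps({"name": "alan"})  # no age, no tags -- still valid

row = conn.execute("SELECT name, age FROM users WHERE id = 1").fetchone()
doc = json.loads(store["user:1"])
```

The relational row is retrieved by a declarative query the optimizer plans; the document is fetched directly by its key, which is what makes key-value stores easy to partition across machines.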

Mongo: multiple databases vs one big collection [closed]

Closed 9 years ago.
Is it better to create one Mongo database per user, each with thousands of documents in one collection, or one database with millions of documents in one collection?
I do not want to share data between users. It will be completely separate.
The best structure for storing the information depends on the queries you want to perform.
For example, if you want to execute queries across multiple users, you should consider using a single collection.
MongoDB also offers an efficient system for distributing the workload across multiple machines in a completely transparent way; see here.
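One common way to keep users separate inside a single collection is to tag every document with its owner and filter on that field. Here is a minimal stand-in using plain Python data; the `user_id` field name and the mongo-shell commands in the comments are illustrative, not from the question:

```python
# Each document carries its owner's id; in MongoDB you would index this
# field so per-user queries stay fast:
#   db.docs.createIndex({"user_id": 1})
docs = [
    {"user_id": 1, "text": "note a"},
    {"user_id": 1, "text": "note b"},
    {"user_id": 2, "text": "note c"},
]

def find_for_user(collection, user_id):
    # Roughly equivalent to db.docs.find({"user_id": user_id}) in the shell:
    # only the requesting user's documents are ever returned.
    return [d for d in collection if d["user_id"] == user_id]
```

As long as every query includes the owner filter, users never see each other's data, and one collection avoids the per-database overhead of thousands of tiny databases.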

Can pgadmin manage large objects? [closed]

Closed 8 years ago.
Is pgadmin aware of PostgreSQL large objects?
pgAdmin does not appear to have any special large-object support. You can, however, use the standard SQL functions, available from any client interface, to manage them. See http://dave.webdev.pgadmin.org/docs/1.4/pg/largeobjects.html for a reference on these functions.
This leads to a number of important limitations. You would get an escaped string when you pull a large object from the database, so you cannot, say, pull a file and display it as an image or the like.