Export MongoDB Charts - mongodb

I successfully installed mongodb-charts and was able to create a dashboard as well.
Now I want to save/export this dashboard to JSON (or any other format). Is there a feature to save/export and load/import MongoDB Charts dashboards? This would be useful if I want the same dashboard on some other server.
Also, there was no tag for mongodb-charts, so could anyone with the tag-creation privilege please create it?

MongoDB Charts is currently in beta.
It is designed for MongoDB Atlas, and according to the official page "Share dashboards and collaborate" you can share dashboards and charts by adding new users and granting them permissions via the dashboard's "Access" button.
In the Access view you can make your dashboard fully public by selecting the "Everyone" option and choosing permission rights, or share it with specific users only.
As a hack, if you want to convert your dashboard into JSON format and transfer it from one MongoDB Charts instance to another, you can try running mongodump against the "metadata" database in the MongoDB instance connected to MongoDB Charts.
It has 4 collections:
dashboards
datasources
items
users
However, all relationships are made through GUIDs, so without manual editing you can easily corrupt data during mongorestore.
UPDATE:
The following bash script shows how to export a dashboard and a chart for migration to a different MongoDB Charts instance:
# Your Dashboard and Chart names you want to export
dashboard_name="My Dashboard"
chart_name="My Chart"
# Export the dashboards collection data to the /tmp folder
mongodump --db metadata --collection dashboards --query "{'title': '$dashboard_name'}" --out "/tmp/"
dashboard_id=$(mongo --quiet --eval "db.getSiblingDB('metadata').dashboards.findOne({'title': '$dashboard_name'}, {'_id': 1})['_id']")
# Export the items collection data to the /tmp folder
mongodump --db metadata --collection items --query "{\$and: [{'title': '$chart_name'}, {'dashboardId': '$dashboard_id'}]}" --out "/tmp/"
# After the data above is restored to a different MongoDB Charts instance,
# you need to modify the following fields in the imported documents
# to match the datasource of the new MongoDB Charts instance.
# For the dashboards collection, modify the GUIDs of the following fields according to the new datasource:
mongo --quiet --eval "db.getSiblingDB('metadata').dashboards.findOne({'title': '$dashboard_name'}, {'owners': 1, 'tenantId': 1, 'itemSummary.dataSourceId': 1})"
# For the items collection, modify the GUIDs of the following fields according to the new datasource:
mongo --quiet --eval "db.getSiblingDB('metadata').items.findOne({\$and: [{'title': '$chart_name'}, {'dashboardId': '$dashboard_id'}]}, {'dataSourceId': 1, 'tenantId': 1})"
Remember, this approach is not official and it is possible to corrupt your data.
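On the target side, a minimal restore sketch could look like the following; the paths match the /tmp output of the dump above, while the target host, port and credentials are omitted and would need to be added for your setup:
# Restore the exported dashboard and chart into the metadata database
# of the target MongoDB Charts instance (add --host/--port/-u/-p as needed)
mongorestore --db metadata --collection dashboards /tmp/metadata/dashboards.bson
mongorestore --db metadata --collection items /tmp/metadata/items.bson
# Then fix up the owners, tenantId and dataSourceId GUIDs as described above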

You could use Charts for Trello, which works in a similar way to MongoDB Charts. It allows you to connect to your MongoDB database or to other systems, build your charts, and export them as JSON, CSV...

Related

MongoDB - backing up and restoring users and roles

What are the best practices for syncing users and roles between Mongo instances?
On the same Windows machine, I am trying to copy MongoDB users and roles in the admin database from one Mongo instance to another. Authentication is 'on' for each instance. No combination of mongodump/mongorestore or mongoexport/mongoimport I have tried works. With mongodump/mongorestore, the restore step displays:
assuming users in the dump directory are from <= 2.4 (auth version 1)
Failed: the users and roles collections in the dump have an incompatible auth version with target server: cannot restore users of auth version 1 to a server of auth version 5
I found no command-line option to tell it not to do this silly thing. I have only Mongo version 4 installed.
You would think --dumpDbUsersAndRoles and --restoreDbUsersAndRoles would be symmetrical, but they are not.
I was able to run this:
mongoexport -p 27017 -u admin --password please -d admin --collection system.roles --out myRoles.json
However, when trying mongoimport
mongoimport -p 26017 -u admin --password please -d admin --collection "system.roles" --file myRoles.json
the output displays
error validating settings: invalid collection name: collection name 'system.roles' is not allowed to begin with 'system.'
Primer
Users are attached to databases. Ideally, you store your database-specific users in the respective database. All “global” users should go into admin. The good part: replica sets take care of syncing those users to each member of the replica set.
Solution
That being said, the way to deal with this is fairly obvious.
For a worst-case scenario, it is much easier to have a .js file ready which simply recreates the 3-4 global roles instead of fiddling with the system.* collections in the admin database. This has the advantage that you can also automate other setup steps, like sharding configuration, if TSHTF and you need to rebuild your cluster from scratch.
// in myjsfile.js: switch to the admin database (the interactive `use` helper does not work in script files)
db = db.getSiblingDB("admin");
db.createRole({...})
db.createRole({...})
// do other stuff, like sharding setup
Run it against the primary of your replica set or a mongos instance (if you have a sharded cluster) using
mongo daHost:27017/admin myjsfile.js
after you set up your machines but before you enable authentication.
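As a rough sketch of such a script run, the snippet below creates one role and one user; the names appReadWrite, appUser and the database myapp are placeholders made up for illustration, not values from the original answer:
# Run against the primary (or a mongos) before enabling authentication;
# daHost and every role/user/db name below are placeholders
mongo daHost:27017/admin <<'EOF'
db.createRole({
  role: "appReadWrite",
  privileges: [
    { resource: { db: "myapp", collection: "" }, actions: ["find", "insert", "update", "remove"] }
  ],
  roles: []
});
db.createUser({ user: "appUser", pwd: "changeMe", roles: [{ role: "appReadWrite", db: "admin" }] });
// other setup could go here, e.g. sharding configuration
EOF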
Another option would be to use Ansible for user creation.
As for dump and restore, you might want to leave out the collection name.

MongoDB migration

Hello, I have an Ubuntu 14.04 server that is running MongoDB 2.4.14. I need to move the Mongo instance to a new server. I have installed Mongo 3.4.2 on the new server and need to move the databases over. I am pretty new to Mongo. I have 2 databases that are pretty big, but when I do a mongodump the output is nowhere near the size of the databases that Mongo reports. I also cannot figure out how to get mongoexport to work. What would be the best way to move those databases? If possible, can we just export the data from Mongo and then import it?
You'll need to give more information on your issue with mongodump and what mongodump parameters you were using.
Since you are doing a migration, you'll want to use mongodump and not mongoexport. mongoexport only outputs a JSON/CSV representation of a collection. Because of this, mongoexport cannot retain certain datatypes that exist in BSON, and thus MongoDB does not recommend using mongoexport for full backups; this consideration is listed in the MongoDB documentation.
mongodump will accurately create a backup of your database/collection, which mongorestore can then restore to your new server.
If you haven't already, check out Back Up and Restore with MongoDB Tools
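As a rough illustration of that workflow (hostnames, port and paths below are placeholders, and credentials are omitted), a dump-and-restore migration could look like this:
# On the old 2.4 server: dump all databases to a local directory
mongodump --host oldhost --port 27017 --out /backup/dump
# Copy /backup/dump to the new server (e.g. with scp), then on the new 3.4 server:
mongorestore --host newhost --port 27017 /backup/dump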

Export a number of documents from MongoDB

I use MongoChef as a UI client for my Mongo database. I have a collection which consists of 12,000 records, and I want to export them using MongoChef.
I have tried the built-in export option, which works fine up to about 3,000 documents, but as the number of records increases the system hangs.
Can you please let me know the best way to export all the documents using MongoChef?
Thanks.
Finally I came to the conclusion that using the terminal is the best (most efficient) way.
I read about primary and secondary databases and executed the following command:
mongoexport --username user --password pass --host host --db database --collection coll --out file_name.json

IBM WCS and DB2: want to export all catentries data from one DB and import it into another DB

Basically I have two environments, Production and QA. On the QA DB the data is not the same as on Production, so my QA team is unable to test properly. I therefore want to import all catentries/product-related data from Production into the QA DB. I searched a lot but found no solution for this.
Maybe I need to find all product-related tables, export them one by one, and then import them into the QA DB, but I am not sure.
Can anyone please guide me on this? How can I do this activity following best practices?
I am using DB2.
The WebSphere Commerce data model is documented, which will help you identify all related tables. You can then use the DB2 utility db2move to export (and later load) those tables in one shot. For example,
db2move yourdb export -sn yourschema -tn catentry,catentrel,catentdesc,catentattr
Be sure to list all tables you need, separated by commas with no spaces. You can specify patterns to match table names:
db2move yourdb export -sn yourschema -tn "catent*,listprice"
db2move will create a file db2move.lst that lists all extracted tables, so you can load all the data by running the following from the same directory:
db2move yourQAdb load -lo replace

How to copy some of the data from one MongoDB to another

I have an existing MongoDB dump and I would like to cherry pick some of the data to a clean DB.
Is dumping a single collection and restoring it (mongodump & mongorestore) the way to do this?
You can do this by using the --filter '<JSON>' option on mongorestore.
It works like the first argument of db.collection.find().
If you just want to filter by collection, use --collection <collection>.
See the mongorestore documentation for more info.
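For example, assuming the existing dump sits in ./dump/sourceDB, the target database is called cleanDB and the collection is orders (all placeholder names), and noting that --filter is only available in older mongorestore releases:
# Restore a single collection from the existing dump into the clean database
mongorestore --db cleanDB --collection orders dump/sourceDB/orders.bson
# On an older mongorestore that still supports --filter, cherry-pick documents too
mongorestore --db cleanDB --collection orders --filter '{ "status": "active" }' dump/sourceDB/orders.bson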