Bash script which uploads (small) images as documents into MongoDB?

I have a directory with loads of small images (say, PNG icons) which I want to store in my MongoDB without using GridFS, ideally using a bash script and the mongo shell alone. From what I read, that seems technically feasible, but I have not yet managed to get it working. The official documentation touches on this, but the recipe seems incomplete.
According to Christoph Menge, GridFS is indeed not strictly necessary for small files such as icons.
Assume I have an image file called foo.png which I would like to persist into the database icons. The GridFS solution would be this:
mongofiles -u "uploader" -p "notsogreat" --authenticationDatabase "admin" -d icons put foo.png
But how do I do that without GridFS within bash?
base64 foo.png | mongo -u "uploader" -p "notsogreat" --authenticationDatabase "admin" [SOME_MAGIC]
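For reference, here is a minimal sketch of what the missing piece could look like, assuming the legacy mongo shell; the collection name pics is an assumption, and base64 -w0 is GNU coreutils (on macOS, use base64 -i foo.png instead):
#!/usr/bin/env bash
# Hypothetical sketch: store foo.png as a BinData field in the "icons" DB.
# The collection name "pics" is made up for illustration.
B64=$(base64 -w0 foo.png)   # -w0 disables line wrapping (GNU coreutils)
mongo -u "uploader" -p "notsogreat" --authenticationDatabase "admin" icons \
  --eval "db.pics.insertOne({ name: 'foo.png', data: BinData(0, '$B64') })"
Note that --eval passes the document on the command line, so this only works for images small enough to fit within the shell's argument length limit.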
StackOverflow pages I consulted (in no particular order):
Store images in a MongoDB database
Store images in MongoDB
How can I add images in mongoDB?
Read image file into a MongoDB document's binary field via mongo shell script
How do I insert a binary file into mongodb using javascript?
Redirecting the output to a text file in MongoDB
How to add binary data with objectId to mongoDB?
Read a file from a mongo shell
Further reference(s):
https://itknowledgeexchange.techtarget.com/itanswers/how-to-store-images-in-a-mongodb-database/

Related

How to export the database of the mongodb with gzip file extension

I'm using MongoDB to save the data for my application, and I want to back up that database as a gzip file. I searched for it and found questions posted by other users:
link https://stackoverflow.com/questions/24439068/tar-gzip-mongo-dump-like-mysql
link https://stackoverflow.com/questions/52540104/mongodump-failed-bad-option-can-only-dump-a-single-collection-to-stdout
I used these commands, but they did not give me the expected output. I want a command that will create a gzip-compressed file of my database, so that by extracting it I can restore that database folder into MongoDB.
Currently I'm using the command below:
mongodump --db Database --gzip --archive=pathDatabase.gz
which creates a .gz compressed file, but when I extract it, it shows me nothing.
Can you please give me a command to use? Any suggestions will be appreciated.
When you use mongodump --db Database --gzip --archive=pathDatabase.gz, you create one archive file (it does not create a folder) for the specified DB and compress it with gzip. The resulting file will be pathDatabase.gz in your current directory.
To restore from such a file, you'd do this:
mongorestore --gzip --archive=pathDatabase.gz
This will restore the db "Database" with all its collections.
You can check out these MongoDB documentation pages for more info:
Dump: https://docs.mongodb.com/manual/reference/program/mongodump/
Restore: https://docs.mongodb.com/manual/reference/program/mongorestore/
Edit: Removed --db flag from restore command as it is not supported when used with --archive.
mongodump --archive=/path/to/archive.gz --gzip will actually create an archive which interleaves the data from all your collections in a single file. Each block of data is then compressed using gzip.
That file cannot be read by any tool other than mongorestore, and you need to use identical flags (i.e. mongorestore --archive=/path/to/archive.gz --gzip) to restore your dump on another deployment.
The resulting archive cannot be extracted using gunzip or tar.
If you need to change the target namespace, use the --nsFrom, --nsTo and --nsInclude options to restore under a different database name.
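For example, a sketch of restoring the same archive under a different database name (the target name StagingDB is a made-up placeholder):
# Hypothetical example: restore "Database" as "StagingDB", keeping
# collection names unchanged.
mongorestore --gzip --archive=pathDatabase.gz --nsFrom='Database.*' --nsTo='StagingDB.*'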

How to migrate RethinkDb into MongoDb?

My application is using RethinkDB. Everything is running fine, but a new requirement means we need to migrate the DB into MongoDB.
Is this possible? How do I migrate the tables/collections, data, indexes, etc?
How about blob types, auto-increments, and IDs?
Thanks!
Is this possible? How do I migrate the tables/collections, data, indexes, etc?
One way to migrate data from RethinkDB to MongoDB is to export data from RethinkDB using the rethinkdb dump command, and then use mongoimport to import into MongoDB. For example:
rethinkdb dump -e dbname.tableName
This would generate an archive file:
rethinkdb_dump_<datetime>.tar.gz
After uncompressing the archive file, you can then use mongoimport as below:
mongoimport --jsonArray --db dbName --collection tableName ./rethinkdb_dump_<datetime>/dbName/tableName.json
Unfortunately, the index formats of RethinkDB and MongoDB are quite different. The indexes are stored within the same archive file:
./rethinkdb_dump_<datetime>/dbName/tableName.info
You can, however, write a Python script to read the info file and use the MongoDB Python driver (PyMongo) to create the indexes in MongoDB. See also the create_indexes() method for more information.
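Alternatively, once you have read the indexed field names out of the .info file, the equivalent index can be created by hand from the mongo shell; a minimal sketch, where the database, collection, and field names are made-up examples:
# Hypothetical example: recreate a RethinkDB secondary index on "timestamp"
# (a field name you would read from tableName.info) as a MongoDB index.
mongo dbName --eval 'db.tableName.createIndex({ timestamp: 1 })'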
One of the reasons for suggesting Python is that RethinkDB also has an official Python client driver. So technically, you could also skip the export stage and write a script that connects your RethinkDB directly to MongoDB.

How to add Files on my computer into local mongodb collection

I have a folder on my computer containing a list of files in JSON and BSON format to be added into my local MongoDB. The name of the DB is sahaj_dev.
I have to add all these files to my sahaj_dev database as collections of that database.
How can I do it? I am not sure whether to use mongoimport or mongorestore. I am new to MongoDB. Kindly help me out with the command to be used. Thanks.
I found the answer. I have to use the mongorestore command. Its syntax is:
mongorestore -d <db_name> <location of the folder>
This will restore all the files in the folder and create the respective collections in the local MongoDB.
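For instance, with the database from the question (the dump folder path is a made-up example):
# Hypothetical example: restore the dumped folder into the sahaj_dev DB.
mongorestore -d sahaj_dev /path/to/dump/sahaj_dev
Note that mongorestore reads the .bson files together with their .metadata.json companions; standalone JSON data files would need mongoimport instead.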

Exporting specific gridfs files from MongoDB

I have a large number of files in the database. I need to take a backup of a specific week's files and export them to another database. I can dump fs.files based on uploadDate:
./mongodump --port <port> --db <Database> --collection fs.files --query <json> --out <destination>
How can I export the specific fs.chunks data while iterating fs.files in the shell?
Here's a blog post and the gist for a bash script that will do what's asked for here. Run this script from the command line of the MongoDB server. It will loop through the fs.files collection, exporting the files using the mongofiles utility that is included with MongoDB.
Original Blog Post
Gist
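For reference, a self-contained sketch of that general approach, where the database names and date range are made-up placeholders:
#!/usr/bin/env bash
# Hypothetical sketch: list one week's filenames from fs.files, then copy
# each file to another database via mongofiles get/put.
mongo --quiet sourceDb --eval '
  db.fs.files.find({
    uploadDate: { $gte: ISODate("2020-01-06"), $lt: ISODate("2020-01-13") }
  }).forEach(function (f) { print(f.filename); })
' | while read -r name; do
  mongofiles --db sourceDb get "$name"   # download into the current directory
  mongofiles --db targetDb put "$name"   # re-upload into the target database
done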

How to fetch a file using objectid in mongodb?

I have a file named myFile.png in the image database. Is there a way to fetch it from the database using its ObjectId rather than its filename? I currently use
mongofiles -db image -u db -p pwd get myFile.png
to get my image file.
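mongofiles also has a get_id command that takes the _id of the fs.files document instead of the filename; a minimal sketch, where the ObjectId value is a placeholder (depending on the tool version, the extended-JSON form '{"$oid": "..."}' may be required instead):
# Hypothetical example: substitute the actual _id of myFile.png's
# fs.files document for the placeholder value.
mongofiles -db image -u db -p pwd get_id 'ObjectId("56feac751f417d0357e7140f")'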