We have a newline-delimited JSON file saved in a public bucket in GCS:
It shows as public to the internet. Hopefully one of the following three links finds the JSON on your end:
https://storage.googleapis.com/cbb-staging/division_info
https://storage.cloud.google.com/cbb-staging/division_info
gs://cbb-staging/division_info
We are trying to import this JSON into our MongoDB cluster using mongoimport. Our MongoDB URI string is correct; however, we are struggling to point mongoimport at the file in GCS.
mongoimport --uri "mongodb+srv://UserName:Password@our-cluster.abcde.gcp.mongodb.net/dbname" --collection staging__text_export --drop --file https://storage.googleapis.com/cbb-staging/division_info
mongoimport --uri "mongodb+srv://UserName:Password@our-cluster.abcde.gcp.mongodb.net/dbname" --collection staging__text_export --drop --file https://storage.cloud.google.com/cbb-staging/division_info
mongoimport --uri "mongodb+srv://UserName:Password@our-cluster.abcde.gcp.mongodb.net/dbname" --collection staging__text_export --drop --file gs://cbb-staging/division_info
All three of these return a similar error: Failed: open https://storage.cloud.google.com/cbb-staging/division_info.json: no such file or directory. We tried adding .json to the end of the file names, and it did not help.
Is this possible to do?
Here's a screenshot from MongoDB Atlas Support confirming what Rajdeep has said in the comments.
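As confirmed above, mongoimport cannot fetch remote URLs itself; it only reads a local file or standard input. One workaround is to stream the public object into mongoimport's stdin. A minimal sketch, using the bucket URL from the question and placeholder credentials:

```shell
# Omitting --file makes mongoimport read newline-delimited JSON from stdin,
# so the GCS object never has to touch the local disk.
curl -sf "https://storage.googleapis.com/cbb-staging/division_info" \
  | mongoimport --uri "mongodb+srv://UserName:Password@our-cluster.abcde.gcp.mongodb.net/dbname" \
      --collection staging__text_export --drop
```

Alternatively, download the object first (for example with gsutil cp gs://cbb-staging/division_info .) and pass the resulting local file to --file.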
I successfully imported a CSV file into MongoDB using the following CLI command:
mongoimport --host myhost --ssl --username my_username --password my_password --authenticationDatabase admin --db my_db --collection addresses --type csv --fields Geocords,Display_lat,Display_lng,street_address,added_by_user --file exportdir2/MFCustomerList.csv
Now, the object ID field is not something I have in my file. I expected each row to get its own object ID, created automatically as I upload. However, instead of just having a plain string id in there, like _id: "kalaklakp", each row has something like _id: ObjectId("gaafagafagafs").
This is causing major problems, especially for my Parse instance, which is not able to identify these objects as valid.
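For context, this is mongoimport's default behavior: when the CSV has no _id column, each document gets a generated ObjectId. If Parse needs plain string ids, one workaround is to supply an _id column yourself; mongoimport then stores that CSV value as the document's _id. A minimal sketch, where the string ids and the data rows are hypothetical and the host and credentials are the placeholders from the question:

```shell
# CSV whose first column will become each document's _id as a plain string.
cat > addresses.csv <<'EOF'
addr-0001,40.71,-74.00
addr-0002,34.05,-118.24
EOF
# Listing _id first in --fields maps column 1 onto _id, so no ObjectId
# is generated for these rows.
mongoimport --host myhost --ssl --username my_username --password my_password \
  --authenticationDatabase admin --db my_db --collection addresses --type csv \
  --fields _id,Display_lat,Display_lng --file addresses.csv
```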
I am following the example given on MongoDB's website here, but I am running into trouble when trying to import sample data.
When running the command
mongoimport --db test --collection restaurants --drop --file primer-dataset.json
I get the error:
Failed: open primer-dataset.json: The system cannot find the file specified
The problem is, I am not sure what directory MongoDB expects this file to be in. I tried placing it in data/db, but that did not work. Note that I am only using default settings.
I know this is a somewhat trivial question and I feel stupid for asking it, but I cannot find documentation on this anywhere. Where does MongoDB expect import files to be?
mongoimport expects the file to be in the directory from which you are running the command, i.e. the current working directory.
If you place your file under data/db, add MongoDB's bin directory to your PATH environment variable and execute the command from the data/db directory.
Additionally, if you have security enabled on your MongoDB deployment, you need to execute the command as below:
mongoimport --username admin --password password --db test --collection restaurants --drop --file primer-dataset.json
Here, admin is the user authorized to perform db operations on the test database, and restaurants is the collection name.
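The working-directory point above can be demonstrated without a running server. A small sketch, using /tmp/import-demo as a hypothetical directory:

```shell
# mongoimport resolves a relative --file path against the shell's current
# working directory, not against MongoDB's data directory.
mkdir -p /tmp/import-demo && cd /tmp/import-demo
printf '{"name": "Example Restaurant"}\n' > primer-dataset.json
# From this directory, the following would now find the file:
#   mongoimport --db test --collection restaurants --drop --file primer-dataset.json
ls primer-dataset.json   # prints primer-dataset.json
```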
For Windows!
Save the file using Notepad++ in .json format in MongoDB/bin, where the mongoimport command is present.
Plain Notepad has trouble doing this.
It happened to me as well. The issue was that though the file was visible as restaurants.json actually the file was restaurants.json.json (since saved in JSON format). The issue was resolved after properly changing the name.
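The hidden-extension trap above is easy to check from the command line. A small sketch that simulates the double extension in a hypothetical /tmp/ext-demo directory and fixes it:

```shell
# Windows Explorer hides known extensions, so a file saved as JSON from an
# editor can display as "restaurants.json" while really being
# "restaurants.json.json" on disk.
mkdir -p /tmp/ext-demo && cd /tmp/ext-demo
printf '{}\n' > restaurants.json.json   # simulate the double extension
ls restaurants.json*                    # reveals the real name
mv restaurants.json.json restaurants.json
ls restaurants.json*                    # now the expected name
```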
I had the same trouble as you. Check your path to the file; mongoimport.exe and your file may be in different folders.
Use mongoimport -d test1 -c restaraunts companies.json to import into MongoDB.
Check the filename extension, and make sure it's a ".json" file.
After this, I successfully ran the command:
mongoimport --db test --collection restaurants --drop --file [path\to\Json file]
In my case, I removed the --drop parameter and it worked perfectly. I guess it was throwing this error:
Failed: open path/file-name.json: The system cannot find the file specified.
because the collection it wants to drop did not exist, since I had not created one before.
You can copy your JSON file into C:\Windows\System32 (the default working directory of an elevated command prompt) and run this command in cmd, or simply cd to the folder containing the file first:
mongoimport --db test --collection mongotest --type json --file yournamefile.json
I'm facing a problem importing a CSV file into my db. I tried this command:
mongoimport --db meteor -h localhost:3001 -c TollPlaza -d meteor --headerline --type csv --file TollPlaza.csv
I referred to this question but am still having a problem.
Try this command, separating the host and port:
mongoimport --db meteor --host localhost --port 3001 -c TollPlaza --headerline --type csv --file TollPlaza.csv
The query is perfectly fine. Follow these instructions:
1. Run the meteor project.
2. Open one more command prompt in the same location.
3. Make sure the CSV file is present in the same directory.
4. Run the query.
5. Check the DB (show collections).
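The steps above can be sketched as two terminal sessions (the project path is hypothetical):

```shell
# Terminal 1: run the Meteor project, which starts its bundled MongoDB
# on port 3001.
cd ~/my-meteor-app && meteor

# Terminal 2: from the directory that contains TollPlaza.csv, run the
# import against Meteor's MongoDB, then verify the collection exists.
cd ~/my-meteor-app && ls TollPlaza.csv
mongoimport --db meteor --host localhost --port 3001 -c TollPlaza \
  --headerline --type csv --file TollPlaza.csv
mongo localhost:3001/meteor --eval "db.getCollectionNames()"
```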
I want to export all the data from one of my collections in MongoDB to a CSV file.
This is what my db looks like:
"Tags",
"checkouts",
"imports",
"products"
I am trying to export "checkouts".
I am connected to the db through the terminal and I tried the following commands:
mongoexport --db nameofdatabase --collection checkouts --out coll.json
mongoexport --db nameofdatabase --collection checkouts --type=csv --fields --out /opt/backups/contacts.csv
The error message I get after running both commands is: "Unexpected identifier"
mongoexport cannot be used inside the mongo shell. It is a terminal command, just like mongo itself, so run it from your operating system's shell instead.
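For example, from the operating system's terminal rather than the mongo shell, and noting that --type=csv also requires an explicit field list (the field names below are hypothetical guesses for the checkouts collection):

```shell
# JSON export needs no field list:
mongoexport --db nameofdatabase --collection checkouts --out coll.json
# CSV export requires --fields (or --fieldFile) naming the columns to write:
mongoexport --db nameofdatabase --collection checkouts --type=csv \
  --fields _id,userId,total,createdAt --out /opt/backups/contacts.csv
```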