mongoimport cannot find file in (public) GCS bucket - mongodb

We have a newline delimited JSON file saved in a public bucket in GCS:
It shows as public to the internet. Hopefully one of the following three links finds the JSON on your end:
https://storage.googleapis.com/cbb-staging/division_info
https://storage.cloud.google.com/cbb-staging/division_info
gs://cbb-staging/division_info
We are trying to import this JSON into our MongoDB cluster using mongoimport. Our MongoDB URI string is correct; however, we are struggling to point mongoimport at the file in GCS.
mongoimport --uri "mongodb+srv://UserName:Password#our-cluster.abcde.gcp.mongodb.net/dbname" --collection staging__text_export --drop --file https://storage.googleapis.com/cbb-staging/division_info
mongoimport --uri "mongodb+srv://UserName:Password#our-cluster.abcde.gcp.mongodb.net/dbname" --collection staging__text_export --drop --file https://storage.cloud.google.com/cbb-staging/division_info
mongoimport --uri "mongodb+srv://UserName:Password#our-cluster.abcde.gcp.mongodb.net/dbname" --collection staging__text_export --drop --file gs://cbb-staging/division_info
All three of these return a similar error: Failed: open https://storage.cloud.google.com/cbb-staging/division_info.json: no such file or directory. We tried adding .json to the end of the file names, and it did not help.
Is this possible to do?

Here's a screenshot from MongoDB Atlas Support confirming what Rajdeep has said in the comments: mongoimport reads only from the local filesystem (or standard input), so it cannot fetch a file from a URL or a GCS bucket directly.
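Since mongoimport falls back to reading standard input when --file is omitted, a common workaround (assuming the bucket object really is publicly readable) is to stream the object into mongoimport with curl. The credentials below are the placeholders from the question; substitute your own:

```shell
# Download the newline-delimited JSON from the public GCS bucket and
# pipe it straight into mongoimport; with no --file flag, mongoimport
# reads documents from stdin.
curl -sS "https://storage.googleapis.com/cbb-staging/division_info" \
  | mongoimport --uri "mongodb+srv://UserName:Password@our-cluster.abcde.gcp.mongodb.net/dbname" \
      --collection staging__text_export --drop
```

Incidentally, note that in a MongoDB connection string the credentials are separated from the host by @, not # as in the commands above; a # starts a URI fragment and would truncate the rest of the URI.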

Related

Is there any way to import JSON zip file into mongodb using mongoimport?

I have created a zip file from a large JSON file (containing a JSON array). I want to use this zip file with the mongoimport command. Is it possible to import this zip file into MongoDB using mongoimport?
COMMAND:
mongoimport --db test --collection inventory --authenticationDatabase admin --username <user> --password <password> --drop --file ~\downloads\inventory.crud.json.zip --jsonArray
OUTPUT:
Failed: error reading separator after document #1: bad JSON array format
Since this is a zip file, mongoimport does not find a valid JSON array. Is there a way to unzip the file as part of the mongoimport command?
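mongoimport has no flag to decompress archives itself, but because it reads standard input when --file is omitted, you can decompress to stdout and pipe. A sketch using unzip -p (this assumes the archive contains a single JSON file):

```shell
# -p extracts the archive member to stdout without writing a temp file;
# mongoimport then reads the JSON array from stdin instead of --file.
unzip -p ~/downloads/inventory.crud.json.zip \
  | mongoimport --db test --collection inventory \
      --authenticationDatabase admin --username <user> --password <password> \
      --drop --jsonArray
```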

Can't import data into mongo database using mongoimport

I can't import data into my existing database hosted on MongoDB Atlas. I installed Robomongo and connected it to Atlas for working with the cluster.
I created a new database jasper and a collection User in Robomongo, then
I created a user.json file in my project where my data is stored.
I followed the tutorial at https://docs.atlas.mongodb.com/import/mongoimport/ on how to use mongoimport with MongoDB.
Here is the command I'm typing in the terminal:
mongoimport --uri mongodb://Morty:<PASSWORD>#jasper-shard-00-00-mrihb.mongodb.net:27017/jasper?ssl=true&replicaSet=jasper-shard-0&authSource=admin --collection User --drop --file ./src/data/user.json --jsonArray
that give me an error:
[1] 40930
[2] 40931
-bash: --collection: command not found
[2]+ Done replicaSet=jasper-shard-0
KSC1-LMC-K00587:Interview-test-part-one marze$ 2017-10-15T10:38:35.209+0200 no collection specified
2017-10-15T10:38:35.209+0200 using filename '' as collection
2017-10-15T10:38:35.209+0200 error validating settings: invalid collection name: collection name cannot be an empty string
2017-10-15T10:38:35.209+0200 try 'mongoimport --help' for more information
If I run mongoimport against localhost, it works perfectly.
Where could the problem be?
Solution:
Quote the URI parameter. Without quotes, the shell interprets each & in the connection string as a background-job operator, so the command is split into pieces (hence the [1] 40930 and [2] 40931 job numbers and the --collection: command not found error), and mongoimport never receives the collection argument.
mongoimport --uri "mongodb://Morty:<PASSWORD>#jasper-shard-00-00-mrihb.mongodb.net:27017/jasper?ssl=true&replicaSet=jasper-shard-0&authSource=admin" --collection User --drop --file ./src/data/user.json --jsonArray

I want to import the json file only if they don't exist

I am using mongo 3.4
I want to import JSON files (each containing a JSON array) into mongod using a bash script, and I want to import documents only if they don't already exist. I tried --upsert, but it does not work.
Is there an easy way to do this? Thanks
mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray --upsert
mongoimport -d dbName -c collectionName jsonFile.json -vvvvv
Even though the output of mongoimport says that n objects were imported, the existing document with the same data has not been overwritten.
If you use --upsert, it will update the existing document.
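Since MongoDB 3.4, mongoimport's behavior on duplicates is controlled by the --mode option. A sketch of both behaviors (the field name customId below is only an illustration; substitute whatever stable identifier your documents carry):

```shell
# Default mode (insert): documents whose _id already exists are reported
# as duplicate-key errors and skipped, so only new documents are added.
mongoimport --db dbName --collection collectionName \
  --file fileName.json --jsonArray --mode insert

# --mode upsert instead replaces matching existing documents; with
# --upsertFields you can match on fields other than _id (here the
# hypothetical field "customId").
mongoimport --db dbName --collection collectionName \
  --file fileName.json --jsonArray --mode upsert --upsertFields customId
```

So for "import only if they don't exist", the default insert mode already does what you want; --upsert is the opposite, deliberately overwriting matches.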

Cannot import example dataset (the system cannot find the specified file)

I am following the example given on MongoDB's website, but I am running into trouble when trying to import the sample data.
When running the command
mongoimport --db test --collection restaurants --drop --file primer-dataset.json
I get the error:
Failed: open primer-dataset.json: The system cannot find the file specified
The problem is, I am not sure what directory MongoDB expects this file to be in. I tried placing it in data/db, but that did not work. Note that I am only using default settings.
I know this is a somewhat trivial question and I feel stupid for asking it, but I cannot find documentation on this anywhere. Where does MongoDB expect import files to be?
mongoimport expects the file to be in the directory from which you run the mongoimport command.
If you place your file under data/db, add the MongoDB bin directory to your PATH environment variable and execute the command from the data/db directory.
Additionally, if you have security enabled for your MongoDB instance, you need to execute the command as below:
mongoimport --username admin --password password --db test --collection restaurants --drop --file primer-dataset.json
Here admin is a user authorized to perform db operations on the test database, and restaurants is the collection name.
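To make the working-directory dependence explicit, either change into the directory that holds the file before running the command, or pass an absolute path so the working directory no longer matters (~/Downloads below is just an example location):

```shell
# Option 1: run mongoimport from the directory containing the file.
cd ~/Downloads
mongoimport --db test --collection restaurants --drop --file primer-dataset.json

# Option 2: pass an absolute path to --file from anywhere.
mongoimport --db test --collection restaurants --drop \
  --file ~/Downloads/primer-dataset.json
```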
For Windows:
Save the file using Notepad++ in .json format in MongoDB/bin, where the mongoimport binary is located.
Plain Notepad has trouble doing this.
It happened to me as well. The issue was that although the file was displayed as restaurants.json, it was actually named restaurants.json.json (a second extension added when saving in JSON format). The issue was resolved after renaming the file properly.
I had the same trouble as you: check the path to your file, since mongoimport.exe and your file may be in different folders.
Use mongoimport -d test1 -c restaraunts companies.json to import into MongoDB.
Check the filename extension and make sure it's a ".json" file.
After this, I successfully ran the command:
mongoimport --db test --collection restaurants --drop --file [path\to\Json file]
In my case, I removed the --drop parameter and it worked perfectly. I guess it throws this error:
Failed: open path/file-name.json: The system cannot find the file specified.
because the collection it wants to drop is not available, since I had not created any beforehand.
Another option is to copy your JSON file into C:\Windows\System32 and run this command in cmd:
mongoimport --db test --collection mongotest --type json --file yournamefile.json

MongoDB import error assertion 9998

I seem to keep getting this error whenever I try to import anything.
In terminal I input:
name:~ computer$ mongoimport --db users --collection contacts --type csv --file /Users/computer/Desktop/ftse100.csv
connected to: 127.0.0.1
assertion: 9998 you need to specify fields
I wouldn't know what else to try. I tried adding --field after this command, but I just get the help information.
As per the MongoDB docs:
--fields <field1[,field2]>, -f
Specify a comma-separated list of field names when importing CSV or TSV files that do not have field names in the first (i.e. header) line of the file.
mongoimport --db users --collection contacts --type csv --file /Users/computer/Desktop/ftse100.csv --fields field1,field2,field3
As per your question, there is a typo: the option is --fields, not --field.
In 2.4.6, mongoimport does not find the header in CSV files that I make, with or without double-quote boundaries.
If I chop off the header line and supply that same text to the -f or --fields option, my files import fine.
If you want to import all columns, use the --headerline option instead of --fields.
In your case it would be:
mongoimport --db users --collection contacts --type csv --headerline --file /Users/computer/Desktop/ftse100.csv