Cannot insert into mongo using mongoImport - mongodb

I am trying to insert a JSON file into MongoDB using mongoimport.
This is my first attempt:
mongoimport --file someorders.json --type json --db test --collection someorders
I get the following error:
exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting '{': offset:0
Then I tried:
mongoimport --file someorders.json --type json --db test --collection someorders --jsonArray
And then I get the following error:
exception:JSONArray file too large
warning: log line attempted (16384k) over max size(10k), printing beginning and end ... {
... prints some of the json
ERROR: encountered 1 error(s)

Related

How to import a large json file into Mongodb?

I have a big JSON file (~ 300GB) which is composed of many dicts, and I am trying to import this file into MongoDB. The method that I tried was mongoimport, using this:
mongoimport --db <DB_NAME> --collection <COLLECTION_NAME> --file <FILE_DIRECTORY> --jsonArray --batchSize 1
but after some insertions it fails with an error like this: Failed: error processing document #89602: unexpected EOF. I have no idea why this happens.
Are there any other methods to make it work?
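Not an answer from this thread, but a common workaround for array files too big for --jsonArray is to convert them to newline-delimited JSON first, which mongoimport streams one document per line without the --jsonArray flag. A minimal sketch, assuming the file is a single top-level JSON array; the function name and chunk size are illustrative:

```python
import json

def json_array_to_ndjson(src, dst, chunk_size=1 << 20):
    """Stream a file containing one big JSON array into newline-delimited
    JSON, without loading the whole array into memory at once."""
    decoder = json.JSONDecoder()
    buf = ""
    count = 0
    with open(src, encoding="utf-8") as fin, \
         open(dst, "w", encoding="utf-8") as fout:
        while True:
            chunk = fin.read(chunk_size)
            buf += chunk
            while True:
                buf = buf.lstrip()
                # Skip the array opener and the commas between documents.
                while buf and buf[0] in "[,":
                    buf = buf[1:].lstrip()
                if not buf or buf.startswith("]"):
                    break
                try:
                    obj, end = decoder.raw_decode(buf)
                except json.JSONDecodeError:
                    break  # document is split across chunks; read more
                fout.write(json.dumps(obj) + "\n")
                count += 1
                buf = buf[end:]
            if not chunk:  # end of file
                if buf.strip() not in ("", "]"):
                    raise ValueError("trailing data is not valid JSON")
                break
    return count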

Failed: line 1, column 134: extraneous " in field - error while importing a CSV file into MongoDB

I'm new to MongoDB, and while trying to import a CSV file using the command below, I get the error shown. The first line of my CSV file is as follows:
Source,"ID","Date","Timestamp","Author","Author ID","Longitude","Latitude","Likes","Comments","Retweets","Text"
Can anyone kindly help me? Thank you in advance!
C:\Program Files\MongoDB\Server\3.2\bin>mongoimport --db mydatabase
--collection sites --type csv --headerline --file C:\mydata\sample.csv
Error output:
2016-07-06T16:07:51.395+0200 Failed: line 1, column 134:
extraneous " in field
2016-07-06T16:07:51.397+0200 imported 0 documents
It is better to convert the CSV file into JSON format, or to import JSON instead of CSV.
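The conversion itself can be scripted with the standard library. A minimal Python sketch (the function name and file paths are placeholders); csv.DictReader handles quoted fields, which may sidestep the extraneous-quote parse error:

```python
import csv
import json

def csv_to_json_array(csv_path, json_path):
    """Read a CSV file with a header row and write it out as a JSON
    array suitable for mongoimport --jsonArray."""
    with open(csv_path, newline="", encoding="utf-8") as fin:
        # The header row becomes the field names of each document.
        rows = list(csv.DictReader(fin))
    with open(json_path, "w", encoding="utf-8") as fout:
        json.dump(rows, fout)
    return len(rows)
```

The resulting file can then be loaded with mongoimport --db mydatabase --collection sites --file sample.json --jsonArray.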
To import JSON as a collection of objects:
JSON1:
[{"project":"project_1","status":"yes","priority":7},{"project":"project_2","status":"yes","priority":7},{"project":"project_3","status":"yes","priority":7}]
mongoimport --db myDB --collection myCollection --file battles.json --jsonArray
To import as a single object (note: no --jsonArray):
JSON2:
{"mydata":[{"project":"project_1","status":"yes","priority":7},{"project":"project_2","status":"yes","priority":7},{"project":"project_3","status":"yes","priority":7}]}
mongoimport --db myDB --collection myCollection --file battles.json
Ref:
Failed: error unmarshaling bytes on document #0: JSON decoder out of sync - data changing underfoot?
jsonArray import
error invalid character

Too many positional options mongoimport Error

I have a comma-separated CSV file with data for the following fields: Train_ID, Train_Number, Train_Name
I wrote the following command to import the data from the CSV into MongoDB:
mongoimport --db test --collection csvimporting --type csv --file "C:/Darshil Babel/Desktop/sample_data.csv" --fields Train_ID,Train_Number,Train_Name
It is giving following error:
Error parsing command line: too many positional options
What am I doing wrong?

mongoimport error: Unexpected identifier

When I try to import my JSON data file into my local instance of MongoDB, I get an error. The command that I am using is shown below.
> mongoimport --db cities --collection zips --type json --file C:/MongoDB/data/zips.json
This is the error that I get.
2014-11-29T20:27:33.803-0800 SyntaxError: Unexpected identifier
What seems to be the problem here?
I just found out that mongoimport is run from the terminal/command line (cmd), NOT from within the mongo shell.

How to import a large JSON file into MongoDB using mongoimport?

I am trying to import a large JSON data set into MongoDB using mongoimport.
mongoimport --db test --collection sam1 --file 1234.json --jsonArray
error:
2014-07-02T15:57:16.406+0530 error: object to insert too large
2014-07-02T15:57:16.406+0530 tried to import 1 objects
Try adding the --batchSize 1 option:
mongoimport --db test --collection sam1 --file 1234.json --batchSize 1
The data will then be parsed and inserted into the database one batch at a time.