I tried to import a simple JSON file using mongoimport and I get the following error:
PER-MacBook-Pro:/AJ$ mongoimport --db test --collection samplePM --file /users/AJ/Documents/Development/ETLwork/Dummydata/Penguin_Players.json
2015-06-16T09:53:57.291-0400 connected to: localhost
2015-06-16T09:53:57.293-0400 Failed: error processing document #1: invalid character '\\' looking for beginning of object key string
2015-06-16T09:53:57.293-0400 imported 0 documents
The sample JSON file is as follows:
{
"position":"Right Wing",
"id":8465166,
"weight":200,
"height":"6' 0\"",
"imageUrl":"http://1.cdn.nhle.com/photos/mugs/8465166.jpg",
"birthplace":"Seria, BRN",
"age":37,
"name":"Craig Adams",
"birthdate":"April 26, 1977",
"number":27
},
{
"position":"Right Wing",
"id":8475761,
"weight":195,
"height":"6' 2\"",
"imageUrl":"http://1.cdn.nhle.com/photos/mugs/8475761.jpg",
"birthplace":"Gardena, CA, USA",
"age":23,
"name":"Beau Bennett",
"birthdate":"November 27, 1991",
"number":19
}
Am I doing something wrong here?
I was able to get this working by using the --jsonArray flag, giving the full file path, and modifying the file to add a bracket at the beginning and at the end:
[{"my":"json","file":"imported"},{"my":"cool","file":"succeeded"}]
mongoimport --db myCoolDb --collection myCoolColl --file /path/to/my/imported/file.json --jsonArray
The comment about non-UTF-8 characters was helpful.
It seems like there is a problem with creating JSON documents using TextEdit on a Mac. I could not find these non-UTF-8 characters, but I created the same file using vi test.json in the Mac shell, pasted the contents, saved the file, and used mongoimport. It works now.
Thanks
I got the same error while importing JSON data. Instead, use the .bson data with the mongorestore command:
mongorestore -d <db> -c <collection> <.bson-file>
Use --drop if you want to drop the existing data in the collection.
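For example, reusing the database and collection names from the first question above (the .bson path is a placeholder):
mongorestore --drop -d test -c samplePM /path/to/dump/test/samplePM.bson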
I was getting Failed: error processing document #112783: invalid character ',' looking for beginning of value because one of my objects was formatted improperly. Notice how "psychosurgery" is missing curly braces:
{
"word": "psychosurgeons",
"firstLetter": "p"
}
" psychosurgery",
{
"word": "psychosurgical",
"firstLetter": "p"
}
Since there are over 600,000 lines in the file I'm trying to import, this would have been tough to find manually.
So I ran the same mongoimport command with full verbosity (-vvvvv) enabled, and the script stopped right on the problematic object. See mongoimport --help for more info.
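For example (the database, collection, and file names here are placeholders):
mongoimport --db mydb --collection words --file words.json -vvvvv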
Hope this helps someone.
I got the same problem because I used TextEdit on the Mac. The solution was to convert the file to plain text. Make sure the extension ends in .json, because TextEdit wants to put .txt at the end.
Just open a new text file and copy all the data into it. While saving the text file, select 'UTF-8' in the Encoding drop-down, and then change the extension to .json or .csv by renaming.
Then import the file as usual.
If you used mongoexport to export, use mongoimport to import;
or, if you used mongodump to export, use mongorestore to import.
Because I exported with mongodump and tried to import with mongoimport, I was getting error processing document #1: invalid character '\u008c'. After I tried mongorestore, it was fine.
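For example, a matched pair of commands might look like this (the database, collection, and paths are placeholders):
mongodump -d mydb -c mycoll -o dump/
mongorestore -d mydb -c mycoll dump/mydb/mycoll.bson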
Related
I want to test MongoDB as a possible alternative to my file system set-up. I have 3 folders, two hold JSON data (so no problem there), but one holds .lic and .licx files. I simply want to store and retrieve these files easily from a MongoDB collection in a database. I'm testing on the command line... How would I insert a .licx file into a collection that is in a database?
I've tried a command-line argument:
mongoimport --db license-server --collection licenses --type BSON --file C:\Users\<myname>\Desktop\<projectname>\private\licenses\<filename>.licx
but I'm getting error validating settings: unknown type bson as an error for the command.
I've also read a bit about GridFS, but found no clear example of how to use it.
I expect the .licx file to be inserted into the collection with an id so I can retrieve it later.
To store a file larger than 16MB, or one with an extension like .licx, run the command
mongofiles -d license-server put <filename(includingfullpath)>.licx
This will store the file in the fs.files and fs.chunks collections within your database.
To retrieve the file on the command line, use
mongofiles -d license-server get <filename(includingfullpath)>.licx
Additional documentation can be found here:
https://docs.mongodb.com/manual/reference/program/mongofiles/#bin.mongofiles
I'm facing an issue while trying to import .json files that were exported using the mongoexport command.
The generated .json files contain the character $ in some fields, such as $oid and $numberLong:
{"_id":{"$oid":"55aff0e7b3bdf92b314b6fa6"},"activated":true,"authRole":"USER","authToken":"5bdad308-4a11-4890-8c3e-82c29530f1bc","birthDate":{"$date":"2015-08-06T03:00:00.000Z"},"comercialPhone":"99999994","email":"test#mail.com","mobilePhone":"99999999","name":"Test Test","password":"$2a$10$y","validationToken":"b2cd0d71-cb47-405d-bf7f-e46e1a8706e4","version":{"$numberLong":"35"}}
However, this format is not accepted when importing the files. It seems to be strict mode, but I'd like to generate JSON files using the shell format, which shows $oid as ObjectId.
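For reference, a shell-mode rendering of the document above would look roughly like this (abbreviated):
{"_id": ObjectId("55aff0e7b3bdf92b314b6fa6"), "activated": true, "birthDate": ISODate("2015-08-06T03:00:00.000Z"), ..., "version": NumberLong(35)}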
Is there any workaround for this?
Thanks!
Is there a way from the command line that I can dump a MongoDB database to a JavaScript file that can be interpreted by the mongo shell? I am looking for a way to do exactly what the RockMongo Export function does, but I need to be able to call it from a command-line script. I've looked everywhere for something that does this, but all I can seem to find is mongoexport and mongodump, which don't seem to do what I want, as these just create JSON or BSON files.
The reason I need to do this is because codeception's MongoDb module requires a file in this format to restore the database after each test. I want to write a script to automate this process so that I don't have to constantly go through RockMongo and generate the dump.
Thanks in advance!
In case anyone else happens to find this, I finally found a solution that works for my scenario. I had to take Markus' suggestion and kind of roll my own solution, but I discovered a MongoDB tool called bsondump that made things much easier.
So in my script I first use mongodump to create a BSON file of my collection:
mongodump --db mydb --collection mycollection --out - > mycollection.bson
I then use bsondump to convert that into JSON that can be used in shell mode:
bsondump mycollection.bson > mycollection.json
Finally, since I'm using PHP, my PHP script loops through that JSON file and wraps each line in an insert statement:
// read the dump, one JSON document per line
$lines = file('mycollection.json');
$inserts = [];
foreach ($lines as $line) {
    // wrap each document in a mongo shell insert statement
    $inserts[] = 'db.getCollection("mycollection").insert(' . trim($line) . ');' . PHP_EOL;
}
// file_put_contents() joins the array elements when writing
file_put_contents('output.js', $inserts);
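The resulting output.js can then be replayed through the mongo shell if needed, e.g. (the database name here is a placeholder):
mongo mydb output.js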
I'm guessing there is probably a better way to do this, but so far this seems to be working nicely for me. Thanks for steering me in the right direction Markus!
Is there any method to insert a CSV file into MongoDB other than using the mongoimport tool? I need to perform bulk insertions in MongoDB. I referred to some sites and found that there are some issues with using the mongoimport tool for importing large sets of data. Please enlighten me on how to insert a CSV into MongoDB from an application directly. I need to know if there are any methods or wrappers in C++ or Java for inserting CSV into MongoDB. Thanks in advance.
The MongoDB shell is able to evaluate JavaScript, so you could write your own parser in JavaScript, load your script into the shell, and run it; see the sketch below. If you don't like JavaScript, there are a lot of drivers for other programming languages with which you could read your file and use the driver to insert your data into the database.
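As a minimal sketch of that idea, assuming a simple comma-separated file with a header line and no quoted fields (the file path and collection name are placeholders), something like the following could be run with mongo mydb csvimport.js:
// csvimport.js - a naive CSV loader for the mongo shell (no support for quoted fields)
var text = cat('/path/to/test.csv');   // cat() is a built-in mongo shell helper
var lines = text.split('\n');
var headers = lines[0].split(',').map(function (h) { return h.trim(); });
for (var i = 1; i < lines.length; i++) {
    if (lines[i].trim() === '') continue;          // skip blank lines
    var fields = lines[i].split(',');
    var doc = {};
    for (var j = 0; j < headers.length; j++) {
        doc[headers[j]] = fields[j].trim();
    }
    db.mycollection.insert(doc);                   // collection name is a placeholder
}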
You can use the bulk write API, available since MongoDB 2.6:
read the file,
iterate over it and save the documents to a variable,
then call bulk.execute(), as sketched below.
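A minimal sketch of those steps in the mongo shell, assuming docs is an array of documents already parsed from your file (the collection name is a placeholder):
var bulk = db.mycollection.initializeUnorderedBulkOp();   // Bulk API, MongoDB 2.6+
docs.forEach(function (doc) {
    bulk.insert(doc);   // queue each document
});
bulk.execute();         // send them as one batched write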
mongoimport -c 'check' --db 'test' --type csv --file test.csv --headerline
Use mongoimport --help for more options.
(edit: the -h option is for the host)
test.csv
name, age, sex
John, 23, M
I am trying to insert a huge (~831M) file into a Mongo collection using mongoimport:
/Library/mongodb/bin/mongoimport --port 12345 -d staging -c collection < out.all.1
and see some errors like
exception:Failure parsing JSON string near: , 'Custome
and there are instances where I found some weird characters:
'CustomerCity': u'Wall \xa0'
'CustomerCity': u'La Ca\xc3\xb1ada Flintridge'
'CustomerCity': u'La Ca\xf1ada Flintridge'
How do I resolve these issues?
Thank you
I struck a similar problem where mongoimport gave errors about non-UTF-8 characters in a flat file I'd asked it to import. This Google Groups thread led me to try putting my source data file through iconv on the Unix command line to 'correct' non-UTF-8 characters, thus:
iconv -f ISO-8859-1 -t UTF-8 inputfile.txt > outputfile.txt
That solved the issue for me. I wonder if that approach might help you? While the error you're seeing is different, it's the odd characters that are messing up the JSON parsing, no?
One wonders, however, how those odd characters are ending up in your output data if you're generating it yourself. Perhaps you could filter them out in the code that generates the output?