mongoimport Error, Failed: invalid JSON input - mongodb

I have many MongoDB dump files in the format below:
[
    {
        "_id" : ObjectId("35d455de983c0e6a53ea0848"),
        "createdAt" : ISODate("2019-12-05T23:25:04.347+0000"),
        "__v" : NumberInt(0)
    },
    {
        "_id" : ObjectId("1ecbe0f75df8ccd52a7b1662"),
        "createdAt" : ISODate("2019-12-17T12:40:53.521+0000"),
        "__v" : NumberInt(0)
    }
]
I can't import these files; mongoimport rejects the format. I ran:
mongoimport --db DATABASENAME --collection COLLECTIONNAME --file filename.json --jsonArray
and it fails with:
Failed: invalid JSON input. Position: 16. Character: O
Is there any other way to import those files?
If not, how can I convert them to be imported?

You can use the --legacy option of mongoimport to import files in this format, i.e. with ObjectId("an oid"), NumberInt(an int), new Date("an iso date") or any other type notation available in MongoDB Extended JSON v1: https://docs.mongodb.com/manual/reference/mongodb-extended-json-v1/
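For example, re-running the question's command with the flag added (requires a database-tools release that supports --legacy):
mongoimport --db DATABASENAME --collection COLLECTIONNAME --file filename.json --jsonArray --legacy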

Solution 1 (semi-auto):
mongoimport only accepts plain JSON (or CSV), so remove the shell-specific wrappers such as ObjectId(...), ISODate(...) and NumberInt(...), keeping only the values inside the parentheses; a scripted version of this cleanup is sketched below.
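A rough sketch of that cleanup, assuming GNU or BSD sed with -E and only the three wrapper types shown in the question (note that the _id and date values end up as plain strings, not ObjectId/Date types):
sed -E -e 's/ObjectId\("([^"]*)"\)/"\1"/g' \
       -e 's/ISODate\("([^"]*)"\)/"\1"/g' \
       -e 's/NumberInt\(([0-9]+)\)/\1/g' filename.json > cleaned.json
mongoimport --db DATABASENAME --collection COLLECTIONNAME --file cleaned.json --jsonArray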
Solution 2 (auto):
Finally, I found an alternative solution.
We can't import such data even in MongoDB Compass.
But fortunately, we can import/export databases in various formats in Studio 3T for MongoDB.
I think Studio 3T is better than Compass, but it is not free, and after 30 days the Studio 3T trial version stops working.
I would be very thankful if someone could tell me a way to reset the Studio 3T trial.

invalid JSON input. Position: 96. Character: N
For me it was "N" for NaN, which I replaced with 0, and it worked.
Adding the --legacy option also worked, but I preferred to fix the file, as the legacy option did not work inside my Docker container.

I had the same problem; the error was about a property called '$init' that Mongoose had added automatically.
I removed all of those with Sublime Text and imported again with Studio 3T.

Related

Rename MongoDB 4.0.4 Database Name

I'm getting the error below:
db.copyDatabase("old_db","new_db","localhost:27017");
WARNING: db.copyDatabase is deprecated. See http://dochub.mongodb.org/core/copydb-clone-deprecation
{
    "note" : "Support for the copydb command has been deprecated. See http://dochub.mongodb.org/core/copydb-clone-deprecation",
    "ok" : 1
}
https://docs.mongodb.com/manual/release-notes/4.0-compatibility/#copydb-and-clone-commands
I went to that link, but it doesn't describe a solution for this.
Any leads would be appreciated.
Use mongodump and mongorestore or write a script using the drivers.
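A sketch of the dump/restore approach, assuming database tools recent enough to support --nsFrom/--nsTo (the database names come from the question):
mongodump --archive --db old_db | mongorestore --archive --nsFrom 'old_db.*' --nsTo 'new_db.*'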

How to export JSON from MongoDB using Robo 3T

I am using Robo 3T (formerly RoboMongo) which I connect to a MongoDB. What I need to do is this: There is a collection in that MongoDB. I want to export the data from that collection so that I can save it into a file.
I used the interface to open the data from the collection as text, did Ctrl + A, and pasted into a text file. However, I found that not all the data was copied, and the text contained many comments, which naturally breaks the JSON.
I am wondering if Robo 3T has a "Export As JSON" facility so that I can do a clean export.
Any pointers are appreciated!
A quick and dirty way: just write your query as db.getCollection('collection').find({}).toArray(), right-click the result, and choose Copy JSON. Paste the data into the editor of your choice.
You can use tojson to convert each record to JSON in a MongoDB shell script.
Run this script in RoboMongo:
var cursor = db.getCollection('foo').find({}, {});
while (cursor.hasNext()) {
    print(tojson(cursor.next()));
}
This prints all results as a sequence of JSON-like documents.
The result is not really JSON! Some types, such as dates and object IDs, are printed as JavaScript function calls, e.g., ISODate("2016-03-03T12:15:49.996Z").
Might not be very efficient for large result sets, but you can limit the query. Alternatively, you can use mongoexport.
Robomongo's shell functionality will solve the problem. In my case I needed a couple of columns in CSV format.
var cursor = db.getCollection('Member_details').find({Category: 'CUST'}, {CustomerId: 1, Name: 1, _id: 0});
while (cursor.hasNext()) {
    var record = cursor.next();
    print(record.CustomerId + "," + record.Name);
}
Output:
334, Harison
433, Rechard
453, Michel
533, Pal
you say "export to file" as in a spreadsheet? like to a .csv?
IMO this is the EASIEST way to do this in Robo 3T (formerly robomongo):
In the top right of the Robo 3T GUI there is a "View Results in text
mode" button, click it and copy everything
paste everything into this website: https://json-csv.com/
click the download button and now you have it in a spreadsheet.
hope this helps someone, as I wish Robo 3T had export capabilities
There are a few MongoDB GUIs out there, some of them have built-in support for data exporting. You'll find a comprehensive list of MongoDB GUIs at http://mongodb-tools.com
You've asked about exporting the results of your query, and not about exporting entire collections. Give 3T MongoChef MongoDB GUI a try, this tool has support for your specific use case.
Don't run this inside the mongo shell; run it at your system's command prompt, substituting your database name, collection name, and output file name for the placeholders:
mongoexport --db (Database name) --collection (Collection Name) --out (File name).json
It works for me.
I don't think Robomongo has such a feature, so you're better off using a MongoDB tool such as mongoexport for a specific collection:
http://docs.mongodb.org/manual/reference/program/mongoexport/#export-in-json-format
But if you are looking for a backup solution, it is better to use mongodump / mongorestore.
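A minimal sketch of the backup/restore pair (database name and output directory are hypothetical):
mongodump --db DATABASE_NAME --out /backups/mongo
mongorestore --db DATABASE_NAME /backups/mongo/DATABASE_NAME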
If you want to use mongoimport, you'll want to export this way:
db.getCollection('tables')
    .find({_id: 'q3hrnnoKu2mnCL7kE'})
    .forEach(function(x) { printjsononeline(x); });
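The one-document-per-line output can then be fed back to mongoimport without the --jsonArray flag (the file name here is hypothetical):
mongoimport --db DATABASENAME --collection tables --file export.json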
Expanding on Anish's answer, I wanted something I could apply to any query to automatically output all fields, versus having to define them within the print statement. It can probably be simplified, but this is something quick and dirty that works great:
var cursor = db.getCollection('foo')
    .find({}, {bar: 1, baz: 1, created_at: 1, updated_at: 1})
    .sort({created_at: -1, updated_at: -1});
while (cursor.hasNext()) {
    var record = cursor.next();
    var output = "";
    for (var i in record) {
        output += record[i] + ",";
    }
    output = output.substring(0, output.length - 1);
    print(output);
}
Using a Robomongo shell script:
// on the same db
var cursor = db.collectionname.find();
while (cursor.hasNext()) {
    var record = cursor.next();
    db.new_collectionname.save(record);
}
Using mongodb's export and import commands:
You can add the --jsonArray parameter/flag to your mongoexport command; this exports the result as a single JSON array. Then just specify the --jsonArray flag again when importing.
Alternatively, remove the starting and ending array brackets [] in the file; the modified exported file will then import with the mongoimport command without the --jsonArray flag.
More on Export here: https://docs.mongodb.org/manual/reference/program/mongoexport/#cmdoption--jsonArray
Import here:
https://docs.mongodb.org/manual/reference/program/mongoimport/#cmdoption--jsonArray
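For example (database, collection, and file names here are hypothetical):
mongoexport --db mydb --collection mycoll --jsonArray --out mycoll.json
mongoimport --db mydb --collection mycoll --jsonArray --file mycoll.json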
Solution:
mongoexport --db mock-server --collection api_defs --out childChoreRequest.json
Where:
database -> mock-server
collection name -> api_defs
output file name -> childChoreRequest.json
An extension to Florian Winter's answer, for people looking to generate a ready-to-execute query.
Drop and insertMany query using a cursor:
{
    // collection name
    var collection_name = 'foo';
    // query
    var cursor = db.getCollection(collection_name).find({});
    // drop collection and insert script
    print('db.' + collection_name + '.drop();');
    print('db.' + collection_name + '.insertMany([');
    // print documents
    while (cursor.hasNext()) {
        print(tojson(cursor.next()));
        if (cursor.hasNext()) // add trailing "," if not last item
            print(',');
    }
    // end script
    print(']);');
}
Its output will look like this:
db.foo.drop();
db.foo.insertMany([
{
    "_id" : ObjectId("abc"),
    "name" : "foo"
}
,
{
    "_id" : ObjectId("xyz"),
    "name" : "bar"
}
]);
I had this same issue; running a script in Robomongo (Robo 3T 1.1.1) also doesn't allow copying values, and there was no export option either.
The best way I could achieve this is with mongoexport. If MongoDB is installed on your local machine, you can use mongoexport to connect to a database on any server and extract data.
To connect to data on a remote server, with a CSV output file, run the following mongoexport in your command line:
mongoexport --host HOSTNAME --port PORT --username USERNAME --password "PASSWORD" --collection COLLECTION_NAME --db DATABASE_NAME --out OUTPUTFILE.csv --type=csv --fieldFile fields.txt
--fieldFile helps to extract only the desired columns. For example, the contents of fields.txt can be just:
userId
to extract only the values of the column 'userId'.
Data on a remote server, JSON output file:
mongoexport --host HOST_NAME --port PORT --username USERNAME --password "PASSWORD" --collection COLLECTION_NAME --db DATABASE_NAME --out OUTPUT.json
This extracts all fields into the JSON file.
Data on localhost (mongod should be running on localhost):
mongoexport --db DATABASE_NAME --collection COLLECTION --out OUTPUT.json
Reference: https://docs.mongodb.com/manual/reference/program/mongoexport/#use
Simple solution:
tostrictjson(db.getCollection(collection_name).find({}))
Note:
Other solutions are fine, but they might cause errors during import when your collection has types like Date, ObjectId, etc.
Happy hacking :)
I export using MongoDB Compass; you can export to CSV or JSON.
In the Compass menu select Collection -> Export Collection. You can select the fields to export and the file to write the result to, and you can specify a query beforehand.
1. Make your search.
2. Push the "view results in JSON mode" button.
3. Copy the result to Word.
4. Print the result from Word.

How do I import a CSV into a nested document in MongoDB

For example, how do I import this:
---> test.csv
tesingImport ,hi there
---> What I tried
./mongoimport -d channeladvisor -c products --type csv --file ./test.csv -fields Sku, a.b.c
Somehow I get this imported:
{
    "_id" : ObjectId("53e6eb0eeb5228df491a0f50"),
    "Sku" : "tesingImport",
    "a.b.c" : "hi there"
}
I could write a script to do this, but I wasn't sure if I could use mongoimport to make it faster.
Unfortunately, this is not really possible as of MongoDB 2.6. mongoimport is a very simple import program that's not designed to have all the features to support complex import use cases. I recommend that you write your own script to construct the proper documents and insert them into the database.
There's also an open issue about this capability, SERVER-3691, that you might want to comment on, watch, or upvote.
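A minimal sketch of such a script in the legacy mongo shell, assuming the two-column test.csv and the field names Sku and a.b.c from the question (the nesting and trimming rules here are guesses):
// run with: mongo channeladvisor import.js
var lines = cat('test.csv').split('\n');
lines.forEach(function(line) {
    if (line.trim() === '') return; // skip blank lines
    var parts = line.split(',');
    db.products.insert({
        Sku: parts[0].trim(),
        a: { b: { c: parts[1].trim() } } // build the nested document by hand
    });
});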

What mongodb command should I run to get server info?

For example, I want to know which database directory the running mongod is using.
The documentation says the default data directory is /data/db; however, there is no such directory on my system.
I wonder if there is a mongodb command to get that simple info. I looked through the web and could not find it.
You can see the Configuration Parameters used with:
> db.serverCmdLineOpts()
For example:
{
    "argv" : [
        "mongod",
        "--dbpath",
        "/usr/local/db"
    ],
    "parsed" : {
        "dbpath" : "/usr/local/db"
    },
    "ok" : 1
}
Any parameters not specifically listed will be using their default values.
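db.serverCmdLineOpts() is a shell helper around the getCmdLineOpts admin command, so the same information is available from drivers or anywhere you can run admin commands:
db.adminCommand({ getCmdLineOpts: 1 })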
Just create a folder called data anywhere on your system, then let Mongo know where this folder is by passing its path at startup.
On Windows you would run this on the command line:
C:\mongodb\bin\mongod.exe --dbpath c:\test\mongodb\data
This is where mongo stores your data.

mongoimport doesn't work well on big files

I have a JSON file of around 4M JSON lines. I tried to use:
mongoimport --db mydb --collection mycoll --file myfile.json
and what happened was weird. I got this error:
2018-12-29T17:00:50.424+0200 connected to: localhost
2018-12-29T17:00:50.483+0200 Failed: error processing document #1428: invalid character 'S' after object key:value pair
2018-12-29T17:00:50.483+0200 imported 1426 documents
So first I counted the documents in this collection in mongo and saw that there are 1000 documents, not 1426 as reported above.
Second, I located a JSON document with the 'S' in it, which is just a string that looks like "name" : "Double 'S' Transport". I left only this document in the file, ran the import, and it worked fine.
Does anyone understand why this is happening? My suspicion is that mongoimport doesn't work on files this big...
Any help would be great :)