I am new to MongoDB. After installing MongoDB on Windows, I am trying to import a simple JSON file using the following command:
C:\>mongodb\bin\mongoimport --db test --collection docs < example2.json
I am getting the following error:
connected to: 127.0.0.1
Fri Oct 18 09:05:43.749 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Field name expected: offset:43
Fri Oct 18 09:05:43.750
Fri Oct 18 09:05:43.750 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting '{': offset:0
Fri Oct 18 09:05:43.751
Fri Oct 18 09:05:43.751 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Field name expected: offset:42
Fri Oct 18 09:05:43.751
Fri Oct 18 09:05:43.751 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting '{': offset:0
Fri Oct 18 09:05:43.751
Fri Oct 18 09:05:43.752 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Field name expected: offset:44
Fri Oct 18 09:05:43.752
Fri Oct 18 09:05:43.752 exception:BSON representation of supplied JSON is too large: code FailedToParse: FailedToParse: Expecting '{': offset:0
Fri Oct 18 09:05:43.752
Fri Oct 18 09:05:43.752 check 0 0
Fri Oct 18 09:05:43.752 imported 0 objects
Fri Oct 18 09:05:43.752 ERROR: encountered 6 error(s)
example2.json
{"FirstName": "Bruce", "LastName": "Wayne",
"Email": "bwayne#Wayneenterprises.com"}
{"FirstName": "Lucius", "LastName": "Fox",
"Email": "lfox#Wayneenterprises.com"}
{"FirstName": "Dick", "LastName": "Grayson",
"Email": "dgrayson#Wayneenterprises.com"}
What do I need to do to import a new JSON file into MongoDB?
Use
mongoimport --jsonArray --db test --collection docs --file example2.json
It's probably failing because of the newline characters.
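Note that --jsonArray expects the whole file to be a single JSON array rather than one document per line, so example2.json would need to be rewritten along these lines:

```json
[
  {"FirstName": "Bruce", "LastName": "Wayne", "Email": "bwayne#Wayneenterprises.com"},
  {"FirstName": "Lucius", "LastName": "Fox", "Email": "lfox#Wayneenterprises.com"},
  {"FirstName": "Dick", "LastName": "Grayson", "Email": "dgrayson#Wayneenterprises.com"}
]
```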
The following command worked for me
mongoimport --db test --collection docs --file example2.json
once I removed the extra newline character before the Email attribute in each of the documents.
example2.json
{"FirstName": "Bruce", "LastName": "Wayne", "Email": "bwayne#Wayneenterprises.com"}
{"FirstName": "Lucius", "LastName": "Fox", "Email": "lfox#Wayneenterprises.com"}
{"FirstName": "Dick", "LastName": "Grayson", "Email": "dgrayson#Wayneenterprises.com"}
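For larger files, this reflow can be scripted instead of done by hand. A minimal Python sketch (standard library only; file I/O omitted for brevity) that re-serializes each top-level JSON document onto its own line, the layout mongoimport accepts by default:

```python
import json

# Re-emit each top-level JSON document in `text` on a single line.
# The sample data is taken from the question above.
def to_one_doc_per_line(text):
    decoder = json.JSONDecoder()
    idx = 0
    while idx < len(text):
        # Skip whitespace (including newlines) between documents
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        doc, idx = decoder.raw_decode(text, idx)
        yield json.dumps(doc)

raw = '''{"FirstName": "Bruce", "LastName": "Wayne",
"Email": "bwayne#Wayneenterprises.com"}
{"FirstName": "Lucius", "LastName": "Fox",
"Email": "lfox#Wayneenterprises.com"}'''

for line in to_one_doc_per_line(raw):
    print(line)
```

Writing the resulting lines to a new file gives the one-document-per-line layout shown above.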
This worked for me (from the mongo shell; note that shell comments use //, not #):
var file = cat('./new.json');   // file name
use testdb
var o = JSON.parse(file);       // convert the string to a JSON object
db.forms.insert(o)              // forms is the collection name; testdb above is the db name
Use the following command when importing a JSON file:
C:\>mongodb\bin\mongoimport --jsonArray -d test -c docs --file example2.json
The following two forms both work well:
C:\>mongodb\bin\mongoimport --jsonArray -d test -c docs --file example2.json
C:\>mongodb\bin\mongoimport --jsonArray -d test -c docs < example2.json
If the collection belongs to a specific user, you can add -u, -p, and --authenticationDatabase.
This solution is applicable to Windows machines.
MongoDB needs a data directory to store its data. The default path is C:\data\db. If you don't have the data directory, create one on your C: drive (or on whichever volume, e.g. H:, is the root of your machine).
Place the .json file you want to import within C:\data\db\.
Before running the command, copy mongoimport.exe from C:\Program Files\MongoDB\Tools\100\bin (the default path for mongoimport.exe) to C:\Program Files\MongoDB\Server\[your_server_version]\bin.
Open the command prompt from within C:\data\db\ and type the following command, supplying the specific databaseName, collectionName, and fileName.json you wish to import:
mongoimport --db databaseName --collection collectionName --file fileName.json --type json --batchSize 1
Here, batchSize can be any integer you like.
mongoimport --jsonArray -d DatabaseN -c collectionName /filePath/filename.json
Open a command prompt separately and try:
C:\mongodb\bin\mongoimport --db db_name --collection collection_name < filename.json
In MS Windows, the mongoimport command has to be run in a normal Windows command prompt, not from the mongo shell.
This happened to me a couple of weeks back: the version of mongoimport was too old. Once I updated to the latest version, it ran successfully and imported all the documents.
Reference: http://docs.mongodb.org/master/tutorial/install-mongodb-on-ubuntu/?_ga=1.11365492.1588529687.1434379875
To insert JSON array data from a file (at a particular location on your system/PC) using a mongo shell command: when executing the command below, it should be on a single line.
var file = cat('I:/data/db/card_type_authorization.json'); var o = JSON.parse(file); db.CARD_TYPE_AUTHORIZATION.insert(o);
JSON File: card_type_authorization.json
[{
"code": "visa",
"position": 1,
"description": "Visa",
"isVertualCard": false,
"comments": ""
},{
"code": "mastercard",
"position": 2,
"description": "Mastercard",
"isVertualCard": false,
"comments": ""
}]
It works with JS and Node
Preconditions:
Node
Mongo - either local installed or via Atlas
server.js:
var MongoClient = require('mongodb').MongoClient;
var fs = require('fs');

// dbWeb is a placeholder for your database name; 'uri' is a placeholder
// for your MongoDB connection string (local or Atlas).
var dbWeb = 'shop';

function insert(coll) {
  MongoClient.connect('uri', (err, client) => {
    var myobj = fs.readFileSync('shop.json').toString();
    myobj = JSON.parse(myobj);
    client.db(dbWeb).collection(coll).insertMany(myobj, (err, res) => {
      client.close();
    });
  });
}

module.exports = { insert };
shop.json:
[
{
"doc": "jacke_bb",
"link": "http://ebay.us/NDMJn9?cmpnId=5338273189"
},
{
"doc": "schals",
"link": "https://www.ebay-kleinanzeigen.de/s-anzeige/4-leichte-schals-fuer-den-sommer/2082511689-156-7597"
}
]
As you can see, the JSON starts with [ and ends with ], and insertMany is used. This inserts the array's elements as individual documents in the collection.
Related
Trying to import the following csv:
_id,receiver,month,accrualMonth,paymentData.bankCode,operation
573378aef3af68090023da7d,547517955021020200599440,2016-05,2016-04,41,Manual
When I run the following (with mongo version 3.4.5):
mongoimport --db mean-dev --mode=merge --collection fulfilledpayments --type csv --headerline --file ~/Downloads/\Import.csv -vvvv
it returns the following log but it doesn't really import:
2018-04-04T20:51:25.331-0300 using upsert fields: [_id]
2018-04-04T20:51:25.332-0300 using 0 decoding workers
2018-04-04T20:51:25.332-0300 using 1 insert workers
2018-04-04T20:51:25.332-0300 will listen for SIGTERM, SIGINT, and SIGKILL
2018-04-04T20:51:25.360-0300 filesize: 139 bytes
2018-04-04T20:51:25.361-0300 using fields: _id,receiver,month,accrualMonth,paymentData.bankCode,operation
2018-04-04T20:51:25.381-0300 connected to: localhost
2018-04-04T20:51:25.381-0300 ns: mean-dev.fulfilledpayments
2018-04-04T20:51:25.381-0300 connected to node type: standalone
2018-04-04T20:51:25.381-0300 standalone server: setting write concern w to 1
2018-04-04T20:51:25.381-0300 using write concern: w='1', j=false, fsync=false, wtimeout=0
2018-04-04T20:51:25.381-0300 standalone server: setting write concern w to 1
2018-04-04T20:51:25.381-0300 using write concern: w='1', j=false, fsync=false, wtimeout=0
2018-04-04T20:51:25.382-0300 got line: [573378aef3af68090023da7d 547517955021020200599440 2016-05 2016-04 41 Manual]
2018-04-04T20:51:25.384-0300 imported 1 document
But nothing is really imported into the database, which remains untouched like this:
{
"_id" : ObjectId("573378aef3af68090023da7d"),
"creator" : "547517955021020200599440",
"amountTransferred" : 101.79,
"externalId" : "61fa09",
"date" : ISODate("2016-05-06T16:00:00.000-03:00"),
"payments" : [
ObjectId("559363f127c09e0900b679dd"),
ObjectId("55bc4c9170b99e090093e2a8"),
ObjectId("55e5175a3b2a8e090040d4cd"),
ObjectId("560cab8bad3c6a0900275f5a"),
ObjectId("563cc8d3f2db060900a8ba81"),
ObjectId("5661033e57d24c090035b191"),
ObjectId("568eac27eaa71c090074d5b0"),
ObjectId("56b2e691ced93a0900408267"),
ObjectId("56d905cb4c830809007e8355"),
ObjectId("56fee8063cdd4d0900776fa9"),
ObjectId("5732732e5d237d09008c57e2")
],
"__v" : 0
}
If I take the --collection fulfilledpayments parameter off, it imports into a new collection, but of course there is no need for merge mode there, because the new collection doesn't contain the _id to be matched.
Maybe you should enclose your _id within ObjectId, like:
_id,receiver,month,accrualMonth,paymentData.bankCode,operation
ObjectId(573378aef3af68090023da7d),547517955021020200599440,2016-05,2016-04,41,Manual
https://docs.mongodb.com/manual/reference/program/mongoimport/#ex-mongoimport-merge
I am querying an already populated mlab MongoDB database, and I want to store the resulting multiple documents in one single CSV file.
EDIT: output format of CSV file I hope to get:
uniqueid status date
191b117fcf5c 0 2017-03-01 15:26:28.217000
191b117fcf5c 1 2017-03-01 18:26:28.217000
MongoDB database document format is
{
"_id": {
"$oid": "58b6bcc00bd666355805a3ee"
},
"sensordata": {
"operation": "chgstatus",
"user": {
"status": "1",
"uniqueid": "191b117fcf5c"
}
},
"created_date": {
"date": "2017-03-01 17:51:17.216000"
}
}
Database name:mparking_sensor
collection name: demo
The python code to query is as follows:
# -*- coding: utf-8 -*-
"""
Created on Wed Mar 01 18:55:18 2017
#author: Being_Rohit
"""
import pymongo
uri = 'mongodb://#####:*****#ds157529.mlab.com:57529/mparking_sensor'
client = pymongo.MongoClient(uri)
db = client.get_default_database().demo
print db
results = db.find()
f = open("mytest.csv", "w")
for record in results:
query1 = (record["sensordata"]["user"],record["created_date"])
print query1
print "done"
client.close()
EDIT: output format of query1 I am getting is:
({u'status': u'0', u'uniqueid': u'191b117fcf5c'}, {u'date': u'2017-03-01 17:51:08.263000'})
Does someone know the correct way to dump this data to a .csv file (with pandas or any other means), or some other approach suitable for further prediction-based analysis (e.g. linear regression)?
mongoexport will do the job for you. Uniquely among the native MongoDB tools, it can export in CSV format, limited to a specific set of fields.
Your mongoexport command would be something like this:
mongoexport.exe \
--db mparking_sensor \
--collection demo \
--type=csv \
--fields sensordata.user.uniqueid,sensordata.user.status,created_date
That will export something like the following:
sensordata.user.uniqueid,sensordata.user.status,created_date
191b117fcf5c,0,2017-03-01T15:26:28.217000Z
191b117fcf5c,1,2017-03-01T18:26:28.217000Z
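If you'd rather write the CSV from the Python side (for example to keep the date format from the question), the standard csv module can flatten each record. A sketch (Python 3) with the documents inlined as sample data; in real use, docs would be the result of db.find():

```python
import csv
import io

# Sample documents matching the shape shown in the question;
# in real use this list would come from pymongo's db.find().
docs = [
    {"sensordata": {"user": {"status": "0", "uniqueid": "191b117fcf5c"}},
     "created_date": {"date": "2017-03-01 15:26:28.217000"}},
    {"sensordata": {"user": {"status": "1", "uniqueid": "191b117fcf5c"}},
     "created_date": {"date": "2017-03-01 18:26:28.217000"}},
]

out = io.StringIO()  # swap for open("mytest.csv", "w", newline="") to write a file
writer = csv.writer(out)
writer.writerow(["uniqueid", "status", "date"])
for record in docs:
    user = record["sensordata"]["user"]
    writer.writerow([user["uniqueid"], user["status"],
                     record["created_date"]["date"]])

print(out.getvalue())
```

The same loop structure drops into the question's code in place of building the query1 tuples.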
I was trying to export a collection to CSV using mLab's 'export collection' feature, but they make it harder than it needs to be. So I used https://studio3t.com and connected using the standard MongoDB URI.
I need to pull the latest date from a collection on mongo db and set it in a shell script.
LASTDOCDATE=mongo mongo.com:27017/tracking -u user -p pw --authenticationDatabase authdb --eval 'db.TRACKING_DATA.find({},{datecreated :1}).limit(1).sort({datecreated:-1}).map(function (doc){ return doc.datecreated; })'
echo $LASTDOCDATE
I want this variable to be set, but when run through the terminal it produces:
connecting to: mongo.com:27017/tracking
Mon Jul 27 2015 16:28:08 GMT-0700 (PDT)
How can I pull just the date attribute so it can be set as a variable in a shell script?
Wrap your call with the shell's printjson() method in order to get an output string, and use $(...) command substitution so the output is actually captured in the variable:
LASTDOCDATE=$(mongo mongo.com:27017/tracking -u user -p pw \
  --authenticationDatabase authdb \
  --eval 'printjson(db.TRACKING_DATA.find({},{datecreated :1}).limit(1).sort({datecreated:-1}).map(function (doc){ return doc.datecreated; }))')
Or just print, while referencing the single element:
LASTDOCDATE=$(mongo mongo.com:27017/tracking -u user -p pw \
  --authenticationDatabase authdb \
  --eval 'print(db.TRACKING_DATA.find({},{datecreated :1}).limit(1).sort({datecreated:-1}).toArray()[0].datecreated)')
Notating the single array element, and then the property:
.find({},{datecreated :1}).limit(1).sort({datecreated:-1}).toArray()[0].datecreated'
Or findOne() like this with $orderby:
.findOne(
{ "query": {}, "$orderby": { "datecreated": 1 }},
{ "_id": 0, "datecreated": 1 }
).datecreated
So use print() or printjson() depending on the output format you want, or even .valueOf() on "datecreated" to get just the timestamp value rather than the string.
mongodump command:
mongodump --host myhost.com --port 12345 --username myUsername --password PSWRD --out /opt/somepath --db myDb --collection my_collection --query "{ content_type_id: { \$not: { \$eq: db.my_type.findOne({slug: 'form_submissions'} )._id } } }" --verbose
Results in:
assertion: 16619 code FailedToParse: FailedToParse: Bad characters in value:
offset:33 of:{ content_type_id: { $not: { $eq: db.my_type.findOne({slug: 'form_submissions'} )._id } } }
That's not a valid query. --query must be a JSON document. Your error is in thinking that mongodump is something programmatic like the mongo shell that can evaluate the findOne and substitute the value into the query. This is not the case. You can find the _id from the result of the findOne and put it in the mongodump --query manually. Use extended JSON format for an ObjectId type, if that is the type of _id.
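For illustration, suppose the findOne in the shell returned ObjectId("53a0000000000000000000aa") (a made-up value here). The --query could then be written in extended JSON as:

```json
{ "content_type_id": { "$not": { "$eq": { "$oid": "53a0000000000000000000aa" } } } }
```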
I'm trying to follow this tutorial: http://www.ultrabug.fr/tag/mongoexport/
and use an .sh file for the query line.
This is my file:
#!/bin/bash
d=`date --date="-3 month"`
echo "{ timeCreated: { "\$lte": $d} }"
this is my mongoexport line:
mongoexport --db game_server --collection GameHistory -query /home/dev/test2.sh --out /home/dev/file.json
I keep getting:
assertion: 16619 code FailedToParse: FailedToParse: Expecting '{': offset:0 of:/home/dev/test2.sh
why? How can I make this work?
I found several errors in your approach, let's examine them one by one.
Date format
MongoDB expects a date to be a number or an ISO 8601 string.
Unfortunately, the unix date utility has no built-in support for this format, so you should use:
d=`date --date="-3 month" -u +"%Y-%m-%dT%H:%M:%SZ"`
Using extended JSON
The JSON specification has no support for dates, so you should use MongoDB extended JSON. Your final query should look like this:
{ "timeCreated": { "$lte": { "$date": "2014-05-12T08:53:29Z" } } }
test2.sh output
Your quotation marks were misplaced. Here is a script example that outputs correct JSON:
#!/bin/bash
d=`date --date="-3 month" -u +"%Y-%m-%dT%H:%M:%SZ"`
echo '{ "timeCreated": { "$lte": { "$date": "'$d'" } } }'
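Equivalently, the query string can be built in Python; a sketch that approximates "-3 month" as 90 days (GNU date's month arithmetic is calendar-aware, so the two can differ by a few days):

```python
import json
from datetime import datetime, timedelta, timezone

# Build the extended-JSON query for "timeCreated older than ~3 months".
# timedelta(days=90) is an approximation of date's "-3 month".
cutoff = datetime.now(timezone.utc) - timedelta(days=90)
query = {"timeCreated": {"$lte": {"$date": cutoff.strftime("%Y-%m-%dT%H:%M:%SZ")}}}
print(json.dumps(query))
```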
Passing query to mongoexport
mongoexport expects --query to be a JSON string, not .sh script. So, when you're passing file path to --query, mongoexport expects it to be a JSON file.
To fix it you should execute test2.sh yourself and pass resulting string to mongoexport:
mongoexport --db game_server --collection GameHistory \
--query "`./test2.sh`" --out ./test2.json
N.B. Note the " quotation marks around the ./test2.sh call. They tell bash to treat the ./test2.sh output as a single parameter, preserving the inner quotation marks and whitespace.
You need to add backticks around a script or command to have it evaluated:
mongoexport --db game_server --collection GameHistory \
--query `/home/dev/test2.sh` --out /home/dev/file.json