I know this question has been answered... I am using the snippet below:
print("name,id,email");
db.User.find().forEach(function(user){
print(user.name+","+user._id.valueOf()+","+user.email);
});
But I am facing an issue while reading records whose field names start with a number.
Below is the output:
db.Detail.find({"Comment": /ABCD/,"CreateDt": { "$gte" : ISODate("2015-12-03") }},{'Data.01-WaitQueue.EndTime':1}).limit().pretty()
{
"Data" : {
"01-WaitQueue" : {
"EndTime" : ISODate("2015-12-03T02:39:11Z")
}
},
"_id" : ObjectId("565fab4ea5c75a3c4f000000")
}
When I use forEach to convert the result into CSV:
db.Detail.find({"Comment": /ABCD/,"CreateDt": { "$gte" : ISODate("2015-12-03") }},{'Data.01-WaitQueue.EndTime':1}).limit().forEach(function(PD) {
print(PD.Data.01-WaitQueue.EndTime +":"+ PD._id);
});
I am getting the error below:
Fri Dec 4 07:11:38 SyntaxError: missing ) after argument list (shell):1
Can someone please help me rectify it?
The exception is thrown because the print line has a syntax error: 01-WaitQueue starts with a digit and contains a hyphen, so it is not a valid JavaScript identifier and cannot be accessed with dot notation.
Instead of
print(PD.Data.01-WaitQueue.EndTime +":"+ PD._id);
try this:
print(PD.Data['01-WaitQueue'].EndTime +":"+ PD._id);
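The underlying rule can be checked in any JavaScript engine, not just the mongo shell: a property name that starts with a digit or contains a hyphen is not a valid identifier, so it is only reachable with bracket notation. A minimal sketch with made-up values:

```javascript
// Hypothetical document shaped like the query result above.
const PD = {
  Data: { "01-WaitQueue": { EndTime: "2015-12-03T02:39:11Z" } },
  _id: "565fab4ea5c75a3c4f000000"
};

// PD.Data.01-WaitQueue is a SyntaxError: "01" is not a valid identifier,
// and "-" would be parsed as subtraction. Bracket notation always works:
const line = PD.Data["01-WaitQueue"].EndTime + ":" + PD._id;
console.log(line); // 2015-12-03T02:39:11Z:565fab4ea5c75a3c4f000000
```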
I have the following query, which works in MongoDB.
I need to execute this query in JMeter to get the object count. Please note I am using a Groovy script to execute it.
db.getCollection('collectionName').find({ $or: [ {"Service" :"AAA"}, {ServerName : "BBB"} ],
"ConnectionID" : "AAAA445789",
"CDDval" : "AGB"
});
Below is the code I am running in Groovy; it does not give the expected result like the MongoDB query above, and it fails with the error javax.script.ScriptException: org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
try {
MongoCollection<Document> collection = vars.getObject("collectionName");
Document result = collection.find("uuid" ,"{$or : [
{"Service": "AAA"},
{ServerName : "BBB"}
],
"ConnectionID" : "AAAA445789",
"CDDval" : "AGB"
"}).first;
vars.put("uuid", result.get("uuid").toString());
return "uuid=" + result.get("uuid") + " found";
}
catch (Exception e) {
SampleResult.setSuccessful(false);
SampleResult.setResponseCode("500");
SampleResult.setResponseMessage("Exception: " + e);
}
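One likely cause of the compilation failure is the nested unescaped double quotes inside the double-quoted Groovy string. A way to sidestep that is to keep the whole filter as one JSON document (e.g. inside a single-quoted Groovy string handed to the Java driver's Document.parse). For reference, the filter the working shell query sends, written as a plain object (a sketch, outside JMeter):

```javascript
// The filter from the working shell query above, as a plain object.
const filter = {
  $or: [ { Service: "AAA" }, { ServerName: "BBB" } ],
  ConnectionID: "AAAA445789",
  CDDval: "AGB"
};

// Its JSON form is what Document.parse(...) in the Java driver expects.
// Note every key and string needs double quotes, which is exactly why
// embedding it inside a double-quoted Groovy string breaks.
const json = JSON.stringify(filter);
console.log(json);
```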
I have tried this using JMeter with Groovy 2.4.13 but am not getting the correct response.
We have a MongoDB collection which we want to import into Elasticsearch (for now as a one-off effort). To this end, we have exported the collection with mongoexport. It is a huge JSON file with entries like the following:
{
"RefData" : {
"DebtInstrmAttrbts" : {
"NmnlValPerUnit" : "2000",
"IntrstRate" : {
"Fxd" : "3.1415"
},
"MtrtyDt" : "2020-01-01",
"TtlIssdNmnlAmt" : "200000000",
"DebtSnrty" : "SNDB"
},
"TradgVnRltdAttrbts" : {
"IssrReq" : "false",
"Id" : "BMTF",
"FrstTradDt" : "2019-04-01T12:34:56.789"
},
"TechAttrbts" : {
"PblctnPrd" : {
"FrDt" : "2019-04-04"
},
"RlvntCmptntAuthrty" : "GB"
},
"FinInstrmGnlAttrbts" : {
"ClssfctnTp" : "DBFNXX",
"ShrtNm" : "AVGO 3.625 10/16/24 c24 (URegS)",
"FullNm" : "AVGO 3 5/8 10/15/24 BOND",
"NtnlCcy" : "USD",
"Id" : "USU1109MAXXX",
"CmmdtyDerivInd" : "false"
},
"Issr" : "549300WV6GIDOZJTVXXX"
}
}
We are using the following Logstash configuration file to import this data set into Elasticsearch:
input {
file {
path => "/home/elastic/FIRDS.json"
start_position => "beginning"
sincedb_path => "/dev/null"
codec => json
}
}
filter {
mutate {
remove_field => [ "_id", "path", "host" ]
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "firds"
}
}
All this works fine: the data ends up in the index firds of Elasticsearch, and a GET /firds/_search returns all the entries within the _source field.
We understand that this field is not indexed and thus not searchable, whereas searchability is what we are actually after: we want to make all of the entries within the original nested JSON searchable in Elasticsearch.
We assume that we have to adjust the filter {} part of our Logstash configuration, but how? For consistency reasons, it would be nice to keep the original nested JSON structure, but that is not a must; flattening would also be an option, so that e.g.
"RefData" : {
"DebtInstrmAttrbts" : {
"NmnlValPerUnit" : "2000" ...
becomes a single key-value pair "RefData.DebtInstrmAttrbts.NmnlValPerUnit" : "2000".
It would be great if we could do that immediately with Logstash, without using an additional Python script operating on the JSON file we exported from MongoDB.
EDIT: Workaround
Our current work-around is to (1) dump the MongoDB database to dump.json and then (2) flatten it with jq using the following expression, and finally (3) manually import it into Elastic
ad (2): This is the flattening step:
jq '. as $in | reduce leaf_paths as $path ({}; . + { ($path | join(".")): $in | getpath($path) }) | del(."_id.$oid") '
-c dump.json > flattened.json
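For checking the output of step (2), the dot-notation flattening that the jq expression performs can be sketched in plain JavaScript (a simplification: arrays are treated as leaves here, while jq's leaf_paths descends into them):

```javascript
// Sketch: flatten a nested object into dot-joined key/value pairs,
// similar to the jq leaf_paths reduction above (arrays kept as leaves).
function flatten(obj, prefix = "", out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? prefix + "." + key : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      flatten(value, path, out);   // recurse into nested objects
    } else {
      out[path] = value;           // leaf: record full dotted path
    }
  }
  return out;
}

const doc = { RefData: { DebtInstrmAttrbts: { NmnlValPerUnit: "2000" } } };
console.log(flatten(doc));
// { 'RefData.DebtInstrmAttrbts.NmnlValPerUnit': '2000' }
```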
Remark for the curious: the shown JSON is a (modified) entry from the Financial Instruments Reference Database System (FIRDS), available from the European Securities and Markets Authority (ESMA), a European financial regulatory agency overseeing the capital markets.
My code below produces an error. If the groupBy is removed, it works fine, but I need to get only distinct values for common_id. How can I solve this issue?
MasterAffiliateProductMappingMongo::select('_id', 'our_product_id')
->where('top_deal', '=', 'true')
->orderBy('srp', 'asc')
->groupBy('common_id')
->get();
Error: [MongoDB\Driver\Exception\RuntimeException] Unrecognized
expression '$last'
MongoDocument Example:
{
"_id" : ObjectId("5911af8209ed4456d069b1d1"),
"product_id" : "MOBDRYWXFKNPZVG6",
"our_product_id" : "5948e0dca6bc725adb35af2e",
"mrp" : 0.0,
"srp" : 500.0,
"ID" : "5911af8209ed4456d069b1d1",
"common_id" : ObjectId("5911af8209ed4456d069b1d1"),
"top_deal" : "true"
}
Error Log:
[2017-06-28 12:19:46] lumen.ERROR: exception
'MongoDB\Driver\Exception\RuntimeException' with message 'Unrecognized
expression '$last'' in
/var/www/html/PaymetryService4/vendor/mongodb/mongodb/src/Operation/Aggregate.php:219
Referring to https://github.com/jenssegers/laravel-mongodb/issues/1185#issuecomment-321267144, it works if we remove '_id' from the select fields.
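What the query is after — one document per common_id, keeping the cheapest one thanks to the ascending srp sort — corresponds to a MongoDB $sort followed by $group with $first. The intended semantics, sketched on hypothetical data:

```javascript
// Intended semantics of orderBy('srp', 'asc') + groupBy('common_id'):
// one document per common_id, keeping the first one after the sort.
const docs = [
  { common_id: "a", our_product_id: "p1", srp: 500 },
  { common_id: "a", our_product_id: "p2", srp: 300 },
  { common_id: "b", our_product_id: "p3", srp: 700 }
];

const sorted = [...docs].sort((x, y) => x.srp - y.srp);
const perGroup = new Map();
for (const d of sorted) {
  if (!perGroup.has(d.common_id)) perGroup.set(d.common_id, d); // keep first
}
const result = [...perGroup.values()];
console.log(result); // p2 (srp 300) for "a", p3 for "b"
```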
Question: How to use the mongoimport.exe utility to import UTF-8 JSON documents with Cyrillic characters into a MongoDB database under MS Windows 10? (The mongoimport.exe online docs do not provide any information, or am I missing it?)
Environment: MS Windows 10, MongoDB 3.2.3, mongoimport.exe
Source UTF-8 JSON file:
{
"PersonnelNumber": "15128",
"OrderNumber": "765-01",
"OrderDate": "2011-05-04T00:00:00",
"JobPosition": "Слесарь по ремонту подвижного состава"
}
Command line:
"C:\Program Files\MongoDB\Server\3.2\bin\mongoimport.exe" --db testOrders --collection orders --drop < "TestOrder.json"
Import log:
2016-03-01T14:47:04.350+0300 connected to: localhost
dropping: testOrders.orders
Failed: error processing document #1: invalid character 'ï' looking for beginning of value
imported 0 documents
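The `invalid character 'ï'` is a strong hint that the file starts with a UTF-8 byte order mark: the BOM bytes EF BB BF display as `ï»¿` when read as Latin-1, and mongoimport's JSON parser rejects them. Saving the file as UTF-8 without BOM should make the import succeed; a sketch of stripping the BOM programmatically (hypothetical payload):

```javascript
// The UTF-8 BOM is the byte sequence EF BB BF; 0xEF renders as 'ï' in
// Latin-1, matching the "invalid character 'ï'" in the import log above.
const BOM = Buffer.from([0xef, 0xbb, 0xbf]);

function stripBom(buf) {
  return buf.subarray(0, 3).equals(BOM) ? buf.subarray(3) : buf;
}

// Hypothetical file content saved as "UTF-8 with BOM" by a Windows editor:
const raw = Buffer.concat([BOM, Buffer.from('{"PersonnelNumber":"15128"}')]);
console.log(stripBom(raw).toString()); // {"PersonnelNumber":"15128"}
```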
Note
The ANSI version of the test file is imported without errors, but the Cyrillic characters are stored incorrectly in the MongoDB database. Here is how they are displayed in the MongoDB shell:
MongoDB shell version: 3.2.3
connecting to: testOrders
> db.orders.findOne()
{
"_id" : ObjectId("56d5838aef35e4f7c03e81bd"),
"PersonnelNumber" : "15128",
"OrderNumber" : "765-01",
"OrderDate" : "2011-05-04T00:00:00",
"JobPosition" : "������� �� ������� ���������� �������"
}
>
Here is a screenshot with hex codes of UTF8 and ANSI files:
When using the C# driver (VS2015) to list the JSON documents imported by the mongoimport.exe utility
foreach (var order in
(new MongoClient()).GetDatabase("testOrders")
.GetCollection<BsonDocument>("orders").FindSync(new BsonDocument()).ToList())
System.Console.WriteLine("{0}", order.ToString());
the test output also has wrong Cyrillic chars:
{
"_id" : ObjectId("56d5838aef35e4f7c03e81bd"),
"PersonnelNumber" : "15128",
"OrderNumber" : "765-01",
"OrderDate" : "2011-05-04T00:00:00",
"JobPosition" : "������� �� ������� ���������� �������"
}
When using the C# driver (VS2015) to insert a BsonDocument and then list both the document imported by the mongoimport.exe utility and the BsonDocument inserted via C# code,
var collection = (new MongoClient()).GetDatabase("testOrders")
.GetCollection<BsonDocument>("orders");
var document = new BsonDocument
{
{"PersonnelNumber", "15128" },
{"OrderNumber", "765-01"},
{"OrderDate", "2011-05-04T00:00:00"},
{"JobPosition", "Слесарь по ремонту подвижного состава"}
};
collection.InsertOne(document);
foreach (var order in collection.FindSync(new BsonDocument()).ToList())
System.Console.WriteLine("{0}", order.ToString());
the former has incorrect Cyrillic characters and the latter has correct Cyrillic characters:
{
"_id" : ObjectId("56d5838aef35e4f7c03e81bd"),
"PersonnelNumber" : "15128",
"OrderNumber" : "765-01",
"OrderDate" : "2011-05-04T00:00:00",
"JobPosition" : "������� �� ������� ���������� �������"
}
{
"_id" : ObjectId("56d58c5c1ed24820b80b80f6"),
"PersonnelNumber" : "15128",
"OrderNumber" : "765-01",
"OrderDate" : "2011-05-04T00:00:00",
"JobPosition" : "Слесарь по ремонту подвижного состава"
}
db.runCommand({addshard:"localhost:10000"});
{ "ok" : 0, "errmsg" : "host already used" }
db.runCommand( { addshard : "localhost:10001" } );
{ "ok" : 0, "errmsg" : "host already used" }
How can I solve this "host already used" error?
Please give me some tips to solve this.
According to the MongoDB source code, this message says that you have already added this specific host:port as a shard:
// check whether this host:port is not an already a known shard
BSONObj old = conn->findOne( ShardNS::shard , BSON( "host" << host ) );
if ( ! old.isEmpty() ){
*errMsg = "host already used";
conn.done();
return false;
}
You can use listShards command to see your current shards:
db.runCommand( { listshards : 1 } );