How do I perform aggregate queries using Sumo Logic APIs - REST

I am trying to perform aggregate queries using SumoLogic APIs as mentioned here.
Something like:
_view = <some_view> | where sourceCategory matches "something" | sum(field) by sourceCategory
This works just fine in the Sumo GUI: I get a field called "_sum" in the result which gives me the desired value.
However, the same doesn't work when I do it using the Sumo APIs. If I create a search job with this body:
{
  "query": "_view = <some_view> | where sourceCategory matches \"something\" | sum(field) by sourceCategory",
  "from": "start_timestamp",
  "to": "end_timestamp",
  "timeZone": "some_timezone"
}
I call the "v1/search/jobs" POST method with the above body, then poll GET "v1/search/jobs/{job_id}" until the state is "DONE GATHERING RESULTS", and finally call "v1/search/jobs/{job_id}/messages". I was expecting to see aggregated values in the result, but instead I see something like:
{
  "fields": [
    {
      "name": "_messageid",
      "fieldType": "long",
      "keyField": false
    }, ...
  ],
  "messages": [
    {
      "map": {
        "_receipttime": "1359407350899",
        "_size": "549",
        "_sourcecategory": "service",
        "_sourceid": "1640",
        "the_field_i_mentioned": "not-aggregated-value",
        "_messagecount": "2044"
      }
    }, ...
  ]
}
Thanks for going through my question. Any advice / workarounds are appreciated. I don't really want to iterate through all the items manually and calculate the sum; I'd prefer to have it done on the Sumo Logic side. Thanks again!

Explanation
As in the user interface, the log search API gives you both the raw results (also referred to as messages) and the aggregate results (also referred to as records).
(Obviously, the latter are only returned if there is any aggregation in the query; in your case there is.)
Actual suggestion
Then I do "v1/search/jobs/{job_id}/messages"
Try /records instead.
See the docs for "Paging through the records found by a Search Job"
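For what it's worth, here is a minimal sketch of that flow in Python, assuming the requests library; the access credentials, the api.sumologic.com endpoint (it differs per deployment), and the time range are placeholders, and the query is the one from the question:

# Hedged sketch: create a search job, poll it, then read the aggregate records.
# ACCESS_ID / ACCESS_KEY, the endpoint, and the time range are placeholders.
import time
import requests

API = "https://api.sumologic.com/api/v1"

session = requests.Session()                  # keeps the job's session cookie between calls
session.auth = ("ACCESS_ID", "ACCESS_KEY")
session.headers.update({"Content-Type": "application/json", "Accept": "application/json"})

body = {
    "query": '_view = <some_view> | where sourceCategory matches "something" | sum(field) by sourceCategory',
    "from": "2023-01-01T00:00:00",            # example ISO 8601 timestamps
    "to": "2023-01-02T00:00:00",
    "timeZone": "UTC",
}

job_id = session.post(f"{API}/search/jobs", json=body).json()["id"]

# Poll the job until it has finished gathering results.
while True:
    state = session.get(f"{API}/search/jobs/{job_id}").json()["state"]
    if state == "DONE GATHERING RESULTS":
        break
    time.sleep(5)

# Aggregate results live under /records; /messages only returns the raw logs.
records = session.get(
    f"{API}/search/jobs/{job_id}/records",
    params={"offset": 0, "limit": 100},
).json()

for record in records["records"]:
    print(record["map"])                      # contains _sum plus the group-by fields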
Disclaimer: I am currently employed by Sumo Logic.

Related

MongoDB Stitch realtime watch

What I intend to achieve is some sort of "live query" functionality.
So far I've tried using the "watch" method. According to the documentation:
You can open a stream of changes that match a filter by calling
collection.watch(delegate:) with a $match expression as the argument.
Whenever the watched collection changes and the ChangeEvent matches
the provided $match expression, the stream’s event handler fires with
the ChangeEvent object as its only argument
Passing the doc ids as an array works perfectly, but passing a query doesn't work:
this.stitch.db.collection<Queue>('queues')
  .watch({
    hospitalId: this.activehospitalid
  });
I've also tried this:
this.stitch.db.collection<Queue>('queues')
  .watch({
    $match: {
      hospitalId: this.activehospitalid
    }
  });
Which throws an error on the console: "StitchServiceError: mongodb watch: filter is invalid (unknown top level operator: $match)". The intention is to watch all documents where the field "hospitalId" matches the provided value, i.e. to effectively pass a query filter to the watch() method.
After a long search I found that it is possible to filter, but the query needs to be formatted differently:
this.stitch.db.collection<Queue>('queues')
  .watch({
    $or: [
      {
        "fullDocument.hospitalId": this.activehospitalid
      }
    ]
  });
For anyone else who might need this, please note the important fullDocument part of the query. I haven't found much documentation on this, but I hope it helps.
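As an aside for anyone doing the same thing from a plain MongoDB driver rather than the Stitch SDK: the equivalent is a change stream opened with a $match pipeline stage on fullDocument. A minimal pymongo sketch, where the connection string, database and collection names, and the hospital id are placeholders (change streams also require a replica set or sharded cluster):

# Hedged sketch: filter change events by a document field with pymongo.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
queues = client["mydb"]["queues"]

hospital_id = "some-hospital-id"
pipeline = [
    # Change events expose the document under "fullDocument",
    # so the filter needs that prefix, just like in the answer above.
    {"$match": {"fullDocument.hospitalId": hospital_id}}
]

# full_document="updateLookup" asks the server to include the current
# document on update events as well, not only on inserts/replaces.
with queues.watch(pipeline, full_document="updateLookup") as stream:
    for change in stream:
        print(change["operationType"], change.get("fullDocument"))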

Can you list multiple features within the same Schema.org "LocationFeatureSpecification"?

I am working on Schema.org Resort schema for a ton of resorts on a travel website and am trying to find the most efficient way of filling out the schema with regard to amenities.
The current code looks something like this:
"amenityFeature": [
{
"#type":"http://schema.org/LocationFeatureSpecification",
"name":"Spa",
"value":"true"
},
{
"#type":"http://schema.org/LocationFeatureSpecification",
"name":"Internet Access",
"value":"true"
},
{
"#type":"http://schema.org/LocationFeatureSpecification",
"name":"Tennis Courts",
"value":"true"
}
]
My question is: can I write it like this instead, to save a few lines of code?
{
  "@type": "http://schema.org/LocationFeatureSpecification",
  "name": [
    "Spa", "Internet Access", "Tennis Courts"
  ],
  "value": "true"
}
When I test it in Google's Structured Data Testing Tool, it doesn't give any errors, whether I write it the short way or the first/long way.
If I do it the short way, I want to make sure all those items are getting listed as amenities and not just different names for the same amenity. Otherwise, I'll go the long route.
No, each LocationFeatureSpecification represents one feature:
Specifies a location feature by providing a structured value representing a feature of an accommodation as a property-value pair of varying degrees of formality.
Your second snippet would represent one feature with multiple names.

MongoDB: Bulk changing all field types in python

I have a ton of documents (around 10 million) and I need to change their field types. The usual forEach approach (just looping through every document) seems to take forever and is clearly not viable in the timeframe I have (it basically took all night for one out of four updates).
I've heard that bulk writes may be able to do it, but I'm getting mixed messages. One answer I saw on this site, for example, says there's no built-in function for it (you would have to use some workaround), while others say it can be done with updates in Python, using pymongo.
I was wondering if there is a quicker way to mass-change field types (string -> double, string -> int) using Python? I can also work from the console, but I find even fewer solutions there.
Thanks
You can try using an aggregation query in the mongo shell.
Something like:
db.your_collection.aggregate([
  {
    $addFields: {
      field1: {
        $convert: {
          input: "$field1",
          to: "double"   // or "int", depending on the target type
        }
      }
    }
  },
  { $out: "your_collection" }
])
More info here https://docs.mongodb.com/manual/reference/operator/aggregation/convert/
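Since the question mentions Python, roughly the same pipeline can also be issued through pymongo; a hedged sketch, where the connection string, collection name, and field name are placeholders:

# Hedged sketch: run the $convert/$out pipeline from Python with pymongo.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
coll = client["mydb"]["your_collection"]

pipeline = [
    {
        "$addFields": {
            "field1": {
                "$convert": {
                    "input": "$field1",
                    "to": "double",      # or "int" for integer targets
                    "onError": None,     # keep going if a value can't be parsed
                }
            }
        }
    },
    # $out rewrites the collection on the server side,
    # so nothing is streamed back to the client.
    {"$out": "your_collection"},
]

coll.aggregate(pipeline)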

Retrieve last document in a MongoDB using Pymongo and Flask

I'm working on a Raspberry Pi project that collects weather measurements and stores them in a Mongo database like this:
{
  "_id": {
    "$oid": "577975c874fece5775117209"
  },
  "timestamp": {
    "$date": "2016-07-03T20:30:00.995Z"
  },
  "temp_f": 68.9,
  "temp_c": 20.5,
  "humidity": 50,
  "pressure": 29.5
}
The data is going into the Mongo db just fine. Next, I'm trying to build a Flask-based dashboard that enables me to look at the recorded data. On one of the pages of the dashboard, I want to show the current recorded values, so what I need to do is pull out the last measurement and pass it to a flask template for rendering to the browser.
I found a post here that said I could use:
data = db.measurements.find().limit(1).sort({"$natural": -1})
but natural doesn't seem to be a valid option for the call to find.
This works:
measurement = mongo.db.measurements.find_one()
It pulls back one random measurement that I can then pass to the flask template, but how do I use sort to get the most recent one?
I tried:
measurement = mongo.db.measurements.find_one().sort([("timestamp", -1)])
but that generates an attribute error: AttributeError: 'dict' object has no attribute 'sort'
I've also tried:
cursor = mongo.db.measurements.find().limit(1).sort({"timestamp": -1})
but that doesn't work either.
I'm missing something here, can someone give me a quick, complete fix for this?
It turns out PyMongo uses a different format for sort: instead of a document like the mongo shell takes, you pass a list of (key, direction) tuples. Once I fixed that, it all worked:
cursor = mongo.db.measurements.find().sort([('timestamp', -1)]).limit(1)
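As a follow-up, find_one accepts the same sort specification directly, so no cursor or limit(1) is needed when you only want the latest document. A hedged sketch of a small Flask view built that way, assuming Flask-PyMongo (which the mongo.db usage above suggests); the URI, route, and template name are placeholders:

# Hedged sketch: return the most recent measurement from a Flask view.
from flask import Flask, render_template
from flask_pymongo import PyMongo

app = Flask(__name__)
app.config["MONGO_URI"] = "mongodb://localhost:27017/weather"
mongo = PyMongo(app)

@app.route("/current")
def current():
    # find_one takes the same list-of-tuples sort as find().sort()
    measurement = mongo.db.measurements.find_one(sort=[("timestamp", -1)])
    return render_template("current.html", measurement=measurement)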

Dynamic query in couchbase?

I started using Couchbase. I like it a lot, but one thing I can't figure out is how to make a dynamic query. Say I have a document like:
{
  "sender_name": "roman",
  "sender_id": 123,
  "content": "Hello World"
}
Now I want to query for documents where "sender_id" = ?. It can be any number.
A regular view with doc and meta can't help me because I don't know the value in advance; I should expect any sender_id.
Hope you can help me, thanks a lot.
OK, with Couchbase you can write map/reduce functions and create views, and views accept query parameters. I am not that familiar with writing map functions, but from couchbase.com I think this map function will do the job:
function (doc, meta) {
  emit(doc.sender_id, [doc.content]);
}
And your query would be ?key=123 (the key is JSON-encoded and has to match the emitted type, so a plain number here rather than ["123"]).
Go through these links
http://hardlifeofapo.com/creating-an-e-commerce-platform-using-couchbase-2/
http://hardlifeofapo.com/basic-couchbase-querying-for-sql-people/
http://www.couchbase.com/docs/couchbase-manual-2.0/couchbase-views-writing-sql-where.html
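If it helps, the view can also be queried programmatically; here is a hedged Python sketch against the views REST API, where the host, port, bucket, design document, view name, and credentials are all assumptions for illustration:

# Hedged sketch: query a Couchbase view by key over the views REST API.
import json
import requests

BASE = "http://127.0.0.1:8092"               # 8092 is the default view/CAPI port
url = f"{BASE}/my_bucket/_design/messages/_view/by_sender"

sender_id = 123
# View query parameters are JSON-encoded, so the key must match the
# emitted type: 123 for a number, "123" for a string.
params = {"key": json.dumps(sender_id)}

resp = requests.get(url, params=params, auth=("Administrator", "password"))
resp.raise_for_status()

for row in resp.json().get("rows", []):
    print(row["key"], row["value"])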