How to update ISODate to two days later - mongodb

How do I convert a SQL query to a MongoDB query?
Please help me write a MongoDB query - this is my query in SQL:
UPDATE user
SET expireIn = DATEADD(DAY, 2, expireIn)
WHERE phone = '123434574'
I want to add some days to the expireIn column.
The expireIn field is an ISODate and also stores a time component.

Welcome Mostafa Asadi,
You can do something like this:
db.collection.update(
  { phone: "123434574" },
  [
    {
      $set: {
        expireIn: {
          $dateAdd: {
            startDate: "$expireIn",
            unit: "day",
            amount: 2
          }
        }
      }
    }
  ],
  { multi: true }
)
As you can see on the playground.
The first {} is the match part: it selects which documents to update. The second part does the update; it is wrapped in [] because it is an aggregation pipeline, using the $dateAdd operator.
Edit: added {multi: true} so that the update applies to multiple documents.
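If you prefer the newer API, here is a minimal sketch of the same update using updateMany, assuming the collection is named user as in the SQL query and a MongoDB 5.0+ server (which $dateAdd requires). updateMany already applies to every matching document, so no multi option is needed:
db.user.updateMany(
  { phone: "123434574" },
  [
    {
      $set: {
        expireIn: {
          $dateAdd: { startDate: "$expireIn", unit: "day", amount: 2 }
        }
      }
    }
  ]
)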


Compute Simple Moving Average in Mongo Shell

I am developing a financial application with Node.js. I wonder whether it is possible to compute a simple moving average, i.e. the average price over the last N days, directly in the Mongo shell rather than reading the data and computing it in Node.js.
Document sample:
[
  { code: '0001', price: 0.10, date: '2014-07-04T00:00:00.000Z' },
  { code: '0001', price: 0.12, date: '2014-07-05T00:00:00.000Z' },
  { code: '0001', price: 0.13, date: '2014-07-06T00:00:00.000Z' },
  { code: '0001', price: 0.12, date: '2014-07-07T00:00:00.000Z' }
]
If you have more than a trivial number of documents you should use the DB server to do the work rather than JS.
You don't say if you are using mongoose or the node driver directly. I'll assume you are using mongoose as that is the way most people are headed.
So your model would be:
// models/stocks.js
const mongoose = require("mongoose");
const conn = mongoose.createConnection('mongodb://localhost/stocksdb');
const StockSchema = new mongoose.Schema(
  {
    price: Number,
    code: String,
    date: Date,
  },
  { timestamps: true }
);
module.exports = conn.model("Stock", StockSchema, "stocks");
You rightly suggested that the aggregation framework would be a good way to go here. First, though: if we are going to filter on date ranges, the dates in your database need to be stored as Date objects, and from your example documents it looks like you may have stored strings. An example of inserting documents with real dates would be:
db.stocks.insertMany([
  { code: '0001', price: 0.10, date: ISODate('2014-07-04T00:00:00.000Z') },
  { code: '0001', price: 0.12, date: ISODate('2014-07-05T00:00:00.000Z') },
  { code: '0001', price: 0.13, date: ISODate('2014-07-06T00:00:00.000Z') },
  { code: '0001', price: 0.12, date: ISODate('2014-07-07T00:00:00.000Z') }
])
The aggregation pipeline function accepts an array of one or more pipeline stages.
The first stage we should use is $match (see the $match docs). It filters the documents down to only the records we are interested in, which is important for performance:
{
  $match: {
    date: {
      $gte: new Date('2014-07-03'),
      $lte: new Date('2014-07-07')
    }
  }
}
This stage passes only the documents dated from the 3rd to the 7th of July 2014 inclusive on to the next stage (in this case, all of the example docs).
The next stage is where you can compute an average. We need to group the values together based on one field, several fields, or all fields. As you don't specify a field you want to average over, I'll give an example across all fields using the $group stage (see the $group docs):
{
  $group: {
    _id: null,
    avg: {
      $avg: '$price'
    }
  }
}
This will take all the documents and display an average of all the prices.
In the case of your example documents this results in
{ _id: null, avg: 0.1175 }
Check the answer:
(0.10 + 0.12 + 0.12 + 0.13) / 4 = 0.1175
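If you instead wanted one average per stock, a minimal sketch of the same stage grouping on the code field from your sample documents would be:
{
  $group: {
    _id: '$code',             // one result document per stock code
    avg: { $avg: '$price' }
  }
}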
FYI: I wouldn't rely on calculations done in JavaScript for anything critical, as JavaScript Numbers use floating point. See https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html for more details if you are worried about that.
For completeness, here is the full aggregation query:
const Stock = require("./models/stocks");
Stock.aggregate([
  {
    $match: {
      date: {
        $gte: new Date('2014-07-03'),
        $lte: new Date('2014-07-07')
      }
    }
  },
  {
    $group: {
      _id: null,
      avg: { $avg: '$price' }
    }
  }
])
.then(console.log)
.catch(error => console.error(error))
Not sure about your moving average formula, but here is how I would do it:
var moving_average = null;
db.test.find().forEach(function(doc) {
  if (moving_average == null) {
    moving_average = doc.price;
  } else {
    moving_average = (moving_average + doc.price) / 2;
  }
});
output:
> moving_average
0.3
And if you want to define the N days to average over, just modify the argument to find:
db.test.find({ "date": { $gt: "2014-07-07T00:00:00.000Z", $lt: "2014-07-10T00:00:00.000Z" } })
And if you want to write the above shell code as a one-liner, you can assume that moving_average starts out undefined and just check for that before assigning the first value.
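For what it's worth, on MongoDB 5.0+ there is also $setWindowFields, which can compute a true per-document moving average entirely on the server. A minimal sketch, using the collection and field names from the sample documents above (the 3-document window is an assumption; adjust it to your N):
db.stocks.aggregate([
  { $match: { code: '0001' } },
  {
    $setWindowFields: {
      partitionBy: '$code',
      sortBy: { date: 1 },
      output: {
        // average of the current document and the 2 preceding ones (a 3-day window)
        sma: { $avg: '$price', window: { documents: [-2, 0] } }
      }
    }
  }
])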

Mongodb, get registers from last registered day

I have a collection with many documents structured:
{
  "_id" : "skkbbp8TgnT3a2XgT",
  // ... other fields
  "createdAt" : ISODate("2015-12-08T21:03:37.141Z")
}
How can I find all the documents from the last registered day? And from the last five registered days?
Is it possible to do with only one statement?
Edit: Not duplicated. My question is to get all data from the last registered day, not how to query with dates.
So you need to pull the last registered date, chop off the time portion, and then compose a $gte query using just that date. Using momentjs makes it pretty simple. Assuming your collection is called "Foo":
var lastDate = Foo.findOne({}, {
  sort: {
    createdAt: -1
  }
}).createdAt;

var startOf = moment(lastDate).startOf("day").toDate();

Foo.find({
  createdAt: {
    $gte: startOf
  }
});
So you can do it in one line, but not recommended :)
Foo.find({
  createdAt: {
    $gte: moment(Foo.findOne({}, { sort: { createdAt: -1 } }).createdAt).startOf('day').toDate()
  }
});
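The question also asks about the last five registered days. Here is a minimal sketch, assuming "last five days" means the five calendar days ending on the most recently registered day (not five distinct days that actually contain records):
var lastDate = Foo.findOne({}, { sort: { createdAt: -1 } }).createdAt;
// start of the day four days before the last registered day,
// so the range covers five calendar days in total
var fiveDaysStart = moment(lastDate).startOf("day").subtract(4, "days").toDate();

Foo.find({
  createdAt: {
    $gte: fiveDaysStart
  }
});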

Publish all fields in document but just part of an array in the document

I have a mongo collection in which the documents have a field that is an array. I want to be able to publish everything in the documents except for the elements in the array that were created more than a day ago. I suspect the answer will be somewhat similar to this question.
Meteor publication: Hiding certain fields in an array document field?
Instead of limiting fields in the array, I just want to limit the elements in the array being published.
Thanks in advance for any responses!
EDIT
Here is an example document:
{
  _id: 123456,
  name: "Unit 1",
  createdAt: (datetime object),
  settings: *some stuff*,
  packets: [
    {
      _id: 32412312,
      temperature: 70,
      createdAt: *datetime object from today*
    },
    {
      _id: 32412312,
      temperature: 70,
      createdAt: *datetime from yesterday*
    }
  ]
}
I want to get everything in this document except for the part of the array that was created more than 24 hours ago. I know I can accomplish this by moving the packets into their own collection and tying them together with keys as in a relational database but if what I am asking were possible, this would be simpler with less code.
You could do something like this in your publish method:
Meteor.publish("pubName", function() {
  var collection = Collection.find().fetch(); // change this to return your data
  var deadline = Date.now() - 86400000; // 24 hours ago, in milliseconds

  _.each(collection, function(collectionItem) {
    // keep only the packets created within the last 24 hours
    collectionItem.packets = _.filter(collectionItem.packets, function(packet) {
      return packet.createdAt >= deadline;
    });
  });

  return collection;
});
Though you might be better off storing the last 24 hours worth of packets as a separate array in your document. Would probably be less taxing on the server, not sure.
Also, code above is untested. Good luck.
you can use the $elemMatch projection
http://docs.mongodb.org/manual/reference/operator/projection/elemMatch/
So in your case, it would be
var today = new Date();
var yesterday = new Date(today);
yesterday.setDate(today.getDate() - 1);

collection.find({}, // find anything or something specific
  {
    fields: {
      packets: {
        $elemMatch: { createdAt: { $gt: yesterday /* or some other Date */ } }
      }
    }
  }
);
However, $elemMatch only returns the FIRST element matching your condition. To return more than 1 element, you need to use the aggregation framework, which will be more efficient than _.each or forEach, particularly if you have a large array to loop through.
collection.rawCollection().aggregate([
  {
    $match: {}
  },
  {
    $redact: {
      $cond: {
        if: { $or: [ { $gt: [ "$createdAt", yesterday ] }, "$packets" ] },
        then: "$$DESCEND",
        else: "$$PRUNE"
      }
    }
  }
], function (error, result) {
});
You specify the $match in a way similar to find({}). All the documents that match your conditions then get piped into $redact, which is controlled by the $cond expression.
$redact scans each document from the top level down. At the top level you have _id, name, createdAt, settings and packets; hence {$or: [***,"$packets"]}.
The presence of $packets in the $or allows $redact to descend to the second level, which contains the packet _id, temperature and createdAt; hence {$gt: ["$createdAt", yesterday]}.
This is async, you can use Meteor.wrapAsync to wrap around the function.
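For example, a rough sketch of wrapping it (untested, and it assumes a driver version where aggregate accepts an (error, result) callback, as in the snippet above):
// synchronous-style aggregate inside a Meteor method or publication
var rawColl = collection.rawCollection();
var aggregateSync = Meteor.wrapAsync(rawColl.aggregate, rawColl);

var result = aggregateSync([
  { $match: {} },
  { $redact: { /* same $cond as above */ } }
]);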
Hope this helps.
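As a side note, on MongoDB 3.2+ the $filter operator in an aggregation can trim the array more directly than $redact. A minimal sketch, assuming the document shape from the question:
collection.rawCollection().aggregate([
  {
    $project: {
      name: 1,
      createdAt: 1,
      settings: 1,
      packets: {
        $filter: {
          input: "$packets",
          as: "packet",
          // keep only packets created within the last day
          cond: { $gt: [ "$$packet.createdAt", yesterday ] }
        }
      }
    }
  }
], function (error, result) {
});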

Aggregate MongoDB results by ObjectId date

How can I aggregate my MongoDB results by ObjectId date? Example:
Default cursor results:
cursor = [
  {'_id': ObjectId('5220b974a61ad0000746c0d0'), 'content': 'Foo'},
  {'_id': ObjectId('521f541d4ce02a000752763a'), 'content': 'Bar'},
  {'_id': ObjectId('521ef350d24a9b00077090a5'), 'content': 'Baz'},
]
Projected results:
projected_cursor = [
  {'2013-09-08': [
    {'_id': ObjectId('5220b974a61ad0000746c0d0'), 'content': 'Foo'},
    {'_id': ObjectId('521f541d4ce02a000752763a'), 'content': 'Bar'}
  ]},
  {'2013-09-07': [
    {'_id': ObjectId('521ef350d24a9b00077090a5'), 'content': 'Baz'}
  ]}
]
This is what I'm currently using in PyMongo to achieve these results, but it's messy and I'd like to see how I can do it using MongoDB's aggregation framework (or even MapReduce):
cursor = db.find({}, limit=10).sort("_id", pymongo.DESCENDING)
messages = [x for x in cursor]
this_date = lambda x: x['_id'].generation_time.date()
dates = set([this_date(message) for message in messages])
dates_dict = {date: [m for m in messages if this_date(m) == date] for date in dates}
And yes, I know that the easiest way would be to simply add a new date field to each record then aggregate by that, but that's not what I want to do right now.
Thanks!
Update: There is a built in way to do this now, see https://stackoverflow.com/a/51766657/295687
There is no way to accomplish what you're asking with MongoDB's aggregation framework, because there is no aggregation operator that can turn ObjectIds into something date-like (there is a JIRA ticket, though). You should be able to accomplish what you want using map-reduce, however:
// map function
function domap() {
  // turn ObjectId --> ISODate
  var date = this._id.getTimestamp();
  // format the date however you want
  var year = date.getFullYear();
  var month = date.getMonth() + 1; // getMonth() is zero-based
  var day = date.getDate();
  // yields date string as key, entire document as value
  emit(year + "-" + month + "-" + day, this);
}

// reduce function
function doreduce(datestring, docs) {
  return { "date": datestring, "docs": docs };
}
The Jira Ticket pointed out by llovett has been solved, so now you can use date operators like $isoWeek and $year to extract this information from an ObjectId.
Your aggregation would look something like this:
{
  "$project": {
    "_id": {
      "$dateFromParts": {
        "year": { "$year": "$_id" },
        "month": { "$month": "$_id" },
        "day": { "$dayOfMonth": "$_id" }
      }
    }
  }
}
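To actually group the documents by that date, a minimal sketch of a full pipeline could look like the following (the collection name db.messages is hypothetical, the content field comes from the example documents, and it assumes a MongoDB version where the date operators accept an ObjectId, as described above):
db.messages.aggregate([
  {
    "$project": {
      "content": 1,
      "day": {
        "$dateFromParts": {
          "year": { "$year": "$_id" },
          "month": { "$month": "$_id" },
          "day": { "$dayOfMonth": "$_id" }
        }
      }
    }
  },
  {
    "$group": {
      "_id": "$day",
      "docs": { "$push": { "_id": "$_id", "content": "$content" } }
    }
  },
  { "$sort": { "_id": -1 } }
])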
So this doesn't answer my question directly, but I did find a better way to replace all that lambda nonsense above using Python's setdefault:
d = {}
for message in messages:
    key = message['_id'].generation_time.date()
    d.setdefault(key, []).append(message)
Thanks to #raymondh for the hint in his PyCon talk:
Transforming Code into Beautiful, Idiomatic Python

Mongo add timestamp field from existing date field

I currently have a collection with documents like the following:
{ foo: 'bar', timeCreated: ISODate("2012-06-28T06:51:48.374Z") }
I would now like to add a timestampCreated key to the documents in this collection, to make querying by time easier.
I was able to add the new field with an update and a $set operation and set a timestamp value, but it appears to be setting the current timestamp when I use this:
db.reports.update({}, {
  $set: {
    timestampCreated: new Timestamp(new Date('$.timeCreated'), 0)
  }
}, false, true);
However, I have not been able to figure out a way to add this field and set its value to the timestamp of the existing timeCreated field.
Do a find for all the documents, limiting to just the id and timeCreated fields. Then loop over that and generate the timestampCreated value, and do an update on each.
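A rough sketch of that loop in the mongo shell (untested; it assumes timeCreated is a Date and stores the timestamp as milliseconds since the epoch):
db.reports.find({ timeCreated: { $exists: true } }, { timeCreated: 1 }).forEach(function (doc) {
  db.reports.updateOne(
    { _id: doc._id },
    { $set: { timestampCreated: doc.timeCreated.getTime() } } // ms since epoch
  );
});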
Use updateMany(), which can accept an aggregation pipeline (starting from MongoDB 4.2) and thus take advantage of the $toLong operator, which converts a Date into the number of milliseconds since the epoch.
Also use a $type query in the update filter to limit the update to documents that have a timeCreated field of Date type:
db.reports.updateMany(
  {
    'timeCreated': {
      '$exists': true,
      '$type': 9   // 9 is the BSON type number for Date; '$type': 'date' also works
    }
  },
  [
    {
      '$set': {
        'timestampCreated': { '$toLong': '$timeCreated' }
      }
    }
  ]
)