Why are start_time and end_time 0 in the Grid Engine accounting logs for a job? - scheduler

Just for reference:
For job1, the "start_time" is 0, "end_time" is 0, "failed_code" is 26, and "exit_status" is 0 in the accounting logs for that job.
For job2, the "start_time" is 0, "end_time" is 0, "failed_code" is 19, and "exit_status" is 0 in the accounting logs for that job.
I'm not sure why it's showing 0 in "start_time" and "end_time".
What are the possible reasons for this?

Related

Not getting data up to today's date in MongoDB using $gte

What I want is to get the users who created their account within a range of dates, and also to get the customers among those new users whose order count is greater than 1. So what I am doing is trying to get the data greater than the start of last month, i.e. 1st of March 2020, but the output is only giving me users up to 1 April. Why not up to today, i.e. 11 April? The data is in the following format:
"_id" : ObjectId("1dv47sd1a10048521sa1234d"),
"updatedAt" : ISODate("2020-04-01T16:19:26.460+05:30"),
"createdAt" : ISODate("2020-04-01T16:18:46.066+05:30"),
"email" : "edx#gmail.com",
"phone" : "xxxxxxxxxx",
"password" : "$awdad$2b$10$4YaO6AEZqXA9ba0iz14ALi",
"dob" : "00/20/1990",
"stripeID" : "xxxxxxxxxxxxxxxx",
"__t" : "Customer",
"banned" : false,
"picture" : "link to image",
"name" : {
"first" : "ababab",
"last" : "Saaa"
},
"orderCount" : 2,
"creditCards" : [ ],
"addresses" : [ ],
"__v" : 0,
"isEmailVerified" : true
I have written a query to extract the data from dates greater than last month, but it is only giving me data up to the 1st of April. My query is as follows:
db.users.find({
    "createdAt" : { "$gte" : new Date("2020-03-1") }
})
So I want to get the data from 1 March 2020 up to today, where the order count is also greater than 1. Thanks in advance, I am pretty new to Mongo.
MongoDB only stores absolute times (aka timestamps or datetimes), not dates. As such, the queries should generally also be specified using times, not dates.
Furthermore, MongoDB always stores times in UTC. Different systems can operate in different timezones. For example, the shell may be operating in a different timezone from your application.
To query using dates:
Determine what timezone the dates are in.
Convert dates to times in that timezone. For example, 2020-03-01 00:00:00 UTC - 2020-03-31 23:59:59 UTC and 2020-03-01 00:00:00 -0400 - 2020-03-31 23:59:59 -0400 are both "March" and would match different sets of documents.
Convert the times to UTC. This may be something that a framework or library you are using handles for you; for example, Mongoid in Ruby does this.
Use the UTC times to construct MongoDB queries.
This is stated in the MongoDB documentation.
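For illustration, here is a minimal sketch of such a query with the bounds written as explicit times (the +05:30 offset is taken from the sample document's createdAt values, and the 12 April upper bound for "today is 11 April" is an assumption):
// The shell converts ISODate strings with an explicit offset to UTC before querying.
db.users.find({
    "createdAt" : {
        "$gte" : ISODate("2020-03-01T00:00:00+05:30"),  // start of 1 March 2020, local time
        "$lt"  : ISODate("2020-04-12T00:00:00+05:30")   // start of the day after 11 April 2020
    },
    "orderCount" : { "$gt" : 1 }  // only customers with more than one order
})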

Get weekly changes (progress)

I have the following objects in the Games collection:
{
    _id : 1,
    UserID : 420,
    MaxValue : 235.23,
    Date : ISODate("2017-08-02T00:23:44.553Z")
},
{
    _id : 2,
    UserID : 420,
    MaxValue : 452.23,
    Date : ISODate("2017-08-06T00:23:44.553Z")
},
{
    _id : 3,
    UserID : 420,
    MaxValue : 281.02,
    Date : ISODate("2017-08-15T00:23:44.553Z")
}
The question is: how can I get the percentage of MaxValue progress over the week (i.e. weekly progress)? For example, if we take the _id: 3 object's Date field and compute user 420's progress, it will look like this:
Progress over the last week in percentage (current date: Aug 6, 2017) = (452.23 - 235.23) / 235.23 = ~92%
Average progress over the last week (current date: Aug 6, 2017) = (452.23 + 235.23) / 2 = 343.73
Then, for the next week, we take the computed average of the previous week. It will look like:
Progress over the last week (current date: Aug 15, 2017) = (281.02 - 452.23 ) / 343.73 = - 49%
Average progress over the last week (current date: Aug 15, 2017) = ( 281.02 + 343.73 ) / 2 = 312.375
Hence, this is what it looks like if we compute the progress manually (by hand).
So the question is: how can I do that using MongoDB aggregation / mapReduce? And, for this specific measurement (weekly progress), should I create a separate collection for it, or rather store it in the Games collection?
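To be concrete, I imagine the grouping step could look something like the sketch below (using ISO weeks, which is just an assumption on my part; it needs MongoDB 3.4+ for the ISO-week operators), but I don't see how to carry the previous week's average forward from there:
// Sketch: group each user's games by ISO week, keeping the weekly max and average,
// so that consecutive weeks can then be compared to compute the progress percentage.
db.Games.aggregate([
    { $group : {
        _id : { user : "$UserID", year : { $isoWeekYear : "$Date" }, week : { $isoWeek : "$Date" } },
        weeklyMax : { $max : "$MaxValue" },
        weeklyAvg : { $avg : "$MaxValue" }
    } },
    { $sort : { "_id.year" : 1, "_id.week" : 1 } }
])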

Insertion of 2016-08-29-23.17.147253 timestamp data into a DB2 table

How can I insert 2016-08-29-23.17.147253 into a DB2 timestamp column?
INSERT INTO TEST VALUES (2015,2016-08-29-23.17.147253, 5000 , 0, 0, 0, 0, 0, 0, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP )
CURRENT_TIMESTAMP works fine, but 2016-08-29-23.17.147253 does not.
Your timestamp format seems incorrect. A DB2 timestamp must be "a seven-part value representing a date and time by year, month, day, hour, minute, second, and microsecond". Also, enclose the timestamp value in single quotes in the INSERT statement. For more information, check:
http://www.ibm.com/support/knowledgecenter/SSEPEK_10.0.0/intro/src/tpc/db2z_datetimetimestamp.html
Try it with quotes, like this:
INSERT INTO TEST VALUES (2015, '2016-08-29-23.17.147253', 5000 , 0, 0, 0, 0, 0, 0, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP )

Finding a document based on a date range works in the mongo shell, but not with PyMongo

In the mongo shell this returns a document just fine:
> db.orderbook_log.findOne({'time': { '$gte': new Date(2014, 9, 24, 17, 38, 20, 546000), '$lt': new Date(2014, 10, 24, 17, 39, 20, 546000)}})
//... returns document with this time stamp:
"time" : ISODate("2014-10-25T00:47:30.819Z")
Notice I used "9" for October because JavaScript's months are 0-11.
And I also tested with "23" as the day because it looks like JS days are also 0-indexed, and that also returned a document: "time" : ISODate("2014-10-24T17:32:13.595Z")
In Python with PyMongo, I build what should be the same query:
atime = datetime.datetime(2014, 10, 24, 17, 38, 20, 546000)
btime = datetime.datetime(2014, 10, 24, 17, 39, 20, 546000)
future_book = log.find_one({"time": {"$gte": atime, "$lt": btime}})
But when I execute find_one in PyMongo, future_book is None.
All I'm really trying to do is loop though the first 100 records or so and get a record that occurred a relative minute later.
Javascript days are not zero-indexed, alas. Only months are.
I see in your Javascript you're passing 546,000 ms to the Date constructor, which adds about nine minutes, so the second date becomes 2014-11-24 at 17:48:26 (month index 10 is November). Javascript interprets those arguments in your local timezone and the shell displays the result in UTC, so in my case it adds 5 hours:
> new Date(2014, 10, 24, 17, 39, 20, 546000)
ISODate("2014-11-24T22:48:26Z")
This is then compared (ignoring timezones) with the "time" field in your documents.
Better to remove the final milliseconds argument, and use the MongoDB shell's ISODate function, which is designed to be more rational than Javascript Dates:
> ISODate('2014-10-24T17:38:20')
ISODate("2014-10-24T17:38:20Z")
That will then compare to your documents in the way you expect, and it should match the PyMongo behavior. I suggest you drop the milliseconds argument from your Python datetime constructors, too, to clarify things.
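For example, the original shell query rewritten with ISODate bounds and without the milliseconds (a sketch reusing the collection, field, and time values from the question) would be:
// ISODate strings without an offset are interpreted as UTC; PyMongo sends naive
// datetimes unchanged, so the two queries now cover the same range.
db.orderbook_log.findOne({
    "time": {
        "$gte": ISODate("2014-10-24T17:38:20"),
        "$lt":  ISODate("2014-10-24T17:39:20")
    }
})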

Schema design for accumulated data in MongoDB

I've been searching and studying a lot about how I can do this... I've read many similar use cases, but I need something more complicated.
I will need to collect this information:
** Report Between April 2013 and Sept 2013 **
1- Monthly non-accumulated report:
group1: {
    "04" : {"photos" : 1, "documents" : 2, "likes" : 0},
    ...
    "09" : {"photos" : 0, "documents" : 3, "likes" : 10}
},
group2: ...
My group1 had 1 photo, 2 documents and 0 likes in this month (04).
My group2 ...
2- Monthly accumulated report:
group1: {
    "04" : {"photos" : 10, "documents" : 20, "likes" : 500},
    ...
    "09" : {"photos" : 100, "documents" : 200, "likes" : 3000}
},
group2: ...
My group1 has a total of 10 photos, 20 documents and 500 likes from when the group was created until the end of the searched date range.
And I will have a daily non-accumulated report and a daily accumulated report too.
The accumulated reports show how many photos, documents and likes my group had in total as of that month/day, like a historical report.
The non-accumulated reports show how many photos, documents and likes my group got only in that month/day.
The many use cases I saw are about page views and analytics like Google's... but none of them have the accumulated data...
Does anyone know a better schema design for my situation?
Thanks,
Diego
I got it!
I created 3 collections:
accumulated_stats: :_id, :photos, :documents
monthly_stats: :_id, date, :counts {:photos, :documents}, accumulated {:photos, :documents}
daily_stats: :_id, date, :counts {:photos, :documents}, accumulated {:photos, :documents}
Now, for each action, I increment accumulated_stats, get the results, and increment the daily and monthly stats. It works well!
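A minimal sketch of what one such update cycle might look like in the shell (the group id, the date keys, and the findAndModify/upsert pattern are assumptions, not the original code):
// Hypothetical example: one new photo for a group on 2013-09-10.
var groupId = 42;  // assumed group identifier

// Bump the running total and read it back in one step.
var totals = db.accumulated_stats.findAndModify({
    query : { _id : groupId },
    update : { $inc : { photos : 1 } },
    new : true,
    upsert : true
});

// Record the per-period count and the running total for the day and the month.
db.daily_stats.update(
    { _id : groupId + ":2013-09-10" },
    { $inc : { "counts.photos" : 1 },
      $set : { date : ISODate("2013-09-10"), "accumulated.photos" : totals.photos } },
    { upsert : true }
);
db.monthly_stats.update(
    { _id : groupId + ":2013-09" },
    { $inc : { "counts.photos" : 1 },
      $set : { date : "2013-09", "accumulated.photos" : totals.photos } },
    { upsert : true }
);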
But when showing the data I won't have every date... for instance:
2013-09-10
2013-09-11 (this day is missing)
2013-09-12
2013-09-13
I'm looking for a way to construct the complete date range. For instance, I don't have day 11 in my collection; how can I fill it in with the last existing values? Any ideas?
Thanks,
Diego