I am trying to insert a birthday with the Date type in MongoDB, like this:
db.employees.insertOne({
    "Name": "Alison Davison",
    birthday: Date("Apr 05, 1975"),
    "Address": "874 W. Oak Place",
    "City": "Gary",
    "State": "Indiana",
    Position: { Name: "Customer Support", Remote: true, Full Time: true }
});
And it is throwing this error:
uncaught exception: SyntaxError: missing : after property id :
#(shell):7:63
You need to use a supported format in the Date constructor, and call it with new so that it returns a Date object instead of a string.
new Date("<YYYY-mm-dd>") returns the ISODate with the specified date.
new Date("<YYYY-mm-ddTHH:MM:ss>") specifies the datetime in the client’s local timezone and returns the ISODate with the specified datetime in UTC.
new Date("<YYYY-mm-ddTHH:MM:ssZ>") specifies the datetime in UTC and returns the ISODate with the specified datetime in UTC.
new Date(<integer>) specifies the datetime as milliseconds since the Unix epoch (Jan 1, 1970), and returns the resulting ISODate instance.
The format you tried is not supported by default.
Refer to the MongoDB documentation on Date() for details.
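For example, in the mongo shell:

new Date("1975-04-05")              // ISODate("1975-04-05T00:00:00Z")
new Date("1975-04-05T10:30:00Z")    // ISODate("1975-04-05T10:30:00Z")
new Date(165900000000)              // milliseconds since the Unix epoch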
Also, you have a space in your key Full Time, so enclose it in quotes. Please use the document below:
{
    "Name": "Alison Davison",
    "birthday": new Date("1975-04-05"),
    "Address": "874 W. Oak Place",
    "City": "Gary",
    "State": "Indiana",
    "Position": {
        "Name": "Customer Support",
        "Remote": true,
        "Full Time": true
    }
}
I am using the Google Tasks API and getting the task list from the server. I created three tasks with different due dates and times. I am getting the correct date for every task, but the same due time for all of them. Can you please explain why this is happening?
Output:
{
"kind": "tasks#tasks",
"etag": "*********",
"items": [
{
"kind": "tasks#task",
"id": "******",
"etag": "******",
"title": "Task 2",
"updated": "2021-01-29T14:40:36.000Z",
"selfLink": "******",
"position": "00000000000000000001",
"status": "needsAction",
"due": "2021-01-30T00:00:00.000Z"
},
{
"kind": "tasks#task",
"id": "*********",
"etag": "*******",
"title": "Task 4",
"updated": "2021-01-29T13:18:51.000Z",
"selfLink": "*******",
"position": "00000000000000000000",
"status": "needsAction",
"due": "2021-01-30T00:00:00.000Z"
},
{
"kind": "tasks#task",
"id": "***********",
"etag": "*************",
"title": "Task 1",
"updated": "2021-01-29T13:08:39.000Z",
"selfLink": "*******",
"position": "00000000000000000002",
"status": "needsAction",
"due": "2021-01-29T00:00:00.000Z"
}
]
}
Based on the Resource: tasks reference, field due:
Due date of the task (as a RFC 3339 timestamp). Optional. The due date only records date information; the time portion of the timestamp is discarded when setting the due date. It isn't possible to read or write the time that a task is due via the API.
So the Google Tasks API records only the date, not the time, of the due field. This is stated in the official Tasks API documentation for the tasks resource.
I have the following document:
{
"_id": "5d646d78a6e3fe3834e448e4",
"uniqSessionId": "2ce971c7-337b-0285-b569-a555659a1fc7",
"userDeviceInformation": {
"os": "Windows",
"osVersion": "10",
"browser": "Chrome",
"browserVersion": "76.0.3809.100",
"browserMajorVersion": 76,
"screen": "3840 x 1200",
"mobile": false
},
"mark": "show-main-image",
"startDate": "1566862710321",
"endDate": "1566862711787",
"duration": 1466.4250000023
}
startDate and endDate are epoch timestamps.
How do I query? (I see part of my problem is mixing up date formats)
I have tried:
{"startDate":{$gt:Date(1503781020)}}
I am testing using MongoDB Compass Community.
I just upgraded from MongoDB 3.6 to 4.2
Gina
The startDate and endDate values are strings. Thus we need to convert the value being searched for to a string before comparison.
The following is an example:
db.collection.findOne({"startDate":{$gte: 1566862710322+""}})
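Note that this relies on lexicographic string comparison, which only behaves like a numeric comparison while both timestamps have the same number of digits. Since you are on MongoDB 4.2, a sketch of a truly numeric comparison using $expr with $toLong (available since 4.0) would be:

db.collection.findOne({ $expr: { $gte: [ { $toLong: "$startDate" }, NumberLong("1566862710322") ] } })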
OK, it looks like you have to force it to convert to a long, so I multiplied by 1:
{"startDate":{$gt:ISODate("2018-03-30T13:06:05.739-07:00")*1}}
Gina
From a Python point of view, you can use the code below (note that PyMongo uses find_one rather than findOne):
db.collection.find_one({"startDate": {"$gte": str(1566862710322)}})
Hi guys, I am currently working with the GSuite Admin SDK Reports API. I am successfully able to send the request and get the response.
Now, the issue is that I am not able to identify the date format returned by Activities.list().
Here is a snippet:
"events": [
{
"type": "event_change",
"name": "create_event",
"parameters": [
{
"name": "event_id",
"value": "jdlvhwrouwovhuwhovvwuvhw"
},
{
"name": "organizer_calendar_id",
"value": "abc#xyz.com"
},
{
"name": "calendar_id",
"value": "abc#xyz.com"
},
{
"name": "target_calendar_id",
"value": "abc#xyz.com"
},
{
"name": "event_title",
"value": "test event 3"
},
{
"name": "start_time",
"intValue": "63689520600"
},
{
"name": "end_time",
"intValue": "63689524200"
},
{
"name": "user_agent",
"value": "Mozilla/5.0"
}
]
}
]
Note: please have a look at start_time and end_time and let me know if you have any idea what format they are in. I can share any other information that is needed.
I ran into this same question when parsing google calendar logs.
The time format they use is the number of seconds since January 1st, 0001 (0001-01-01).
I never found documentation where they referenced that time format. Google uses this instead of epoch for some of their app logs.
You can find an online calculator here https://www.epochconverter.com/seconds-days-since-y0
Use the one under "Seconds Since 0001-01-01 AD" and not the one under year zero.
So your start_time of "63689520600" converts to March 30, 2019 5:30:00 AM GMT.
If you want start_time in epoch time, subtract 62135596800 seconds from the number: 62135596800 is the value of January 1, 1970 12:00:00 AM when counting seconds since 0001-01-01, so subtracting it from start_time gives you the number of seconds since January 1, 1970 12:00:00 AM, aka epoch time.
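A sketch of the conversion in JavaScript, using the offset above:

// Seconds between 0001-01-01 and the Unix epoch (1970-01-01)
const GOOGLE_TO_UNIX_OFFSET = 62135596800;

function googleSecondsToDate(googleSeconds) {
    // Shift to the Unix epoch, then convert seconds to milliseconds
    return new Date((googleSeconds - GOOGLE_TO_UNIX_OFFSET) * 1000);
}

console.log(googleSecondsToDate(63689520600).toISOString()); // "2019-03-30T05:30:00.000Z"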
Hope this helps.
Is it possible to have a Druid datasource with 2 (or multiple) timestamps in it?
I know that Druid is a time-based DB and I have no problem with the concept, but I'd like to add another dimension that I can work with as a timestamp.
E.g. user retention: the metric is of course tied to a certain date, but I also need to create cohorts based on users' registration dates, roll those dates up to weeks or months, or filter to only certain time periods...
If the functionality is not supported, are there any plug-ins? Any dirty solutions?
Although I'd rather wait for official full support of timestamp dimensions in Druid, I've found the 'dirty' hack I've been looking for.
DataSource Schema
First things first, I wanted to know how many users logged in each day, while being able to aggregate by date/month/year cohorts.
Here's the data schema I used:
"dataSchema": {
"dataSource": "ds1",
"parser": {
"parseSpec": {
"format": "json",
"timestampSpec": {
"column": "timestamp",
"format": "iso"
},
"dimensionsSpec": {
"dimensions": [
"user_id",
"platform",
"register_time"
],
"dimensionExclusions": [],
"spatialDimensions": []
}
}
},
"metricsSpec": [
{ "type" : "hyperUnique", "name" : "users", "fieldName" : "user_id" }
],
"granularitySpec": {
"type": "uniform",
"segmentGranularity": "HOUR",
"queryGranularity": "DAY",
"intervals": ["2015-01-01/2017-01-01"]
}
},
so the sample data should look something like this (each record is a login event):
{"user_id": 4151948, "platform": "portal", "register_time": "2016-05-29T00:45:36.000Z", "timestamp": "2016-06-29T22:18:11.000Z"}
{"user_id": 2871923, "platform": "portal", "register_time": "2014-05-24T10:28:57.000Z", "timestamp": "2016-06-29T22:18:25.000Z"}
as you can see, my "main" timestamp, against which I calculate these metrics, is the timestamp field, while register_time is only a dimension stored as a string in ISO 8601 UTC format.
Aggregating
And now, for the fun part: I've been able to aggregate by timestamp (date) and register_time (date again) thanks to the Time Format Extraction Function.
The query looks like this:
{
"intervals": "2016-01-20/2016-07-01",
"dimensions": [
{
"type": "extraction",
"dimension": "register_time",
"outputName": "reg_date",
"extractionFn": {
"type": "timeFormat",
"format": "YYYY-MM-dd",
"timeZone": "Europe/Bratislava" ,
"locale": "sk-SK"
}
}
],
"granularity": {"timeZone": "Europe/Bratislava", "period": "P1D", "type": "period"},
"aggregations": [{"fieldName": "users", "name": "users", "type": "hyperUnique"}],
"dataSource": "ds1",
"queryType": "groupBy"
}
Filtering
The solution for filtering is based on the JavaScript Extraction Function, with which I can transform the date to UNIX time and use it inside (for example) a bound filter:
{
"intervals": "2016-01-20/2016-07-01",
"dimensions": [
"platform",
{
"type": "extraction",
"dimension": "register_time",
"outputName": "reg_date",
"extractionFn": {
"type": "javascript",
"function": "function(x) {return Date.parse(x)/1000}"
}
}
],
"granularity": {"timeZone": "Europe/Bratislava", "period": "P1D", "type": "period"},
"aggregations": [{"fieldName": "users", "name": "users", "type": "hyperUnique"}],
"dataSource": "ds1",
"queryType": "groupBy"
"filter": {
"type": "bound",
"dimension": "register_time",
"outputName": "reg_date",
"alphaNumeric": "true"
"extractionFn": {
"type": "javascript",
"function": "function(x) {return Date.parse(x)/1000}"
}
}
}
I've tried to filter it 'directly' with a JavaScript filter, but I haven't been able to convince Druid to return the correct records, although I've double-checked the function with various JavaScript REPLs. But hey, I'm no JavaScript expert.
Unfortunately, Druid has only one timestamp column that can be used to do rollup. Currently Druid treats all other columns as strings (except metrics, of course), so you can add another string column with timestamp values, but the only thing you can do with it is filtering.
I guess you might be able to hack it that way.
Hopefully in the future Druid will allow different types of columns, and maybe timestamp will be one of those.
Another solution is to add a longMin-style metric for the timestamp and store the epoch time in that field, or to convert the datetime to a number and store that (e.g. 31st March 2021 08:00 as 310320210800).
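A sketch of what such a metric might look like in the metricsSpec (assuming the epoch value arrives in an input field named register_time_epoch; both names here are made up):

{ "type": "longMin", "name": "register_ts", "fieldName": "register_time_epoch" }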
As of Druid 0.22, the documentation states that secondary timestamps should be handled/parsed as dimensions of type long. Secondary timestamps can be parsed to longs at ingestion time with a transformSpec and transformed back if needed at query time.
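A minimal sketch of such a transform (register_time_long is a made-up output name; timestamp_parse is a built-in Druid expression that returns milliseconds):

"transformSpec": {
  "transforms": [
    { "type": "expression", "name": "register_time_long", "expression": "timestamp_parse(register_time)" }
  ]
}

The resulting register_time_long column would then be declared as a dimension of type long in the dimensionsSpec.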
Facebook recently added timezone less events (https://developers.facebook.com/roadmap/#timezone-less-events) to its Developer roadmap which says
"Since this migration was originally created, we have added a timezone field to events which indicates the name of the timezone (as defined here) where the event is expected to happen. FYI, developers reading time in ISO 8601 should be supporting the full standard when reading event times. Most events return local times (no GMT offset), but in the future events likely will return other formats (namely date-only and precise)."
It works for dates in ISO 8601 format, but if I get dates in epoch format I always get a +7 hrs difference.
e.g.
https://graph.facebook.com/369000383135224 returns
{
"id": "369000383135224",
"owner": {
"name": "Horst Uwe Peter",
"id": "1117563687"
},
"name": "Event in Dublin time 10:25",
"start_time": "2012-05-04T10:25:00",
"end_time": "2012-05-04T11:25:00",
"timezone": "Europe/Dublin",
"location": "Dublin, Ireland",
"venue": {
"id": "110769888951990"
},
"privacy": "FRIENDS",
"updated_time": "2012-05-04T09:27:29+0000",
"type": "event"
}
and
http://graph.facebook.com/369000383135224?date_format=U returns
{
"id": "369000383135224",
"owner": {
"name": "Horst Uwe Peter",
"id": "1117563687"
},
"name": "Event in Dublin time 10:25",
"start_time": 1336152300, <== Fri, 04 May 2012 17:25:00 GMT
"end_time": 1336155900, <== Fri, 04 May 2012 18:25:00 GMT
"timezone": "Europe/Dublin",
"location": "Dublin, Ireland",
"venue": {
"id": "110769888951990"
},
"privacy": "FRIENDS",
"updated_time": 1336123649,
"type": "event"
}
and with FQL using the Graph endpoint:
graph.facebook.com/fql?q=SELECT eid, name, description, location, venue, start_time, end_time, update_time, creator, privacy FROM event WHERE eid = 369000383135224
{
"data": [
{
"eid": 369000383135224,
"name": "Event in Dublin time 10:25",
"description": "",
"location": "Dublin, Ireland",
"venue": {
"id": 110769888951990
},
"start_time": 1336152300, <== Fri, 04 May 2012 18:25:00 GMT
"end_time": 1336155900, <== Fri, 04 May 2012 18:25:00 GMT
"update_time": 1336123649,
"creator": 1117563687,
"privacy": "FRIENDS"
}
]
}
Does that mean the migration works only for ISO 8601 formatted dates, and has no effect on FQL or the epoch date format?
My events on a page I administer have never returned a timezone.
What I have found is that event times entered in the frontend dialog are treated as local times in "America/Los_Angeles" (complete with the US Daylight Savings Time changes, so you'll see +6 in winter and +7 in summer) and are then converted to "UTC" for storage in the database.
For display, I use the following PHP function to show the correct times and note on the page that the times are local to the event's location:
function fb_event_time_convert($fb_time) {
    $origin_dtz = new DateTimeZone('UTC');
    $remote_dtz = new DateTimeZone('America/Los_Angeles');
    // '@<seconds>' tells DateTime to interpret the value as a Unix timestamp
    $fb_time_str = '@' . $fb_time;
    $origin_dt = new DateTime($fb_time_str, $origin_dtz);
    $remote_dt = new DateTime($fb_time_str, $remote_dtz);
    // Difference between the UTC offset and the LA offset at that moment
    $offset = $origin_dtz->getOffset($origin_dt) - $remote_dtz->getOffset($remote_dt);
    return $fb_time - $offset;
}
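For example, with the epoch start_time from the event above:

$local_ts = fb_event_time_convert(1336152300);    // 17:25:00 GMT
echo gmdate('D, d M Y H:i:s', $local_ts);         // Fri, 04 May 2012 10:25:00, the 10:25 entered in the dialog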