What is the largest timestamp parseable with DateTimeFormatter.ISO_DATE_TIME?

I thought that I could use Instant.MAX to represent the maximum timestamp possible but
DateTimeFormatter.ISO_DATE_TIME.parse(Instant.MAX.toString());
raises an exception:
java.time.DateTimeException: Invalid value for Year (valid values -999999999 - 999999999): 1000000000
I understand that for the specific case of Instant.MAX I could use:
Instant.from(DateTimeFormatter.ISO_INSTANT.parse(Instant.MAX.toString()))
but I really do need to use DateTimeFormatter.ISO_DATE_TIME, and I would rather use a constant that represents the maximum timestamp for that parser.
I really have two questions:
Is there some standard constant that I can use to represent the largest timestamp that is still parseable by DateTimeFormatter.ISO_DATE_TIME?
What is the rationale for Instant.MAX being outside the valid range of DateTimeFormatter.ISO_DATE_TIME?

In the case of DateTimeFormatter.ISO_DATE_TIME the maximum and minimum timestamps are OffsetDateTime.MAX and OffsetDateTime.MIN. As for the rationale: Instant.MAX falls in year 1,000,000,000, one past Year.MAX_VALUE (999,999,999), which is the limit for the date-based java.time classes that ISO_DATE_TIME targets; that is exactly what the exception message reports.
OffsetDateTime.MAX.toString() // +999999999-12-31T23:59:59.999999999-18:00
OffsetDateTime.MIN.toString() // -999999999-01-01T00:00+18:00
import java.time.Instant;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
// Parses successfully: OffsetDateTime.MAX is the largest timestamp ISO_DATE_TIME accepts.
Instant.from(DateTimeFormatter.ISO_DATE_TIME.parse(OffsetDateTime.MAX.toString()));

Related

Arrow date in a pandas column

I am using Arrow to get the dates from a dataframe that has the following structure:
data = ['2015', '2016', '2017', '2108']
df = pd.DataFrame(data, columns=['time'])
I know that getting a single date in Arrow works with the following code:
arrow.get('2016')
I have tried to use this:
arrow.get(df['time'])
But it gives me this error: Cannot parse single argument of type <class 'pandas.core.series.Series'>.
How to tell arrow to use the column?
Thanks
Convert the entire series for access later
One option is to use pandas apply on the column. https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.apply.html
df.time = df.time.apply(lambda tm: arrow.get(tm))
There might be a way to do this with converters on load as well, depending on where you are loading from; see the read_csv docs, for example: https://pandas.pydata.org/docs/reference/api/pandas.read_csv.html
I also wonder why you are using Arrow versus pandas' built-in datetime type. Once again, depending on how you are loading this data, dtypes could be used to specify datetime; a minimal sketch of the pandas-native route is shown below.
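For instance (a sketch, assuming the bare-year strings from your example):
import pandas as pd

data = ['2015', '2016', '2017', '2108']
df = pd.DataFrame(data, columns=['time'])

# Parse the whole column at once into pandas' native datetime64 dtype;
# format='%Y' matches the bare-year strings.
df['time'] = pd.to_datetime(df['time'], format='%Y')
print(df['time'].dtype)  # datetime64[ns]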
Convert one value from the series
You need to pass one value instead of all values (i.e. the whole pd.Series):
arrow.get(df.time[1]) would convert '2016' in your example.

Tableau Decimal to String

I have a field which is a Number(decimal) here's one such example value:
1005.44
Now if I try to convert that to a string to simply get what is seen, I'd expect STR to return 1005.44,
but instead I get:
1005.4400000000001
That's from STR(ROUND(([FIELD]),2)).
I see some other posts with similar issues, but no resolution was found.
This artifact isn't unique to Tableau and stems from how underlying databases store floating point numbers and deal with functions like rounding.
You should try the following:
str(int([FIELD]*100)/100)
This will multiply the number by 100, convert to an int, divide by 100, and then convert to a string.
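The effect is easy to reproduce outside Tableau. A minimal Python sketch of what is going on underneath (assuming the field is stored as an IEEE 754 double, which is typical):
from decimal import Decimal

# 1005.44 has no exact binary floating-point representation;
# the double actually stored is:
print(Decimal(1005.44))
# 1005.44000000000005456968210637569427490234375

# Asking for more digits than the shortest round-trip string
# surfaces the artifact from the question:
print(f"{1005.44:.13f}")
# 1005.4400000000001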

DateFormat adding yy-mm-dd to DateTime when I specifically called for hh:mm

The code is as follows:
DateTime time = DateFormat("hh:mm").parse(doc[i]["Time"]);
The doc[i]["Time"] value is supposed to be a military time like 19:42. I followed the guidelines about using DateFormat but the end result variable time contains "1970-01-01 19:42:00.000". Is there a way for time to just contain "19.42"?
To get the time in 24-hour format, change your code to
String time = DateFormat("HH:mm").format(doc[i]["Time"]);
HH is the specifier for the 24-hour format (hh is the 12-hour one).
It is not possible to make a DateTime hold a specific display format; to show a particular format it must always be converted to a String.
A DateTime will always store everything, including year, month, hour, minutes, seconds, milliseconds, etc.
But when you later use the variable, you can choose what to display.
For example, you can choose to display just the military time. Here is one example using the intl package:
Text(DateFormat('Hm').format(yourVariable))
Don't forget to import
import 'package:intl/intl.dart';

How can I create time in proper format to export to a netCDF file in MATLAB?

Data
I am trying to create a time dimension using this:
t1 = datetime(1901,1,1);
t2 = datetime(2016,12,31);
t = t1:t2;
And create a netCDF file using this
nccreate('prec.nc','Prec',...
'Dimensions',{'time' 42369 'lon' 135 'lat' 129},...
'Format', 'netcdf4');
What I have tried
ncwrite('prec.nc', 'time', t);
Error Message
Error using cast
Unsupported data type for conversion: 'datetime'.
Error in internal.matlab.imagesci.nc/write (line 778)
scale_factor = cast(1, class(varData));
Error in ncwrite (line 87)
ncObj.write(varName, varData, start, stride);
Question
How can I create a daily time dimension that I can write out to a netCDF file? What is the proper date type for this conversion?
NetCDF doesn't define a single native way of storing date/time values, but there are established conventions.
There are two strategies for storing a date/time in a netCDF variable. One is to encode it as a numeric value plus a unit that includes the reference time, e.g. "seconds since 2001-1-1 0:0:0" or "days since 2001-1-1 0:0:0". The other is to store it as a string using a standard encoding and calendar. The former is more compact if you have more than one date, and makes it easier to compute intervals between two dates.
So you could:
a) Use datestr to convert it to a string value. The conventional date string format for data interchange is ISO 8601, which you can get in Matlab with datestr(myDateTime, 'yyyy-mm-ddTHH:MM:SS').
b) Convert it to a numeric value representing seconds or days since a reference "epoch" time. I'd suggest using the Unix epoch, since Matlab provides a convenient conversion function for this already: posixtime(myDateTime). Then specify your units for that variable in the NetCDF file as 'seconds since 1970-01-01 00:00:00'.
You probably want to make sure your datetimes are in UTC before encoding them in the NetCDF, so you don't have to worry about time zone issues.
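The question is about MATLAB, but for illustration here is a minimal sketch of option (b) using Python's netCDF4 package (dimension size and file name follow the question; the datetimes are assumed to be UTC):
from datetime import datetime, timedelta
from netCDF4 import Dataset, date2num

# Daily dates from 1901-01-01 through 2016-12-31 (42369 days).
dates = [datetime(1901, 1, 1) + timedelta(days=i) for i in range(42369)]

nc = Dataset('prec.nc', 'w', format='NETCDF4')
nc.createDimension('time', len(dates))
time_var = nc.createVariable('time', 'f8', ('time',))
time_var.units = 'seconds since 1970-01-01 00:00:00'
time_var.calendar = 'standard'
# date2num encodes the datetimes as numbers in the units given above.
time_var[:] = date2num(dates, units=time_var.units, calendar=time_var.calendar)
nc.close()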

MongoImport Dates Occurring Before The Epoch

I am writing a utility at work which converts our relational DB to a complex JSON object and dumps it to files grouped by subject. I would then like to import these files into MongoDB collections using the mongoimport tool.
Our data includes timestamps representing dates before the epoch, whose appropriate JSON representation yields negative numbers. While MongoDB itself handles these fine, the import tool's JSON parser uses unsigned long long variables and fails.
If you use Mongo's special JSON date representation format ({"key": { "$date": "value_in_ticks" } }), the import tool will throw an error on those documents and skip the import. You can also use the JavaScript date notation ({"key": new Date(value_in_ticks) }) which will be successfully imported but parsed as an unsigned value creating a garbage date.
The special date format fails because of an assertion checking for reserved words. This code is reached because the presence of the negative sign at the beginning of the value causes the special date parsing to exit and return to normal document parsing.
The code that parses JSON dates explicitly calls the Boost library's uint_parser. A signed version of this function exists, and an issue on their JIRA tracker already exists to utilize it (on which I commented that I would attempt the change).
Short of diving into the code immediately to try and update this to be signed, is there an alternate route that I can take to load these dates for now?
I want to run this nightly via cron for a few months for testing so I would prefer it be very easy. These dates exist in many different parts of documents in many different collections so the solution should be generalized.
A little late to the party, but I have just come up against the same issue.
My workaround was to import the dates as strings (e.g. "1950-01-01"), and to script the conversion using Ruby on Rails with Mongoid:
Dates.each do |d|
  d.mydate = d.mydate.to_date
  d.save
end
Hopefully you can adapt this to whatever language/framework you are using.
This Python snippet works for me:
import struct
import time

def bson_datetime(adatetime):
    try:
        # Milliseconds since the epoch; the +3600 compensates for a
        # one-hour local timezone offset (adjust or drop as needed).
        ret = int(1000 * (time.mktime(adatetime.timetuple()) + 3600))
        if ret < 0:
            # Reinterpret the signed 64-bit value as unsigned so the
            # import tool's unsigned parser accepts it.
            ret = struct.unpack('Q', struct.pack('q', ret))[0]
        return {'$date': ret}
    except ValueError:
        return None
I.e.
import datetime
print(bson_datetime(datetime.datetime(1950, 12, 30, 0, 0)))
yields {'$date': 18446743473920751616}.
Step 1: go to groups.google.com/group/mongodb-user and post the issue "mongoimport does not support dates before the epoch". Response times on the group tend to be very good.
Step 2: consider storing dates in a universally accepted string format like "1964-04-25 13:23:12".
It will take a little more space in MongoDB because you'll be storing strings, but it should be easy to interpret for anyone pulling out the data.
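A minimal Python sketch of that string round trip (standard library only):
from datetime import datetime

# Zero-padded "YYYY-MM-DD HH:MM:SS" strings sort chronologically even as
# plain strings, and convert back easily once pulled out of MongoDB:
s = "1964-04-25 13:23:12"
dt = datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
print(dt)  # 1964-04-25 13:23:12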