I have to read dates from a database using JPA. I'm using EclipseLink and MySQL.
The date is stored in the database as varchar(10) in the format YYYY-MM-DD. It's not possible to change the database column type.
Is there a way to map this column to some date type (Date, Calendar)?
Is it possible to force the conversion?
With EclipseLink you can use a Converter.
See,
http://www.eclipse.org/eclipselink/documentation/2.4/jpa/extensions/a_converter.htm#CHDEHJEB
A simple @TypeConverter may work; otherwise you can create your own Converter.
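For illustration, here is a minimal sketch of such a custom Converter, assuming the entity attribute is a java.util.Date and the column holds "YYYY-MM-DD" strings (the class name is made up):

import java.text.ParseException;
import java.text.SimpleDateFormat;

import org.eclipse.persistence.mappings.DatabaseMapping;
import org.eclipse.persistence.mappings.converters.Converter;
import org.eclipse.persistence.sessions.Session;

public class StringToDateConverter implements Converter {

    private static final String PATTERN = "yyyy-MM-dd";

    // entity -> database: render the Date back into the "YYYY-MM-DD" string
    @Override
    public Object convertObjectValueToDataValue(Object objectValue, Session session) {
        return objectValue == null
                ? null
                : new SimpleDateFormat(PATTERN).format((java.util.Date) objectValue);
    }

    // database -> entity: parse the "YYYY-MM-DD" string into a java.util.Date
    @Override
    public Object convertDataValueToObjectValue(Object dataValue, Session session) {
        if (dataValue == null) {
            return null;
        }
        try {
            return new SimpleDateFormat(PATTERN).parse((String) dataValue);
        } catch (ParseException e) {
            throw new IllegalArgumentException("Unparseable date: " + dataValue, e);
        }
    }

    @Override
    public boolean isMutable() {
        return false;
    }

    @Override
    public void initialize(DatabaseMapping mapping, Session session) {
        // nothing to set up for this simple converter
    }
}

You would then register it on the entity with @Converter(name = "stringToDate", converterClass = StringToDateConverter.class) and put @Convert("stringToDate") on the mapped field (both annotations are in org.eclipse.persistence.annotations).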
I have Excel data and am trying to insert it into MongoDB using Talend Open Studio for Big Data. This is my job:
tFileInputExcel --> tMap --> tMongoDBOutput
In the Excel sheet, I have a date column in the format 7/13/2017 (MM/dd/yyyy) stored as a string, and I am trying to insert this value into MongoDB in ISO format, as ISODate("2017-07-13T00:00:00.000Z").
When I execute this job, I get an error.
When I change the parse format to TalendDate.parseDate("MM/dd/yyyy", row1.ClosingDate), I get a SimpleDateFormat error.
How can I resolve this issue?
You can do this simply if your MongoDB column's schema type is Date:
TalendDate.parseDate("MM/dd/yyyy",row3.newColumn)
That will automatically convert the date to the date pattern that your MongoDB column has.
You can change the date pattern for the column in your Talend schema to "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'".
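As a plain-Java sketch of what that expression and the schema pattern amount to (this is not Talend-specific code, and the class name is made up):

import java.text.SimpleDateFormat;
import java.util.Date;

public class ClosingDateDemo {
    public static void main(String[] args) throws Exception {
        String closingDate = "7/13/2017"; // value coming from the Excel column

        // what TalendDate.parseDate("MM/dd/yyyy", ...) does
        Date parsed = new SimpleDateFormat("MM/dd/yyyy").parse(closingDate);

        // what the schema date pattern "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" renders on output
        // (the quoted 'Z' is a literal character in this pattern)
        String iso = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").format(parsed);
        System.out.println(iso); // 2017-07-13T00:00:00.000Z
    }
}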
This is a very common mistake: reading data without understanding the underlying data types.
I have blogged about this especially for Talend: https://www.tobiasmaasland.de/2017/07/20/using-date-in-talend-etl-jobs/
But let me explain a bit.
Sometimes Excel converts the data in a cell even if one might think the cell type is set to String. Instead, it is set to Date. In that case, no conversion is needed and the type needs to be Date in the input component.
If it is a String and an error occurs, then the structure of the String is either not the same everywhere or some cells are empty (null). So you might be lucky with
TalendDate.parseDate("MM/dd/yyyy", (row1.ClosingDate == null), "01/01/1970", row1.ClosingDate)
I just assumed you might want to use a placeholder date instead of having null.
This heavily depends on the actual data type in the cells, whether every cell has the same data type, and whether all the data is formatted correctly.
To sum up one of the facts in my blog post: Don't use String for dates. Use Date for dates in Excel. It makes everything easier.
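As a small plain-Java sketch of that null guard (the column name comes from the question, and the placeholder date is just an assumption):

import java.text.SimpleDateFormat;
import java.util.Date;

public class NullSafeParseDemo {
    // mirrors the Talend expression: fall back to a placeholder when the cell is empty
    static Date parseClosingDate(String closingDate) throws Exception {
        String value = (closingDate == null) ? "01/01/1970" : closingDate;
        return new SimpleDateFormat("MM/dd/yyyy").parse(value);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parseClosingDate("7/13/2017")); // parsed Excel value
        System.out.println(parseClosingDate(null));        // 01 Jan 1970 placeholder
    }
}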
I am new to PostgreSQL.
I want to store date as "YYYYMM" in my date column.
So, which data type should I use for my column? I tried to find an answer but couldn't. Any ideas?
I'm new to PostgreSQL and I have the following question:
I have a table with just an id column and a data column, which uses the jsonb type. Inside the jsonb object I have a datetime field. I read in various posts that I should use the ISO 8601 date format for storing it in the DB.
I want to filter my table by date like this:
SELECT * FROM table WHERE data->'date' > '2016-01-01T00:00'
Is this really the best date-format for this purpose?
Thanks in advance :)
IMHO your query should produce
ERROR: operator does not exist: jsonb > timestamp with time zone
if I get it right. If you change -> to ->>, you will get a text value instead of a jsonb field (which is also not comparable to a timestamp).
It should be something like this to work:
SELECT * FROM table WHERE (data->>'date')::timestamptz > '2016-01-01T00:00'
The big advantage of that format is that string order corresponds to date order, so a comparison like the one you quote in your question would actually work as intended.
A second advantage is that a timestamp in that format can easily be converted to a PostgreSQL timestamp with time zone value, because the type input function understands this format.
I hope you are not dealing with dates “before Christ”, because it wouldn't work so easily with those.
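A tiny Java check of the string-order property mentioned above (purely illustrative):

public class IsoOrderDemo {
    public static void main(String[] args) {
        // lexicographic order of ISO 8601 strings matches chronological order
        String earlier = "2015-12-31T23:59";
        String later   = "2016-01-01T00:00";
        System.out.println(earlier.compareTo(later) < 0); // true
    }
}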
I am reading a CSV file with date fields formatted as mm/dd/yyyy. I expected the same format in the Postgres table after the import, but I see yyyy-mm-dd hh:mm:ss.
The date fields in my table are defined as timestamp without time zone data type.
How do I maintain the same format of data? I am using PostgreSQL 9.3.
PostgreSQL only stores the value; it doesn't store formatting (which would waste space).
You can use the to_char function in your query if you'd like the output formatted in a particular way. Details are in the manual.
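For example, a hedged JDBC sketch (connection details, table, and column names are made up) that formats on output with to_char while the stored value stays a plain timestamp:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ToCharDemo {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/mydb"; // hypothetical database
        String sql = "SELECT to_char(created_at, 'MM/DD/YYYY') AS created_fmt FROM my_import";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                // only the query output is formatted; the column itself is unchanged
                System.out.println(rs.getString("created_fmt")); // e.g. 07/13/2017
            }
        }
    }
}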
I have never used DBIx::Class until today, so I'm completely new at it.
I'm not sure if this is possible or not, but basically I have a table in my SQLite database that has a timestamp column in it. The default value for the timestamp column is "CURRENT_TIMESTAMP". SQLite stores this in the GMT timezone, but my server is in the CDT timezone.
My SQLite query to get the timestamp in the correct timezone is this:
select datetime(timestamp, 'localtime') from mytable where id=1;
I am wondering if it is possible, in my DBIx::Class schema for "MyTable", to force it to apply the datetime function every time it retrieves the "timestamp" field from the database.
In the cookbook it looks like this is possible when using the ->search() function, but I am wondering whether I can make it so that any function that pulls this column from the database (search(), find(), all(), find_or_new(), etc.) applies the datetime() SQLite function to it.
DBIx::Class seems to have great documentation - I think I'm just so new at it I'm not finding the right places/things to search for.
Thanks in advance!
I've used InflateColumn::DateTime in this way and with a timestamp, and I can confirm it works, but I wonder if you have this backward.
If your column is in UTC, mark the column UTC, and then it should be a UTC time when you load it. Then when you call set_time_zone on the DateTime (presumably that would be an output issue - it's at output that you care that it's locally zoned) you can set it to local time and it will make the necessary adjustment.