Storing an array of dates (without times) - postgresql

I'm trying to store dates (without times) in a PostgreSQL array.
For context, let's say these are dates on which a supermarket will be closed:
['2022-12-24', '2022-12-25', '2022-12-26']
The table is called opening_times and there is a column called closed_days which has a type of date[]:
table.specificType('closed_days', 'date[]').defaultTo([]).notNullable()
However, when I UPDATE this field using SET closed_days = '{"2022-10-16"}', it seems PostgreSQL is converting it into a full ISO date and time string like this: ["2022-10-15T23:00:00.000Z"]
According to the PostgreSQL docs (section 8.5), the date type is supposed to have a resolution of 1 day, but somehow it is still storing the time. Elsewhere in the database, fields of type date do indeed have a granularity of 1 day (time is not stored), but in these instances I am not using an array.
--
Additional information
I am running PostgreSQL 14.2 inside a Docker container (psql is also running inside the container)
The type of the column in Beekeeper Studio shows as _date, but you can see the ORM code above that was used to create the field with type date[], so I assume _date is just another notation for the same. (It is: PostgreSQL names array types with a leading underscore in its catalogs.)
In psql, running \d opening_times shows that the column has a type of date[].
The result of select array['2022-07-28'::date] is ["2022-07-27T23:00:00.000Z"] when run in Beekeeper Studio. When the same query is run in psql I get {2022-07-28}. When I run it in Lucid ORM:
const db = await Database.rawQuery("select array['2022-07-28'::date]")
console.log(db.rows)
I get this: [ { array: [ 2022-07-27T23:00:00.000Z ] } ].

Use the moment library to format the Date objects the driver returns back into date-only strings.
Example:
moment(date).format("YYYY-MM-DD") // date: one element of the returned array
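Alternatively, the conversion can be avoided altogether. The psql result {2022-07-28} shows the server is storing a plain date with no time component; the ISO timestamp appears only in the JavaScript clients, whose drivers parse each date element into a Date object in the local timezone (hence the shift to 23:00 UTC of the previous day). A minimal server-side workaround, assuming you can adjust the query, is to cast the array to text[] so the driver hands back plain 'YYYY-MM-DD' strings; node-postgres also lets you override its type parsers if you prefer a driver-level fix.
-- text[] reaches JavaScript as an array of plain strings, untouched
select closed_days::text[] from opening_times;
-- the earlier test query with the same cast; the ORM now logs [ '2022-07-28' ]
select array['2022-07-28'::date]::text[];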

Related

Date fields not getting loaded from source/expression to target while using odbc connection

I have columns a (int), b (varchar), and c (timestamp) in my table x (Redshift), and I am trying to load those three columns into another table b (Redshift) using a mapping m1 that uses ODBC connections.
The issue is that I am able to load the data for all the columns except the date field (c timestamp), whether it comes from the source or from an expression.
Null values are being stored in place of the dates.
The mapping ran successfully without any issue/warning.
Note: I am using ODBC connections because I need to call a stored procedure in the Post SQL.
Thanks for your response.
I resolved the issue by modifying the Lookup transformation.
In the lookup mapping, I changed the Multiple Matches property to Return All Rows, and the date fields are now loaded from the source/expression transformation to the target.

Hive - the correct way to permanently change the date format and type in the entire column

I would be grateful if someone could explain here step by step what the process of changing the date format and column type from string to date should look like in the table imported via Hive View to HDP 2.6.5.
The data source is the well-known MovieLens 100K Dataset set ('u.item' file) from:
https://grouplens.org/datasets/movielens/100k/
$ hive --version is: 1.2.1000.2.6.5.0-292
Date format for the column is: '01-Jan-1995'
Data type of column is: 'string'
ACID Transactions are 'On'
Ultimately, I would like to permanently convert the data in the entire column to the standard Hive format 'yyyy-MM-dd', and then change the column type to 'date'.
I have looked at over a dozen threads regarding similar questions before. Of course, the problem is not displaying the column like this; that can easily be done with just:
SELECT from_unixtime(unix_timestamp(prod_date,'dd-MMM-yyyy'),'yyyy-MM-dd') FROM moviesnames;
The problem is to finally write it down this way. Unfortunately, this cannot be done via UPDATE in the following way (Hive does not allow a subquery in the SET clause), despite ACID transactions being enabled in the Hive config.
UPDATE moviesnames SET prodate = (select to_date(from_unixtime(UNIX_TIMESTAMP(prod_date,'dd-MMM-yyyy'))) from moviesnames);
What's the easiest way to achieve the above using Hive-SQL? By copying and transforming a column or an entire table?
Try this:
UPDATE moviesnames SET prodate = to_date(from_unixtime(UNIX_TIMESTAMP(prod_date,'dd-MMM-yyyy')));
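That rewrites the values in place, but the column itself stays a string (an UPDATE cannot change a column's type). To also end up with a real date column, the usual route is to copy the table with the conversion applied and swap it in. A sketch of that route, with movie_id and movie_title as hypothetical stand-ins for the table's other columns:
-- 1. copy the table, converting the column (CAST of a 'yyyy-MM-dd' string to date works in Hive)
CREATE TABLE moviesnames_converted AS
SELECT
  movie_id,    -- hypothetical: carry the remaining columns over as-is
  movie_title, -- hypothetical
  CAST(from_unixtime(unix_timestamp(prod_date, 'dd-MMM-yyyy'), 'yyyy-MM-dd') AS date) AS prod_date
FROM moviesnames;
-- 2. swap the tables
DROP TABLE moviesnames;
ALTER TABLE moviesnames_converted RENAME TO moviesnames;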

Unable to create db2 table with DATE data type

I am using DB2 9.7 (LUW) on a Windows server, on which multiple DBs are available in a single DB instance. I just found that in one of these DBs, I am unable to add a column with the DATE data type during table creation or altering. The column being added gets changed to timestamp instead.
Any help on this will be welcome.
Check out your Oracle compatibility setting.
Depending on that setting, DATE is interpreted as TIMESTAMP(0), as in your example.
Because the setting only takes effect for databases created after the DB2_COMPATIBILITY_VECTOR registry variable was set, databases on the same instance can show different behaviour.
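If you want to confirm which interpretation a particular database is using, one quick check is to create a throwaway table and look it up in the catalog; a sketch, using a scratch table name:
-- under Oracle compatibility mode, TYPENAME comes back as TIMESTAMP rather than DATE
CREATE TABLE date_check (d DATE);
SELECT colname, typename FROM syscat.columns WHERE tabname = 'DATE_CHECK';
DROP TABLE date_check;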

Insert yyyyMMdd string into date column using Talend

I have the following situation:
A PostgreSQL database with a table that contains a date type column called date.
A string from a delimited .txt file outputting: 20170101.
I want to insert the string into the date type column.
So far I have tried the following, with mixed results/errors:
row1.YYYYMMDD
Detail Message: Type mismatch: cannot convert from String to Date
Explanation: This one is fairly obvious.
TalendDate.parseDate("yyyyMMdd",row1.YYYYMMDD)
Batch entry 0 INSERT INTO "data" ("location_id","date","avg_winddirection","avg_windspeed","avg_temperature","min_temperature","max_temperature","total_hours_sun","avg_precipitation") VALUES (209,2017-01-01 00:00:00.000000 +01:00:00,207,7.7,NULL,NULL,NULL,NULL,NULL) was aborted. Call getNextException to see the cause.
You can see the string was parsed into "2017-01-01 00:00:00.000000 +01:00:00".
When I try to execute the query directly i get a "SQL Error: 42601: ERROR: Syntax error at "00" position 194"
Other observations/attempts:
The funny thing is that if I use '20170101' as a string in the query, it works; see below.
INSERT INTO "data" ("location_id","date","avg_winddirection","avg_windspeed","avg_temperature","min_temperature","max_temperature","total_hours_sun","avg_precipitation") VALUES (209,'20170101',207,7.7,NULL,NULL,NULL,NULL,NULL)
I've also tried to change the schema of the database date column to string. It produces the following:
Batch entry 0 INSERT INTO "data" ("location_id","date","avg_winddirection","avg_windspeed","avg_temperature","min_temperature","max_temperature","total_hours_sun","avg_precipitation") VALUES (209,20170101,207,7.7,NULL,NULL,NULL,NULL,NULL) was aborted. Call getNextException to see the cause.
This query also doesn't work directly because the date isn't between single quotes.
What am I missing or not doing?
(I started learning Talend 2-3 days ago.)
EDIT//
Screenshots of my Job and tMap
http://imgur.com/a/kSFd0
EDIT// It doesn't appear to be a date formatting problem but a Talend-to-PostgreSQL connection problem
EDIT//
FIXED: It was a stupidly easy problem/solution, of course. The database name and schema name fields were empty... so it basically didn't know where to connect
You don't have to do anything special to insert a string like 20170101 into a date column. PostgreSQL will handle it for you; it's just the ISO 8601 basic date format.
CREATE TABLE foo ( x date );
INSERT INTO foo (x) VALUES ( '20170101' );
This is just a talend problem, if anything.
[..] (209,2017-01-01 00:00:00.000000 +01:00:00,207,7.7,NULL,NULL,NULL,NULL,NULL)[..]
If Talend doesn't know by itself that a timestamp passed into the query has to be single-quoted, then, if possible, you need to quote it yourself.
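For reference, this is what the failing statement would look like with the value properly quoted; PostgreSQL casts the quoted literal to date, truncating the time part:
INSERT INTO "data" ("location_id","date","avg_winddirection","avg_windspeed","avg_temperature","min_temperature","max_temperature","total_hours_sun","avg_precipitation") VALUES (209,'2017-01-01 00:00:00',207,7.7,NULL,NULL,NULL,NULL,NULL);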
FIXED: It was a stupidly easy problem/solution, of course. The database name and schema name fields were empty, so it basically didn't know where to connect. That's why I got the Batch entry 0 error, and when I dug deeper while debugging, I found it couldn't find the table, stating the relation didn't exist.
Try it like this:
The data in the input file is 20170101 (in string format); parse it into a Date in the tMap.
(The tMap configuration and the resulting output were shown as screenshots in the original answer.)

Using the Time data type in Postgres with a Microsoft Access Front-end

I have a field in my postgres database using the time (without time zone) data type. I have a Microsoft Access front-end for the database connected using psqlODBC, which reads this field as a "Date/Time" data type.
If I try to insert something into the field through the front end, I get the following error:
ODBC - insert on a linked table "table_name" failed.
ERROR: column "column_name" is of type time without time zone but expression is of type date;
I'm assuming that Access is trying to insert a timestamp instead.
Basically, my question is: is it even possible to use the time data type with Access, or should I just be using the timestamp data type instead?
If you are manually typing data into a linked table, then no, this won't be possible at present. If you have the option of updating your table via forms or VBA, you could try the following to get Access to produce only a time value:
TimeSerial(Hour(Now()), Minute(Now()), Second(Now()))
Otherwise as you say, it's probably a good idea to change your data type to timestamp.
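If you do go the timestamp route, the column can be converted in place on the PostgreSQL side; a sketch, reusing the placeholder names from the error message above:
-- date + time yields a timestamp in PostgreSQL; current_date attaches today's date to each stored time
ALTER TABLE table_name
  ALTER COLUMN column_name TYPE timestamp
  USING current_date + column_name;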