Create new Date column of DATE type from existing Date column of TEXT type in PostgreSQL - postgresql

I have a PostgreSQL table that has a Date column of type TEXT.
Values look like this: 2019-07-19 00:00
I want to either cast this column to type DATE so I can query based on latest values, etc., or create a new column of type DATE and cast into there (so I have both for the future). I would appreciate advice on both options!
Hope this isn't a dupe, but I haven't found any answers on SO.
Some context: I will need to add more data later on that only has the TEXT column, which is why I want to keep the original, but I'm open to suggestions.

You can alter the column type with the simple command:
alter table my_table alter my_col type date using my_col::date
This seems to be the best solution as maintaining duplicate columns of different types is a potential source of future trouble.
Note that all values in the column have to be null or be recognizable by Postgres as a date, otherwise the conversion will fail.
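Since the stored values carry a time component (2019-07-19 00:00), note that the text-to-date cast simply drops it; a quick sanity check:
select '2019-07-19 00:00'::date;  -- 2019-07-19, the time part is discarded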
However, if you insist on creating a new column, use the update command:
alter table my_table add my_date_col date;
update my_table
set my_date_col = my_col::date;
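If you do keep both columns, the new rows you plan to load into the TEXT column will leave the DATE column stale. One way to keep them in sync is a trigger; a minimal sketch with illustrative function and trigger names, reusing the column names from above:
create or replace function sync_my_date_col() returns trigger
language plpgsql as $$
begin
  -- derive the DATE column from the TEXT column on every write
  new.my_date_col := new.my_col::date;
  return new;
end;
$$;

create trigger my_table_sync_date
before insert or update of my_col on my_table
for each row execute function sync_my_date_col();
-- execute function needs PostgreSQL 11+; older versions spell it execute procedure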

Related

Is there a way to convert a varchar column in dd-mm-yyyy format to a date column in yyyy-mm-dd format in postgresql?

I am working on PostgreSQL.
I have a column named curr_date in my table. The data type previously assigned to it is varchar, but the column stores dates in the format dd-mm-yyyy.
Now I want to change its data type to date, but to do that I first have to convert all the values in the column from dd-mm-yyyy format to yyyy-mm-dd format.
Only then can I use the query alter table alter column curr_date type date using curr_date::date;
So is there a way to convert this format? I am open to using a dummy column to make the changes too.
You can do that in a single statement:
ALTER TABLE mytable
ALTER col TYPE date USING to_date(col, 'DD-MM-YYYY');
That will explicitly convert the data from the old format to the new format.
A change like this will cause the table to be rewritten, which can take a while if the table is large. During that time, the table is inaccessible even for SELECT statements.
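Because the rewrite aborts on the first value that to_date() cannot parse, it can be worth scanning for stragglers beforehand. A rough pre-check using the placeholder names from the answer (the regex encodes the assumed DD-MM-YYYY layout):
select col
from mytable
where col !~ '^\d{2}-\d{2}-\d{4}$';  -- rows that don't match the expected shape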

How to add default value 0000-00-00 to date datatype in postgresql?

I previously changed the data type of my column using the command below, but now I want to add a default value like 0000-00-00 for the same column. Can anyone help me?
alter table table_name alter column column_name type date using (column_name::date)
You have an issue if the column is a date because 0000-00-00 is not a valid date. The syntax for setting the default is:
alter table t alter column col set default '0001-01-01';
However, you need a valid date for that. I would recommend just using NULL if that works for your application.
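To see why the original sentinel cannot work: Postgres validates dates against a real calendar, in which year 0, month 0 and day 0 do not exist.
select '0000-00-00'::date;  -- fails with a "date/time field value out of range" error
select '0001-01-01'::date;  -- valid, hence its use as a stand-in sentinel above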

Alter a Column from INTEGER To BIGINT

In my database I have several fields with INTEGER Type. I need to change some of them to BIGINT.
So my question is, can I just use the following command?
ALTER TABLE MyTable ALTER COLUMN MyIntegerColumn TYPE BIGINT;
Will the contained data be converted the correct way? After the conversion, is this column a "real" BIGINT column?
I know this is not possible if there are constraints on the column (trigger, foreign key, ...). But if there are no constraints, is it possible to do it this way?
Or is it better to convert it by a Help-Column:
MyIntegerColumn -> MyIntegerColumnBac -> MyBigIntColumn
When you execute
ALTER TABLE MyTable ALTER COLUMN MyIntegerColumn TYPE BIGINT;
Firebird will not convert existing data from INTEGER to BIGINT, instead it will create a new format version for the table.
When inserting new rows or updating existing rows, the value will be stored as a BIGINT, but when reading Firebird will convert 'old' rows on the fly from INTEGER to BIGINT. This happens transparently for you as the user. This is to prevent needing to rewrite all existing rows, which could be costly (IO, garbage collection of old versions of rows, etc).
So please, do use ALTER TABLE .. ALTER COLUMN; do not do MyIntegerColumn -> MyIntegerColumnBac -> MyBigIntColumn. There are some exceptions to this rule, e.g. (potentially) lossy character set transformations are better done that way to prevent transliteration errors on select if a character does not exist in the new character set, or changing a (var)char column to be shorter (which can't be done with alter column).
To be a little more specific: when a row is written in the database it contains a format version (aka version count) of that row. The format version points to a description of a row (datatypes, etc) how Firebird should read that row. An alter table will create a new format version, and that format will be applied when writing new rows or updating existing rows. When reading an old row, Firebird will apply necessary transformation to present that row as the new format (for example adding new columns with their default values, transforming a data type of a column).
These format versions are also the reason why the number of alter tables is restricted: the format version is a single byte, so after more than 255 alter tables on a single table you must back up and restore the database before further changes are allowed to that table.
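If you ever do want the old rows physically rewritten in the new format (normally unnecessary, since the on-the-fly conversion is transparent), a no-op update touches every row; a sketch:
UPDATE MyTable SET MyIntegerColumn = MyIntegerColumn;
-- each updated row is written back using the newest format version;
-- this does not reset the 255-format counter, only backup/restore does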

Importing csv into Postgres database with improper date value

I have a query which has a date field with values that look like this in the query results window:
2013-10-01 00:00:00
However, when I save the results to csv, it gets saved like this:
2013-10-01T00:00:00
This is causing a problem when I'm trying to COPY the csv into a table in Redshift, where it gives me an error stating that the value is not a valid timestamp (the field I'm importing to is a timestamp field).
How can I get it so that it either strips out the time component completely, leaving just the date, or at least that the "T" is removed from the results?
I'm exporting results to csv using Aginity SQL Workbench for Redshift.
According to this knowledgebase article:
After import, add new TIMESTAMP columns and use the CAST() function to populate them:
ALTER TABLE events ADD COLUMN received_at TIMESTAMP DEFAULT NULL;
UPDATE events SET received_at = CAST(received_at_raw as timestamp);
ALTER TABLE events ADD COLUMN generated_at TIMESTAMP DEFAULT NULL;
UPDATE events SET generated_at = CAST(generated_at_raw as timestamp);
Finally, if you foresee no more imports to this table, the raw VARCHAR timestamp columns may be removed. If you foresee importing more events from S3, do not remove these columns. To remove the columns, run:
ALTER TABLE events DROP COLUMN received_at_raw;
ALTER TABLE events DROP COLUMN generated_at_raw;
Hope that helps...
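Alternatively, you can make COPY accept the ISO-8601 "T" directly instead of re-casting after the fact; Redshift's TIMEFORMAT option handles it. A sketch with a placeholder S3 path and IAM role:
COPY events
FROM 's3://my-bucket/export.csv'          -- placeholder path
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'  -- placeholder role
CSV
TIMEFORMAT 'auto';  -- 'auto' recognizes values like 2013-10-01T00:00:00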

Permanently convert datetime to date

I have a postgres database with a datetime field. I'd like to permanently convert it to a date field (i.e. remove the time element). What is the best way to go about that? Is there a way to do it in place without having to dump the table into a new table?
Assuming that by "datetime" you mean a timestamp:
ALTER TABLE foo ALTER COLUMN bar TYPE date;
Alternatively, create a new temporary column, fill it with UPDATE yourtable SET newfield = oldfield::date (or the appropriate conversion function), then drop the old column and rename the new one.
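Spelled out, that second route might look like the following, using the placeholder names from the answer:
ALTER TABLE yourtable ADD COLUMN newfield date;
UPDATE yourtable SET newfield = oldfield::date;  -- timestamp-to-date cast drops the time
ALTER TABLE yourtable DROP COLUMN oldfield;
ALTER TABLE yourtable RENAME COLUMN newfield TO oldfield;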