I have the following Command table in a PostgreSQL 9.3 database:
Command
------------------------------------------------------------
 id               | purchased_date                        | ...
 integer NOT NULL | timestamp without time zone NOT NULL  | ...
Now I want to fill my table with data from a CSV file. My CSV file contains for example the following values:
1,"2016-04-18 09:37:30"
2,"2016-04-17 09:37:30"
...
When I do \i 'my_csv_file.csv' it works great. What I want now is a CSV file with dynamic dates, so that I don't have to regenerate the file every time I reload my database (this is for test purposes). I would like to have something like:
1,"CURRENT_DATETIME - INTERVAL '1 DAY'"
2,"CURRENT_DATETIME - INTERVAL '2 DAY'"
But when I execute the same command \i 'my_csv_file.csv', I get the error ERROR: date/time value "current" is no longer supported. Is it possible to do what I want?
You can use a temporary table with offsets, then insert from it.
CSV:
1,"1 day"
2,"2 days"
Queries:
create temporary table tmp_table (
id int,
offset_ interval
);
Import the CSV into this table, then:
insert into main_table (id, purchased_date)
select id, now() - offset_ from tmp_table;
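The import step itself isn't shown above; a minimal sketch using \copy, assuming the file is named my_csv_file.csv and sits in the current directory:
\copy tmp_table (id, offset_) FROM 'my_csv_file.csv' CSV
Note that the offset values have to be in a form the interval type's input accepts (such as 1 day), because COPY parses them with the type's input function rather than as SQL expressions.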
Related
I have a table with millions of rows.
It has a column called time_from_device (which is of type timestamp with time zone).
id | name | t1 | t2 | t3 | time_from_device |
---------------------------------------------
Now I want to add a column called created_at whose value will be now().
But before I set the default value of created_at to now(), I want to fill the existing rows' created_at with time_from_device - INTERVAL '1 hour'.
So I am doing the following
ALTER TABLE devicedata ADD COLUMN "created_at" timestamp with time zone;
This creates a new column created_at with NULL values
Now I want to fill the column with values from time_from_device - INTERVAL '1 hour':
UPDATE devicedata SET created_at = time_from_device - INTERVAL '1 hour';
Since there are millions of rows, this command just hangs.
How can I know whether it's working or not?
The way you are doing this is correct. Just be patient.
One potential problem could be row locks by concurrent long running transactions. Make sure that there are no such transactions.
You can examine the wait_event in the pg_stat_activity row of the corresponding session: if that is NULL, your query is happily working.
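For example, something along these lines (the LIKE filter is just one way to find the right session; adjust it to match your statement, and note that wait_event is available from PostgreSQL 9.6 on):
-- find the backend running the UPDATE and see whether it is waiting on anything
SELECT pid, state, wait_event_type, wait_event, query
FROM pg_stat_activity
WHERE query LIKE 'UPDATE devicedata%';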
To speed up the operation, you could drop all indexes and constraints before updating the table and then create them again – your operation will probably require down time anyway.
The size of a transaction is no performance problem in PostgreSQL.
I have data like the following, but the column type is varchar:
2019-09-28T23:59:59.52Z
I assume 52 here is milliseconds. If so, I would like to convert it as follows and change the column type to timestamp:
2019-09-28 23:59:59.52
Can someone let me know how I can convert this in PostgreSQL?
EDIT:
I can see the data in the table as (since the column type is varchar):
2019-09-28T23:59:59.52Z
Instead, I want data in the table to be shown as:
2019-09-28 23:59:59 (and maybe .52, if possible)
I need to change the column type to timestamp as well, I guess. Please help with that too.
Answer:
Tim has provided a solution; you can follow that.
But in my case it is a prod environment, so I have just changed the type using:
ALTER TABLE my_table ALTER COLUMN my_column TYPE TIMESTAMP USING my_column::timestamp without time zone;
Thanks
Your timestamp string literal is already in a format which can be directly cast in Postgres:
SELECT '2019-09-28T23:59:59.52Z'::timestamp; -- 2019-09-28 23:59:59.52
As a test, let's add one day to make sure it's working:
SELECT '2019-09-28T23:59:59.52Z'::timestamp + interval '1 day';
-- 2019-09-29 23:59:59.52
If you want to actually add a new timestamp column using string data from another column, then try:
ALTER TABLE yourTable ADD COLUMN new_ts TIMESTAMP;
UPDATE yourTable SET new_ts = old_ts::timestamp;
ALTER TABLE yourTable DROP COLUMN old_ts;
ALTER TABLE yourTable RENAME COLUMN new_ts TO old_ts; -- optional
The last ALTER statement is optional; use it if you want the new bona fide timestamp column to bear the same name as the old text timestamp column.
I want to put some sqoop commands in oozie to be executed every day and fetch data for the previous date:
The table has a column date_prof and it has values like:
2020-09-02 05:03:02
2021-02-19 06:04:15
2021-02-10 19:05:20
etc...
Because it's a timestamp, I am trying to keep only the yyyy-MM-dd part to get just the date, so my query inside sqoop is like:
select * from table date_prof like 'from_uixtime(date_sub(current_date,1),'yyyy-MM-dd')%'
But because of the '' around the function, it is read as a string.
Convert date_prof to date:
select * from table where date(date_prof) = date_sub(current_date,1)
I'm trying to convert the column and preserve the data in the specified time zone. I have a script, but it errors out on the SELECT at the end.
ALTER TABLE schema.table
ALTER COLUMN column_date TYPE TIMESTAMP WITH TIME ZONE
USING column_date AT TIME ZONE (SELECT value FROM schema.table WHERE id = 'timezone');
ERROR: cannot use subquery in transform expression
I have time zones stored, but trying to pull that back to apply to the script is posing a challenge. I know I can simply hardcode the timezone as 'EST|CST|PST', etc. but I have multiple databases this needs to be applied to (with multiple alters per DB), hence the required SELECT at the end. Is there a way to accomplish this?
I've pored over a few questions on this subject; while some came close, they don't quite meet my needs (my apologies if this is a duplicate, I've spent a while searching for an answer).
[RESOLVED]: Using 2 separate queries
ALTER TABLE schema.table
ALTER COLUMN column_date TYPE TIMESTAMP WITH TIME ZONE;
UPDATE schema.table
SET column_date = column_date AT TIME ZONE
(SELECT value FROM schema.table WHERE id = 'timezone');
You can use dynamic SQL:
DO
$$BEGIN
EXECUTE format('ALTER TABLE ... AT TIME ZONE %L',
(SELECT value FROM atable ...));
END;$$;
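Filled in with the names from the question, the block might look roughly like this (assuming the time zone value really is fetched with SELECT value FROM schema.table WHERE id = 'timezone', as in the question):
DO
$$BEGIN
   -- %L quotes the looked-up time zone name as a literal inside the ALTER statement
   EXECUTE format(
      'ALTER TABLE schema.table
          ALTER COLUMN column_date TYPE TIMESTAMP WITH TIME ZONE
          USING column_date AT TIME ZONE %L',
      (SELECT value FROM schema.table WHERE id = 'timezone'));
END;$$;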
Hi, as the title says, I have a column named healthtime in my table which is of type timestamp without time zone, i.e.:
Healthtime
2012-02-02 08:15:00
I would like to split this into two new columns in the same table, one date column and one time column i.e.
Date Time
2012-02-02 08:15:00
Please can someone advise me how to do this (preferably in a single query),
Thanks,
James
Not one query, but one transaction... should be fine as well:
My table looks like:
CREATE TABLE d (t timestamp);
Code for changing the table:
BEGIN;
ALTER TABLE d ADD COLUMN dat date;
ALTER TABLE d ADD COLUMN tim time;
UPDATE d
SET
dat = t::date,
tim = t::time;
ALTER TABLE d DROP COLUMN t;
COMMIT;
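If you want a quick sanity check afterwards, the two new columns should recombine to the original timestamp value:
-- date + time yields a timestamp, so this should reproduce the old t values
SELECT dat, tim, dat + tim AS recombined FROM d LIMIT 5;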
SELECT healthtime::date AS date FROM tbl_yourtable;
SELECT healthtime::time(0) AS time FROM tbl_yourtable;
PostgreSQL Version: >= 10
select fake_table.healthtime
, cast(to_char(fake_table.healthtime, 'YYYY-MM-DD') as date) as "Date"
, cast(to_char(fake_table.healthtime, 'HH24:MI:SS') as time) as "Time"
from (select current_timestamp as "healthtime") as "fake_table"
output:
healthtime                  Date        Time
timestamp with time zone    date        time without time zone
==========================  ==========  ======================
2022-04-22 14:20:25.678-04  2022-04-22  14:20:25
tested on PostgreSQL 9.3.25, compiled by Visual C++ build 1600, 64-bit running on Windows 8.1