How to upload data including a "timestamp" column in Postgres - postgresql

I am using Postgres, but every time I try to upload data that includes a timestamp-typed column, Postgres gives me this error message:
ERROR: date/time field value out of range: "21/04/2020 18:19"
HINT: Perhaps you need a different "datestyle" setting.
CONTEXT: COPY get_safe, line 2, column joined_getsafe_at: "21/04/2020 18:19"
How can I successfully upload data that includes a timestamp column in Postgres?
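The error means the server is parsing "21/04/2020" as month/day/year. One option is to run SET datestyle = 'ISO, DMY'; in the same session before the COPY; another is to rewrite the values to ISO 8601 before loading, which works regardless of the datestyle setting. A minimal sketch of the rewrite in JavaScript (the exact input pattern DD/MM/YYYY HH:mm is taken from the error message; anything else here is an assumption):

```javascript
// Convert "21/04/2020 18:19" (DD/MM/YYYY HH:mm) into the ISO form
// "2020-04-21 18:19:00", which Postgres accepts under any datestyle.
function toIsoTimestamp(value) {
  const m = value.match(/^(\d{2})\/(\d{2})\/(\d{4}) (\d{2}):(\d{2})$/);
  if (!m) throw new Error(`Unexpected timestamp format: ${value}`);
  const [, day, month, year, hour, minute] = m;
  return `${year}-${month}-${day} ${hour}:${minute}:00`;
}

console.log(toIsoTimestamp("21/04/2020 18:19")); // -> "2020-04-21 18:19:00"
```

You would map this function over the joined_getsafe_at column of the CSV before running COPY.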

Related

Azure Data Factory Copy Activity Pipeline Destination Mapping String Format Date to Sql Date column Warning

I am running a copy activity to load data from Azure Data Factory to an on-premises SQL table.
In the copy activity's column mapping, there is a warning that the source column is a string with a date-and-time value (2022-09-13 12:53:28), so I created the target SQL table column with a date data type.
When importing the mapping in the copy activity, whichever date column I map in SQL, ADF throws this warning. Kindly advise how we can resolve it.
The warning just indicates that the copy activity will truncate source column data when a column value carries more information than the target column can hold. There will not be an error in this case, but there might be data loss.
In your case, since the column value is 2022-09-13 12:53:28, it will be inserted into the datetime column without any truncation.
The following is a demonstration where I try to insert the following source data:
id,first_name,date
1,Wenona,2022-09-13 12:53:28
2,Erhard,2022-09-13 13:53:28
3,Imelda,2022-09-13 14:53:28
The copy activity runs successfully and inserts the data into the target table.
When I insert the following data, however, the values are truncated to keep only two fractional-second digits.
id,first_name,date
1,Wenona,2022-09-13 12:53:28.11111
2,Erhard,2022-09-13 13:53:28.11111
3,Imelda,2022-09-13 14:53:28.11111
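The truncation above comes from the precision of the target column, not from ADF itself: a column such as datetime2(2) keeps only two fractional digits, so 12:53:28.11111 becomes 12:53:28.11. A small sketch of that behavior (the precision of 2 is an assumption about the target column type):

```javascript
// Truncate the fractional-seconds part of a timestamp string to a given
// number of digits, mimicking what happens when the value lands in a
// column with limited fractional precision (e.g. datetime2(2)).
function truncateFractionalSeconds(ts, digits) {
  const [base, frac = ""] = ts.split(".");
  if (digits === 0 || frac === "") return base;
  return `${base}.${frac.slice(0, digits)}`;
}

console.log(truncateFractionalSeconds("2022-09-13 12:53:28.11111", 2));
// -> "2022-09-13 12:53:28.11"
```

Values without a fractional part pass through unchanged, which matches the first demonstration above.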

Storing an array of dates (without times)

I'm trying to store dates (without times) in a PostgreSQL array.
For context, let's say these are dates on which a supermarket will be closed:
['2022-12-24', '2022-12-25', '2022-12-26']
The table is called opening_times and there is a column called closed_days which has a type of date[]:
table.specificType('closed_days', 'date[]').defaultTo([]).notNullable()
However, when I UPDATE this field using SET closed_days = '{"2022-10-16"}', it seems PostgreSQL is converting it into a full ISO date and time string like this: ["2022-10-15T23:00:00.000Z"]
According to the PostgreSQL docs (section 8.5), the date type is supposed to have a resolution of 1 day, but somehow it is still storing the time. Elsewhere in the database, fields of type date do indeed have a granularity of 1 day (time is not stored), but in these instances I am not using an array.
--
Additional information
I am running PostgreSQL 14.2 inside a Docker container (psql is also running inside the container)
The type of the column in Beekeeper Studio shows as _date, but you can see from the ORM code above that the field was created with type date[], so I assume _date is just another notation for the same thing.
In psql, running \d opening_times shows that the column has a type of date[].
The result of select array['2022-07-28'::date] is ["2022-07-27T23:00:00.000Z"] when run in Beekeeper Studio. When the same query is run in psql I get {2022-07-28}. When I run it in Lucid ORM:
const db = await Database.rawQuery("select array['2022-07-28'::date]")
console.log(db.rows)
I get this: [ { array: [ 2022-07-27T23:00:00.000Z ] } ].
Use the moment library to format the value as a plain date string before it reaches the database driver.
Example:
moment(new Date()).format("YYYY-MM-DD")
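The differing results above (psql shows {2022-07-28}, the clients show 2022-07-27T23:00:00.000Z) suggest Postgres is storing the date correctly and the JavaScript client is the culprit: node-postgres parses a date column into a JavaScript Date in the local time zone (here apparently UTC+1, hence 23:00 the previous day in UTC). Assuming the driver underneath Beekeeper Studio / Lucid is node-postgres (pg), one option is to override the parser for the DATE type (OID 1082) so values stay plain strings:

```javascript
// Keep DATE values (including date[] elements) as plain "YYYY-MM-DD"
// strings instead of letting the driver build a local-time JS Date.
const parseDateAsString = (value) => value;

// With node-postgres this would be registered like so (requires the pg
// package, hence commented out in this self-contained sketch):
// const { types } = require("pg");
// types.setTypeParser(1082, parseDateAsString); // 1082 = DATE

console.log(parseDateAsString("2022-07-28")); // -> "2022-07-28"
```

After registering the parser, select array['2022-07-28'::date] should come back as an array of strings rather than Date objects.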

Lift load Dateformat issue from csv file

We are migrating Db2 data to Db2 on Cloud. We are using the Lift CLI operations below for the migration:
First, extracting a database table to a CSV file using lift extract from the source database.
Then, loading the extracted CSV file to Db2 on Cloud using lift load.
ISSUE:
We have created some tables using DDL on the target Db2 on Cloud which have some columns with data type TIMESTAMP.
During the load operation (lift load), we are getting the error below:
"MESSAGE": "The field in row \"2\", column \"8\" which begins with
\"\"2018-08-08-04.35.58.597660\"\" does not match the user specified
DATEFORMAT, TIMEFORMAT, or TIMESTAMPFORMAT. The row will be
rejected.", "SQLCODE": "SQL3191W"
If you use Db2 as the source database, then either:
use the following property during export (to export dates, times, and timestamps as usual for Db2 utilities, without double quotes):
source-database-type=db2
or use the following property during load, if you have already exported timestamps surrounded by double quotes:
timestamp-format="YYYY-MM-DD-HH24.MI.SS.FFFFFF"
If the data was extracted using lift extract then for sure you should load the data with source-database-type=db2. Using this parameter will preconfigure all the necessary load details automatically.

Dash DB support for timestamp with time zone

I want to store a timestamp with time zone in dashDB. I see that Db2 supports the data type TIMESTAMP WITH TIME ZONE for this purpose, but when I try to create a table in dashDB using this type, I get the error below. Is this because dashDB just doesn't support this type, even though Db2 does? Or is there something I am doing wrong? I appreciate the help.
DDL failed with message:
Exception. state = 42601; error code = -104; error message = Error for batch element #1: An unexpected token "time" was found following ""COL2" timestamp with". Expected tokens may include: "REFERENCES". CODE=-104, STATE=42601, DRIVER=3.66.46
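The syntax error suggests the target does not accept the WITH TIME ZONE clause. If that is the case, a common workaround is to normalize values to UTC on the client before inserting them into a plain TIMESTAMP column (optionally keeping the original offset in a second column). A hedged sketch of the normalization step in JavaScript (the offset-carrying input format is an assumption):

```javascript
// Convert a timestamp that carries an explicit UTC offset into a plain
// UTC timestamp string suitable for a TIMESTAMP column (no time zone).
function toUtcTimestamp(isoWithOffset) {
  const d = new Date(isoWithOffset);
  if (Number.isNaN(d.getTime())) throw new Error(`Unparseable: ${isoWithOffset}`);
  // toISOString() always renders in UTC; drop the "T" and trailing "Z".
  return d.toISOString().replace("T", " ").replace("Z", "");
}

console.log(toUtcTimestamp("2018-08-08T04:35:58+05:30"));
// -> "2018-08-07 23:05:58.000"
```

Storing everything in UTC this way keeps comparisons and ordering correct even without a native time-zone-aware column type.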

Using the Time data type in Postgres with a Microsoft Access Front-end

I have a field in my postgres database using the time (without time zone) data type. I have a Microsoft Access front-end for the database connected using psqlODBC, which reads this field as a "Date/Time" data type.
If I try to insert something into the field through the front end, I get the following error:
ODBC - insert on a linked table "table_name" failed.
ERROR: column "column_name" is of type time without time zone but expression is of type date;
I'm assuming that Access is trying to input a timestamp instead.
Basically, my question is: is it even possible to use the time data type with Access? Or should I just use the timestamp data type instead?
If you are manually typing data into a linked table, then no, this won't be possible at present. If you have the option of updating your table via forms or VBA, you could try this to get Access to produce only a time value:
TimeSerial(Hour(Now()), Minute(Now()), Second(Now()))
Otherwise as you say, it's probably a good idea to change your data type to timestamp.