How to enter dates in PostgreSQL - postgresql

I'm trying to create a table to import data from Kaggle (https://www.kaggle.com/sulianova/cardiovascular-disease-dataset) using SQL Shell. I'm having problems with the date import.
I've altered the date to the correct format yyyy-mm-dd in Excel and saved it as .csv, but when I try to copy the data in (using https://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/ as a guide) it's recognising it as an integer. How can I overcome this?
I know I can enter a date in inverted commas, but I can't do that manually for 70K entries.

I usually use 'yyyy-MM-ddThh:mm:ssZ' and it works fine for timestamps. Be careful with the 24-hour and 12-hour formats.
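For the original question, a minimal sketch of the usual approach: declare the column as DATE in the table definition and let COPY (or psql's \copy) cast the plain yyyy-mm-dd text on import. The table and column names below are made up for illustration, not the actual Kaggle schema, and the delimiter/header options depend on how Excel exported the file:

-- Illustrative only: table and column names are assumptions, not the real Kaggle schema
CREATE TABLE cardio (
    id        integer,
    exam_date date,      -- declared as DATE, so yyyy-mm-dd text is cast on import
    gender    integer
);

-- Run from psql (SQL Shell); adjust the file path and CSV options to match your export
\copy cardio FROM 'cardio.csv' WITH (FORMAT csv, HEADER true)

Because the column type is date, there is no need to put each value in inverted commas; COPY casts every field to the declared column type as it loads.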

Related

Reformat Unix timestamp on MongoImport with --columnsHaveTypes?

I'm trying to import data from a large CSV into MongoDB using MongoImport. The date column type is giving me problems.
The CSV has the date in epoch time, but I want it saved into MongoDB as a normal date type.
I'm using --columnsHaveTypes with --fieldFile, but I can't figure out or find any answers anywhere for how to convert the date format on import.
The documentation makes it seem like I can use the Go language format of columnName.date(1136239445) (that's the reference time in the documentation), but that won't work. And I can't find any help on date_ms(<arg>) or date_oracle(<arg>).
As much as possible, this needs to be a hands-off operation because the large DB dump (SQLite3 format) will be automatically converted to CSV and imported to MongoDB without human input.
Thanks in advance!

Having an issue with ascertaining the format of a date

I've been tasked with getting data from an existing database table and transferring it into another. Everything is fine except the date in the original table is in a format that I don't recognise. It looks suspiciously like a Unix timestamp, but when I convert it, it seems to come out as the year 2727 or something.
Here is an example of what's in the existing table: 1424786878240
The matching date for this on the front end of the site is 24th February 2015. I cannot seem to find any correlation between this and the number in the database - and since I have no access to the original site code I am unable to determine how it's being converted.
If anyone recognises this date format / structure I would appreciate some help.
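For what it's worth, that value looks like milliseconds since the Unix epoch rather than seconds, which is why treating it as seconds pushes the result far into the future. A quick check, shown here in PostgreSQL purely as an illustration, is to divide by 1000 before converting:

SELECT to_timestamp(1424786878240 / 1000.0);   -- 2015-02-24, matching the front-end date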

Tableau cannot recognize timestamp field in my log file

I am using Tableau 9.3 to do a preliminary data analysis on one of my log files; the log file looks like this:
"199.72.81.55",01/Jul/1995:00:00:01,/history/apollo/,200,6245,Sat
As you can see, there is a datetime value for the timestamp.
In Tableau, it is initially recognized as a string.
That's fine; I want to convert the field into a datetime, but Tableau seems to fail at it.
Why? How do I fix it?
Thank you very much.
UPDATED: after applying the formula suggested below, Tableau still cannot recognize the timestamp.
UPDATED AGAIN: after testing by Nick, it is confirmed that his first script is correct and works in his Tableau. Why it fails on mine, I don't know; you are welcome to share any clues, thank you.
Tableau implicit conversions are limited to more standard formats. You can still create a DATETIME field from your timestamp string using a calculated field with the following formula:
DATEPARSE('dd/MMM/yyyy:HH:mm:ss',[timestamp])
Using the above will transform a string like 01/Jul/1995:00:00:01 to a date and time of 7/1/1995 12:00:01 AM
Sometimes the "date parse" function in Tableau doesn't quite do the job.
When this happens it is worth testing manual string manipulation of your timestamp field to put it into ISO-standard format, and only then trying to convert it into a date. ISO format is yyyy-mm-dd hh:mm:ss (e.g. 2012-02-28 13:04:30). It is common to find that the original string has spurious characters or spaces that throw off DATEPARSE, but these are usually easy to strip out with suitable text manipulations. This can sometimes be long-winded, but it always works.
It turned out to be a region setting issue; it works after I switched the region to USA.

Datastage String Conversion to Timestamp

I have been tasked with converting a string which can come in different formats (mm/dd/yyyy, m/dd/yyyy, or mm/d/yyyy). It needs to be converted into a timestamp with the following format (yyyy-mm-dd-00.00.000000). I have tried multiple conversion techniques within the Transformer stage; however, I have been unsuccessful. Basically, I pull the data from a file and stage it into a file in the same format as the table, then insert into the database using the second file.
The main issue I was running into was the fact that the format could be mm/dd/yyyy or m/d/yyyy. The solution was to add a ",s" to the month and day tokens in the format string of my StringToTimestamp function:
StringToTimestamp(Input, "%(m,s)/%(d,s)/%yyyy %hh:%nn:%ss")

Issues with importing data in date format from Excel to Oracle 10g?

Hello, I am trying to import date-formatted data from Excel into an Oracle 10g database using SQL Developer. Earlier I had no idea how, but after reviewing various questions on Stack Overflow I came to know that it can be done with SQL Developer. Here is the process: I right-clicked on the table in SQL Developer, selected the column names, and specified a DATE format for the columns that hold dates, but I did not succeed. At last I verified whether all the columns show success in their status; they all do, but after finishing, the error is "The date should be between 1 and the last day of the month". Can anyone let me know how to fix it, please?
Sometimes it is caused by the format in your Excel file. Make sure that your Excel file has the right date format, like dd-mm-rr.
You can try the Navicat software to import the Excel file into Oracle.
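That error usually means the format mask chosen in the import wizard does not line up with how the dates are actually written in the spreadsheet. As a quick, purely illustrative sanity check (the sample values below are made up), you can run TO_DATE on one value with an explicit format mask before re-running the import:

-- Illustrative sample values; swap the mask until it matches how Excel wrote the dates
SELECT TO_DATE('28-02-15', 'DD-MM-RR') FROM dual;        -- matches the dd-mm-rr suggestion above
SELECT TO_DATE('2015-02-28', 'YYYY-MM-DD') FROM dual;    -- or keep the ISO order and mask

If the sample value fails with your chosen mask, fix either the Excel formatting or the DATE format entered in the SQL Developer import dialog before loading all the rows.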