I have a table with the following date column in PostgreSQL, and I need to retrieve all records for one specific id using Slick 2.0. I'm using Joda to manage the dates.
CREATE TABLE MyTable
(
  IdTable int NOT NULL,
  Name varchar(64),
  Created_Date timestamp with time zone DEFAULT now()
);
Then I try to map it in Slick in this way:
val Created_Date: Column[Option[DateTime]] = column[Option[DateTime]]("Created_Date")
As soon as I add the Created_Date column to my table definition, the method to retrieve the records fails. What's the right way to map a timestamp with time zone in Slick using Joda? Any recommendation?
To use a database type with a Scala type, you need to define a type mapper, which is used to convert between the raw database data and the Scala object.
You can define your own using Slick's MappedColumnType, since Timestamp is already supported out of the box by Slick.
Here's the guide for Slick 2.0.3.
But you can also use other existing libraries, like slick-pg or slick-joda-mapper.
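For example, here is a minimal sketch of such a mapper for Joda's DateTime, assuming Slick 2.0 with the Postgres driver and joda-time on the classpath (the table and column names mirror the question; everything else is illustrative):

import java.sql.Timestamp
import org.joda.time.DateTime
import scala.slick.driver.PostgresDriver.simple._

// Convert between Joda DateTime and the Timestamp type Slick already supports.
// Note: java.sql.Timestamp carries no zone, so the zone is not round-tripped;
// values read back are DateTimes in the JVM's default zone.
implicit val jodaDateTimeMapper =
  MappedColumnType.base[DateTime, Timestamp](
    dt => new Timestamp(dt.getMillis), // Scala object -> db raw data
    ts => new DateTime(ts.getTime)     // db raw data -> Scala object
  )

class MyTable(tag: Tag)
    extends Table[(Int, Option[String], Option[DateTime])](tag, "MyTable") {
  def idTable     = column[Int]("IdTable")
  def name        = column[Option[String]]("Name")
  def createdDate = column[Option[DateTime]]("Created_Date")
  def * = (idTable, name, createdDate)
}

With the implicit mapper in scope, createdDate can be filtered and sorted like any other column.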
I've been successful retrieving timestamp with time zone columns in Slick 2.x using:
def timestamp = column[Timestamp]("timestamp", O.Default(null))
I have a PostgreSQL table that has a Date column of type TEXT.
Values look like this: 2019-07-19 00:00
I want to either cast this column to type DATE so I can query based on the latest values, etc., or create a new column of type DATE and cast into there (so I have both for the future). I would appreciate advice on both options!
Hope this isn't a dupe, but I haven't found any answers on SO.
Some context: I will need to add more data later on to the table that only has the TEXT column, which is why I want to keep the original, but I'm open to suggestions.
You can alter the column type with the simple command:
alter table my_table alter my_col type date using my_col::date
This seems to be the best solution as maintaining duplicate columns of different types is a potential source of future trouble.
Note that all values in the column have to be null or be recognizable by Postgres as a date, otherwise the conversion will fail.
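If you want to find offending rows before running the cast, a quick check along these lines can help (my_table/my_col as in the commands here; the pattern only covers the YYYY-MM-DD prefix shown in the question):

select my_col
from my_table
where my_col is not null
  and my_col !~ '^\d{4}-\d{2}-\d{2}';  -- rows that likely won't cast cleanly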
However, if you insist on creating a new column, add it and backfill it with an update:
alter table my_table add my_date_col date;
update my_table
set my_date_col = my_col::date;
I want to create a column with the following definition:
updated_at TIMESTAMP NOT NULL DEFAULT NOW() ON UPDATE now(),
But when I put this in schema.prisma:
updated_at DateTime @updatedAt @db.DateTime(0)
then I obtain a table with the column:
`updated_at` datetime NOT NULL,
How do I add ON UPDATE now() to this column using Prisma? I'm using MySQL/MariaDB.
References:
Related problem: https://github.com/prisma/prisma/issues/5799#issuecomment-894631317
Article about ON UPDATE: https://medium.com/@bengarvey/use-an-updated-at-column-in-your-mysql-table-and-make-it-update-automatically-6bf010873e6a
Based on the client reference, it seems that the client itself is responsible for providing the updated-at DATETIME value, rather than having it done on the DB server the way you want.
If you really need the table/column definition to be exactly as you want, I would suggest creating or modifying the table directly in SQL and then using introspection to import the definition into your schema.
You could even add the definitions to your Prisma migration files, as these are just plain SQL files; a sketch follows. How to do this can be found in the Customizing Migrations article.
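For example, assuming MySQL/MariaDB, the migration could contain something along these lines (the table name MyTable is a placeholder):

-- Let the database maintain updated_at itself.
ALTER TABLE `MyTable`
  MODIFY `updated_at` DATETIME NOT NULL
    DEFAULT CURRENT_TIMESTAMP
    ON UPDATE CURRENT_TIMESTAMP;

Note that Prisma's @updatedAt is applied client-side and stays independent of this database-level default.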
The following SQL query:
CREATE TABLE "SomeTable" ("dateEnd" DATE)
Creates a table SomeTable with a column dateEnd. However, the database type is TIMESTAMP, not DATE. It used to work, but after reimporting a whole database dump, all the DATE data types are replaced by TIMESTAMP data types. Even if I create a very simple table, like the one above, the data type jumps to TIMESTAMP. I am using Db2 Express-C version 11.1.0.
If your Db2 database was created in Oracle Compatibility mode, then DATE columns are implemented as TIMESTAMP(0) columns to match what Oracle does.
https://www.ibm.com/support/knowledgecenter/SSEPGG_11.1.0/com.ibm.db2.luw.apdv.porting.doc/doc/r0053667.html
https://www.ibm.com/support/knowledgecenter/SSEPGG_11.1.0/com.ibm.db2.luw.admin.config.doc/doc/r0054912.html
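If you want to verify that this is what's happening, the compatibility setting can be inspected; a sketch:

-- From a Db2 command window (not SQL):
--   db2set DB2_COMPATIBILITY_VECTOR
-- ORA, or any value with the DATE bit (0x40) set, makes DATE a synonym
-- for TIMESTAMP(0). Describing the table then shows the effective type:
DESCRIBE TABLE "SomeTable"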
BTW, you may want to use either Db2 Developer-C or the Db2 Developer Community Edition; those are effectively replacing the old Express-C edition.
https://www.ibm.com/uk-en/marketplace/ibm-db2-direct-and-developer-editions
I'm trying to determine whether PostgreSQL keeps internal (but accessible via a query) sequential record ids and/or record creation dates.
In the past I have created a serial id field and a record creation date field, but I have been asked to see if Postgres already does that. I have not found any indication that it does, but I might be overlooking something.
I'm currently using Postgresql 9.5, but I would be interested in knowing if that data is kept in any version.
Any help is appreciated.
Thanks.
No is the short answer.
There is no automatic timestamp for rows in PostgreSQL.
You could create the table with a timestamp column that has a default.
create table foo (
    foo_id serial not null unique
  , created_timestamp timestamp not null default current_timestamp
) without oids;
So
insert into foo values (1);
gives us a row with foo_id = 1 and created_timestamp set to the time of insertion.
You could also have a modified_timestamp column, which you could maintain with a trigger (a BEFORE UPDATE trigger that sets the column on each update); see the sketch below.
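A minimal sketch of that approach, reusing the foo table above (the function and trigger names are illustrative):

alter table foo add modified_timestamp timestamp;

create or replace function touch_modified() returns trigger as $$
begin
  -- stamp the row being written; fires on every update
  new.modified_timestamp := current_timestamp;
  return new;
end;
$$ language plpgsql;

create trigger foo_touch_modified
  before update on foo
  for each row execute procedure touch_modified();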
Hope this helps
Change data type varchar to timestamp, handling empty values, in PostgreSQL
I have a varchar column where some rows are empty and a few contain timestamp values. How do I convert that column to the timestamp data type in PostgreSQL?
You need a USING clause to turn the empty strings into null.
ALTER TABLE ...
ALTER COLUMN mycol
TYPE timestamp
USING (...conversion expression...)
Without seeing the input data it's hard to say exactly what that expression must be, but it probably involves nullif or case expressions and the to_timestamp function and/or a CAST to timestamp.
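For instance, assuming the empty strings are the only problem and the remaining values parse as timestamps, something along these lines might do (table and column names are placeholders):

ALTER TABLE my_table
  ALTER COLUMN mycol
  TYPE timestamp
  USING nullif(trim(mycol), '')::timestamp;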