Need to apply a CHECK constraint and length check on a number in SQL - PostgreSQL

I need a CHECK constraint on a column whose values must have the format 010000 through 129999 with the leading zero preserved, but I don't know how to achieve this. As is probably evident, it's a numeric month-year (mmyyyy).
I have tried numeric(6,0) and integer, but I don't know how to write a CHECK that preserves the leading zero.
I'm also not sure how I could achieve this more easily with character varying(6), and that option isn't preferred either, as I think it will be harder to use in the application layer.
Any suggestions? I'm using Postgres.

Three ways (there may be more):
-- (1) use a date type for a date
CREATE TABLE mmyyyy
( id SERIAL NOT NULL PRIMARY KEY
, yyyymm01 DATE NOT NULL CHECK (date_trunc('month', yyyymm01) = yyyymm01)
);
INSERT INTO mmyyyy(yyyymm01) VALUES
('1901-01-01') ,('0001-01-01') ,('2016-02-01') ;
INSERT INTO mmyyyy(yyyymm01) VALUES ('1901-13-01') ; -- should fail
INSERT INTO mmyyyy(yyyymm01) VALUES ('2016-02-13') ; -- should fail
SELECT id, to_char(yyyymm01, 'mmyyyy') AS this FROM mmyyyy ;
-- (2) use a char type and apply the check on the cast_to_int result
CREATE TABLE omg
( id SERIAL NOT NULL PRIMARY KEY
, mmyyyy varchar(6) NOT NULL CHECK (
length(mmyyyy) = 6 AND
left(mmyyyy,2)::integer BETWEEN 1 AND 12)
);
INSERT INTO omg(mmyyyy) VALUES ('011901') ,('010001') ,('022016') ;
INSERT INTO omg(mmyyyy) VALUES ('131901') ; -- should fail
INSERT INTO omg(mmyyyy) VALUES ('002016') ; -- should fail
SELECT id, mmyyyy FROM omg ;
-- (3) use an int type and apply the check to the value/10000
CREATE TABLE wtf
( id SERIAL NOT NULL PRIMARY KEY
, mmyyyy INTEGER NOT NULL CHECK (
mmyyyy/10000 BETWEEN 1 AND 12)
);
INSERT INTO wtf(mmyyyy) VALUES
(11901) ,(10001) ,(22016)
;
INSERT INTO wtf(mmyyyy) VALUES (131901) ; -- should fail
INSERT INTO wtf(mmyyyy) VALUES (2016) ; -- should fail
SELECT id, to_char(mmyyyy, '099999') AS mmyyyy
FROM wtf
;
-- (extra) use a date/char/int type as the base type for a domain (or type):
-- (this can come in handy if the "type" is used in more than one place)
CREATE DOMAIN omgwtf AS
INTEGER CHECK ( value/10000 BETWEEN 1 AND 12)
;
CREATE TABLE tralala
( id SERIAL NOT NULL PRIMARY KEY
, mmyyyy omgwtf NOT NULL
);
INSERT INTO tralala(mmyyyy) VALUES
(11901) ,(10001) ,(22016)
;
INSERT INTO tralala(mmyyyy) VALUES (131901) ; -- should fail
INSERT INTO tralala(mmyyyy) VALUES (2016) ; -- should fail
SELECT id, to_char(mmyyyy, '099999') AS mmyyyy
FROM tralala
;
The output:
CREATE TABLE
INSERT 0 3
ERROR: date/time field value out of range: "1901-13-01"
LINE 1: INSERT INTO mmyyyy(yyyymm01) VALUES ('1901-13-01') ;
^
HINT: Perhaps you need a different "datestyle" setting.
ERROR: new row for relation "mmyyyy" violates check constraint "mmyyyy_yyyymm01_check"
DETAIL: Failing row contains (4, 2016-02-13).
id | this
----+--------
1 | 011901
2 | 010001
3 | 022016
(3 rows)
CREATE TABLE
INSERT 0 3
ERROR: new row for relation "omg" violates check constraint "omg_mmyyyy_check"
DETAIL: Failing row contains (4, 131901).
ERROR: new row for relation "omg" violates check constraint "omg_mmyyyy_check"
DETAIL: Failing row contains (5, 002016).
id | mmyyyy
----+--------
1 | 011901
2 | 010001
3 | 022016
(3 rows)
CREATE TABLE
INSERT 0 3
ERROR: new row for relation "wtf" violates check constraint "wtf_mmyyyy_check"
DETAIL: Failing row contains (4, 131901).
ERROR: new row for relation "wtf" violates check constraint "wtf_mmyyyy_check"
DETAIL: Failing row contains (5, 2016).
id | mmyyyy
----+---------
1 | 011901
2 | 010001
3 | 022016
(3 rows)
CREATE DOMAIN
CREATE TABLE
INSERT 0 3
ERROR: value for domain omgwtf violates check constraint "omgwtf_check"
ERROR: value for domain omgwtf violates check constraint "omgwtf_check"
id | mmyyyy
----+---------
1 | 011901
2 | 010001
3 | 022016
(3 rows)

I ended up using a yyyymm format, as suggested by @lad2025.
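For the record, a minimal sketch of what such a yyyymm integer column could look like with a CHECK on the month part (the table and column names here are made up for illustration):
-- a sketch, assuming values like 201602 (yyyymm); names are hypothetical
CREATE TABLE report_period
( id SERIAL NOT NULL PRIMARY KEY
, yyyymm INTEGER NOT NULL CHECK (yyyymm % 100 BETWEEN 1 AND 12)
);
INSERT INTO report_period(yyyymm) VALUES (190101), (201602); -- ok
INSERT INTO report_period(yyyymm) VALUES (201613); -- should fail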

Related

How do I update a value within a table that has a constraint?

Example Table: id_rel
 id | other_id
----+----------
  1 | 123
  2 | 456
  3 | 123
There is a unique constraint on the columns id and other_id. The table is a relation table. I'd like to update all '123' values to '456', a value which already exists in the table. I've tried something as simple as:
UPDATE id_rel
SET other_id = 456
WHERE other_id = 123;
When I try the above I get an error like the following:
ERROR: duplicate key value violates unique constraint "id_rel" Detail: Key (id, other_id)=(1, 456) already exists.
How can I change these values without having to remove the constraints and basically rebuild the table?
The value '456' is covered by a unique constraint, and a row with that key already exists.
You have to merge the two records, or delete the one that already occupies the constrained value.
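A hedged sketch of one way to do the merge, assuming the rows that would collide can simply be removed before the update (names and values taken from the example above):
BEGIN;
-- drop the rows whose update would collide with an existing (id, other_id) pair
DELETE FROM id_rel r
WHERE r.other_id = 123
  AND EXISTS (SELECT 1 FROM id_rel x WHERE x.id = r.id AND x.other_id = 456);
-- the remaining rows can now be updated without violating the constraint
UPDATE id_rel SET other_id = 456 WHERE other_id = 123;
COMMIT;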

insert multiple rows into table with column that has default value

I have a table in PostgreSQL and one of the columns has a default value.
DDL of the table is:
CREATE TABLE public.my_table_name
(int_column_1 character varying(6) NOT NULL,
text_column_1 character varying(20) NOT NULL,
text_column_2 character varying(15) NOT NULL,
default_column numeric(10,7) NOT NULL DEFAULT 0.1,
time_stamp_column date NOT NULL);
I am trying to insert multiple rows in a single query. For some of those rows I have a value for default_column, and for others I don't and want Postgres to use the default value for them.
Here's what I tried:
INSERT INTO "my_table_name"(int_column_1, text_column_1, text_column_2, default_column, time_stamp_column)
VALUES
(91,'text_row_11','text_row_21',8,current_timestamp),
(91,'text_row_12','text_row_22',,current_timestamp),
(91,'text_row_13','text_row_23',19,current_timestamp),
(91,'text_row_14','text_row_24',,current_timestamp),
(91,'text_row_15','text_row_25',27,current_timestamp);
This gives me an error. So, when I try to insert:
INSERT INTO "my_table_name"(int_column_1, text_column_1, text_column_2, default_column, time_stamp_column)
VALUES (91,'text_row_12','text_row_22',,current_timestamp), -- I want null to be appended here, so I left it empty.
--error from this query is: ERROR: syntax error at or near ","
and
INSERT INTO "my_table_name"(int_column_1, text_column_1, text_column_2, default_column, time_stamp_column)
VALUES (91,'text_row_14','text_row_24',NULL,current_timestamp),
-- error from this query is: ERROR: new row for relation "glycemicindxdir" violates check constraint "food_item_check"
So, how do I fix this and insert a value when I have one, or have Postgres insert the default when I don't?
Use the default keyword:
INSERT INTO my_table_name
(int_column_1, text_column_1, text_column_2, default_column, time_stamp_column)
VALUES
(91, 'text_row_11', 'text_row_21', 8 , current_timestamp),
(91, 'text_row_12', 'text_row_22', default, current_timestamp),
(91, 'text_row_13', 'text_row_23', 19 , current_timestamp),
(91, 'text_row_14', 'text_row_24', default, current_timestamp),
(91, 'text_row_15', 'text_row_25', 27 , current_timestamp);

unique date field postgresql default value

I have a date column which I want to be unique once populated, but want the date field to be ignored if it is not populated.
In MySQL the way this is accomplished is to set the date column to "not null" and give it a default value of '0000-00-00' - this allows all other fields in the unique index to be "checked" even if the date column is not populated yet.
This does not work in PostgreSQL because '0000-00-00' is not a valid date, so you cannot store it in a date field (this makes sense to me).
At first glance, leaving the field nullable seemed like an option, but this creates a problem:
=> create table uniq_test(NUMBER bigint not null, date DATE, UNIQUE(number, date));
CREATE TABLE
=> insert into uniq_test(number) values(1);
INSERT 0 1
=> insert into uniq_test(number) values(1);
INSERT 0 1
=> insert into uniq_test(number) values(1);
INSERT 0 1
=> insert into uniq_test(number) values(1);
INSERT 0 1
=> select * from uniq_test;
number | date
--------+------
1 |
1 |
1 |
1 |
(4 rows)
NULL apparently "isn't equal to itself" and so it does not count towards constraints.
If I add an additional unique constraint only on the number field, it checks only number and not date, and so I cannot have the same number twice with different dates.
I could pick a default date that is a valid date but outside the working scope to get around this, and could in fact get away with that for the current project, but there are cases I may encounter in the next few years where it will not be evident that the date is a placeholder just because it is "a long time ago" or "in the future."
The advantage the '0000-00-00' mechanic had for me was precisely that this date isn't real and therefore indicated a non-populated entry (where 'non-populated' was a valid uniqueness attribute). When I look around for solutions to this on the internet, most of what I find is "just use NULL" and "storing zeros is stupid."
TL;DR
Is there a PostgreSQL best practice for needing to include "not populated" as a possible value in a unique constraint including a date field?
Not clear what you want. This is my guess:
create table uniq_test (number bigint not null, date date);
create unique index i1 on uniq_test (number, date)
where date is not null;
create unique index i2 on uniq_test (number)
where date is null;
There will be a unique constraint for non-null dates and another one for null dates, effectively turning the (number, date) tuples into distinct values.
See the documentation on partial indexes.
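A quick sketch of how the two partial indexes behave, using the uniq_test table from above:
insert into uniq_test(number, date) values (1, '2016-02-01'); -- ok
insert into uniq_test(number, date) values (1, '2016-03-01'); -- ok, different date
insert into uniq_test(number, date) values (1, '2016-02-01'); -- fails on i1
insert into uniq_test(number) values (1);                     -- ok, date is null
insert into uniq_test(number) values (1);                     -- fails on i2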
It's not a best practice, but you can do it this way:
t=# create table so35(i int, d date);
CREATE TABLE
t=# create unique index i35 on so35(i, coalesce(d,'-infinity'));
CREATE INDEX
t=# insert into so35 (i) select 1;
INSERT 0 1
t=# insert into so35 (i) select 2;
INSERT 0 1
t=# insert into so35 (i) select 2;
ERROR: duplicate key value violates unique constraint "i35"
DETAIL: Key (i, (COALESCE(d, '-infinity'::date)))=(2, -infinity) already exists.
STATEMENT: insert into so35 (i) select 2;
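One more option, hedged because it depends on your version: PostgreSQL 15 and later support UNIQUE NULLS NOT DISTINCT, which makes nulls compare as equal inside a unique constraint and covers the "not populated" case directly:
-- requires PostgreSQL 15 or later
create table uniq_test (number bigint not null, date date,
                        unique nulls not distinct (number, date));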

DB Validation on insert or update allow only one category based on order ID

I would like to add database-side validation to allow only one category per Order ID, using SQL constraints or a CHECK constraint.
Table: order_line_table
Example: allowed - inserts or updates that use the same category for an order id
Id  Order_id  Categ_id
 1      1        4
 2      1        4
 3      1        4
 4      2        5
 5      2        5
Example: not allowed - inserts or updates that use a different category for an order id
Id  Order_id  Categ_id
 6      3        4
 7      3        5
I tried the code below and it works on the server side, but the validation is not triggered when going through the XML-RPC web service.
@api.one
@api.constrains('order_line')
def _check_category(self):
    # collect every category id referenced by the order lines
    list_categ = []
    filter_categ = []
    if self.order_line:
        order_line_vals = self.order_line
        for line_vals in order_line_vals:
            for line in line_vals:
                categ_id = line.categ_id and line.categ_id.id or False
                list_categ.append(line.categ_id.id)
                if isinstance(line, dict):
                    list_categ.append(line['categ_id'])
        # more than one distinct category for the order is an error
        filter_categ = list(set(list_categ))
        if len(filter_categ) > 1:
            raise UserError(_('Only one product category is allowed!'))
At first I misunderstood your question, so I'm updating the answer.
To achieve your goal you could use an EXCLUDE constraint in PostgreSQL:
CREATE TABLE order_line_table
(
Id SERIAL PRIMARY KEY,
Order_id INT,
Categ_id INT,
EXCLUDE USING GIST
(
Order_id WITH =,
Categ_id WITH <>
)
);
To support a GiST index over the <> operator for these column types, you first have to install the additional PostgreSQL extension btree_gist (run this before creating the table):
CREATE EXTENSION btree_gist;
Demo:
# INSERT INTO order_line_table (Order_id, Categ_id) VALUES (1, 2);
INSERT 0 1
# INSERT INTO order_line_table (Order_id, Categ_id) VALUES (1, 2);
INSERT 0 1
# INSERT INTO order_line_table (Order_id, Categ_id) VALUES (1, 3);
ERROR: conflicting key value violates exclusion constraint "orders_order_id_category_id_excl"
DETAIL: Key (Order_id, Categ_id)=(1, 3) conflicts with existing key (Order_id, Categ_id)=(1, 2).

How to fill a nullable integer column and convert it into a serial primary key in Postgresql?

My table contains an integer column (gid) which is nullable:
 gid | value
-----+-------
   0 | a
     | b
   1 | c
   2 | d
     | e
Now I would like to change the gid column into a SERIAL primary key column. That means filling up the empty slots with new integers. The existing integers must remain in place. So the result should look like:
 gid | value
-----+-------
   0 | a
   3 | b
   1 | c
   2 | d
   4 | e
I just can't figure out the right SQL commands for doing the transformation. A code sample would be appreciated...
A serial is "just" a column that takes its default value from a sequence.
Assuming your table is named n1000 then the following will do what you want.
The first thing you need to do is to create that sequence:
create sequence n1000_gid_seq;
Then you need to make that the "default" for the column:
alter table n1000 alter column gid set default nextval('n1000_gid_seq');
To truly create a "serial" you also need to tell the sequence that it is associated with the column:
alter sequence n1000_gid_seq owned by n1000.gid;
Then you need to advance the sequence so that the next value doesn't collide with the existing values:
select setval('n1000_gid_seq', (select max(gid) from n1000), true);
And finally you need to update the missing values in the table:
update n1000
set gid = nextval('n1000_gid_seq')
where gid is null;
Once this is done, you can define the column as the PK:
alter table n1000
add constraint pk_n1000
primary key (gid);
And of course if you have turned off autocommit you need to commit all this.
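Putting it together, a sketch of the whole conversion as a single transaction (still assuming the table is named n1000):
begin;
create sequence n1000_gid_seq;
alter table n1000 alter column gid set default nextval('n1000_gid_seq');
alter sequence n1000_gid_seq owned by n1000.gid;
select setval('n1000_gid_seq', (select max(gid) from n1000), true);
update n1000 set gid = nextval('n1000_gid_seq') where gid is null;
alter table n1000 add constraint pk_n1000 primary key (gid);
commit;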