Is there any way to add a constraint on a column that is an array to limit the length of its text elements?
I know that I can do this without a constraint:
colA varchar(100)[] not null
I tried to do it in the following way:
alter table "tableA" ADD CONSTRAINT "colA_text_size"
CHECK ((SELECT max(length(pc)) from unnest(colA) as pc) <= 100) NOT VALID;
alter table "tableA" VALIDATE CONSTRAINT colA_text_size;
But I got this error: cannot use subquery in check constraint (SQLSTATE 0A000)
Try the following definition for your check constraint (see demo; in the demo the limit is 25, here it is 100 to match your column):
check (length(replace(array_to_string(text_array, ','), ',', '')) <= 100)
What it does:
First the function array_to_string( ... ) converts the array to a CSV string.
The replace() function then removes the commas, replacing them with the zero-length string ''.
The length() function counts the remaining characters in the string.
Finally that number is compared to the limit value (100) and the check constraint either passes or fails.
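Putting it together against the original table (a sketch, substituting the question's colA for the demo's text_array column):
alter table "tableA" ADD CONSTRAINT "colA_text_size"
CHECK (length(replace(array_to_string(colA, ','), ',', '')) <= 100) NOT VALID;
alter table "tableA" VALIDATE CONSTRAINT "colA_text_size";
Note this limits the combined length of all elements in the array; the per-element limit is still enforced by the varchar(100) element type.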
References:
array_to_string(),
replace(), length()
We have a table with 2 columns (both have the same type and size) and 2 constraints for them:
create table colors
(
color varchar(6)
constraint color_check check
((color)::text ~ '^[0-9a-fA-F]{6}$'::text),
color_secondary varchar(6)
constraint color_secondary_check check
((color_secondary)::text ~ '^[0-9a-fA-F]{6}$'::text)
);
In case of inserts with long values:
insert into colors (color, color_secondary) values ('ccaabb', 'TOO_LONG_TEXT');
insert into colors (color, color_secondary) values ('TOO_LONG_TEXT', 'ccaabb');
we'll get the same error for both cases:
ERROR: value too long for type character varying(6) (SQLSTATE 22001)
PostgreSQL validates the length of those columns before performing the insert, so our checks never run. Is there a way to tell which column contains the invalid data?
The issue you are having is the order of evaluation for the intended values. You told Postgres not to allow a length over 6 (character varying(6)), and you also specified additional criteria those values have to satisfy. What happens is that Postgres validates the length first and throws an exception when the value fails; in that case the check constraint is not performed, as Postgres exits on the first failure. The check constraint is processed only after the length check passes. Example:
create table test1( id integer generated always as identity
, color6 character varying (6)
constraint color6_check check (color6 ~ '^[0-9a-fA-F]{6}$')
, color60 character varying (60)
constraint color60_check check (color60 ~ '^[0-9a-fA-F]{6}$')
) ;
insert into test1( color6 ) values ('aabbccdd') ;
/* Result
SQL Error [22001]: ERROR: value too long for type character varying(6)
*/
insert into test1( color60 ) values ('aabbccdd') ;
/* Result
SQL Error [23514]: ERROR: new row for relation "test1" violates check constraint "color60_check"
Detail: Failing row contains (3, null, aabbccdd).
*/
Notice the only difference between them is the length specification of the column being inserted, yet they fail for different reasons. Since both the length specification and the check constraint enforce the length, you need to decide how you want to handle the two conditions: a separate error for each condition or a single error for both. (IMHO: separate messages.)
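If you want a single error for both conditions, one option (a sketch, not from the original example) is to drop the length limit from the column type and let the check constraint enforce everything, since the regexp already pins the value to exactly 6 characters:
create table test2( id integer generated always as identity
, color character varying -- no length limit; the check does all the work
constraint color_check check (color ~ '^[0-9a-fA-F]{6}$')
) ;
insert into test2( color ) values ('aabbccdd') ;
/* Result
SQL Error [23514]: ERROR: new row for relation "test2" violates check constraint "color_check"
*/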
I have data in my database and I need to select all rows where one column's number is between 1 and 100.
I'm having problems because I can't use between 1 and 100; that column is character varying, not integer. But all the data are numbers (I can't change it to integer).
Code:
dst_db1.eachRow("Select length_to_fault from diags where length_to_fault between 1 AND 100")
Error - operator does not exist: character varying >= integer
Since your column is supposed to contain numeric values but is defined as text (or a variant of text), there will be times when it does not. You need 2 validations: that the column actually contains numeric data and that it falls into your value restriction. So add the following predicates to your query.
and length_to_fault ~ '^\+?\d+(\.\d*)?$'
and length_to_fault::numeric <@ ('[1.0,100.0]')::numrange;
The first is a regexp that ensures the column holds a valid floating point value. The second ensures the numeric value falls within the specified numeric range. See fiddle.
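Applied to the query from the question, this gives (sketch):
Select length_to_fault
from diags
where length_to_fault ~ '^\+?\d+(\.\d*)?$'
and length_to_fault::numeric <@ ('[1.0,100.0]')::numrange;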
I understand you cannot change the database, but this looks like a good place for a check constraint, especially if 'n/a' is the only non-numeric value allowed. You may want to talk with your DBA and see about the following constraint.
alter table diags
add constraint length_to_fault_check
check ( lower(length_to_fault) = 'n/a'
or ( length_to_fault ~ '^\+?\d+(\.\d*)?$'
and length_to_fault::numeric <@ ('[1.0,100.0]')::numrange
)
);
Then your query need only check that:
lower(length_to_fault) != 'n/a'
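With the constraint in place, the original query reduces to (sketch):
Select length_to_fault
from diags
where lower(length_to_fault) != 'n/a';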
The PostgreSQL query below will work:
SELECT length_to_fault FROM diags WHERE regexp_replace(length_to_fault, '[\s+]', '', 'g')::numeric BETWEEN 1 AND 100;
I wanted to create a function with multiple parameters but it gives me this error:
CREATE FUNCTION failed because a column name is not specified for
column 1.
Code below:
create function dmt.Impacted(
@nameOfColumn varchar , @nameOfParam varchar)
returns table
as
return
(select
case when '['+@nameOfColumn+']' is null or len(rtrim('['+@nameOfColumn+']')) = 0
then Convert(nvarchar(2),0)
else
@nameOfParam end from employee) ;
As the error message says, the column in the returned result needs a name. Either give it an alias in the SELECT like
SELECT CASE
...
END a_column_name
...
or define it in the declaration of the return type as in
...
RETURNS TABLE
(a_column_name nvarchar(max)
...
As you can see, in the second form you have to specify a data type. As your current code doesn't make much sense, I cannot figure out what the right one is there; you'd need to amend it.
Note that len(rtrim('['+@nameOfColumn+']')) = 0 is never true, as len(rtrim('['+@nameOfColumn+']')) is either NULL, when @nameOfColumn is NULL, or at least 2 because of the added brackets.
If @nameOfColumn is supposed to be a column name you shouldn't use varchar (especially without a length specified for it) but sysname, which is a special type for object names.
Either way you should define a length for @nameOfColumn and @nameOfParam, as just varchar without any length means varchar(1), which is probably not what you want. And maybe instead of varchar you want nvarchar.
You may also want to look into quotename().
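A minimal sketch pulling these suggestions together (the parameter types, lengths, and column name are assumptions; the bracket concatenation is dropped so the zero-length test can actually fire):
create function dmt.Impacted(
@nameOfColumn sysname , @nameOfParam nvarchar(100))
returns table
as
return
(select
case when @nameOfColumn is null or len(rtrim(@nameOfColumn)) = 0
then Convert(nvarchar(2),0)
else
@nameOfParam end as a_column_name -- the alias fixes the original error
from employee) ;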
Define the name of the column in the SELECT statement:
(select case when '['+@nameOfColumn+']' is null or
len(rtrim('['+@nameOfColumn+']')) = 0
then Convert(nvarchar(2),0)
else @nameOfParam
end as name_column -- define column name
from employee)
Also, your function parameters have no data length; by default @nameOfColumn varchar , @nameOfParam varchar will accept only 1 character each, and the rest will be trimmed.
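A quick illustration of that pitfall (hypothetical value):
declare @nameOfColumn varchar = 'employee_id' ;
select @nameOfColumn ; -- returns 'e', because varchar without a length is varchar(1) in a declaration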
I'm trying to insert an alphanumeric value into a table:
INSERT INTO solution (solution, nextsolution) VALUES
('9Na_(2)SO_(4)', NULL), ('2Ni(OH)_(3)', (SELECT id FROM solution WHERE solution='9Na_(2)SO_(4)' & nextsolution=null));
solution is of type text and nextsolution is an integer. Unfortunately PostgreSQL doesn't accept my WHERE clause. It gives me the error:
ERROR: invalid input syntax for integer: "9Na_(2)SO_(4)"
LINE 9: ...OH)_(3)', (SELECT id FROM solution WHERE solution='9Na_(2)SO...
How can I solve this?
The issue is that the expression in the where clause: '9Na_(2)SO_(4)' & nextsolution=null tries to do a bitwise and (&) operation on the string, and this won't work (and probably isn't what you want anyway).
Looking at your query I think what you want is to first insert the value '9Na_(2)SO_(4)' and then the value '2Ni(OH)_(3)' with the id of the previously inserted row.
You need to do this as two statements and use a different syntax. This should do what you want:
INSERT INTO solution (solution, nextsolution) VALUES (
'9Na_(2)SO_(4)',
NULL
);
INSERT INTO solution (solution, nextsolution) VALUES (
'2Ni(OH)_(3)',
(SELECT id FROM solution WHERE solution='9Na_(2)SO_(4)' and nextsolution is null)
);
You need to use AND instead of & to join your WHERE clause - an ampersand (&) is used for bitwise operations.
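For reference, & really is the bitwise AND operator for integers in PostgreSQL:
SELECT 5 & 3; -- returns 1 (binary 101 AND 011 = 001)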
I have a query in postgres
insert into c_d (select * from cd where ak = '22019763');
And I get the following error
ERROR: column "region" is of type integer but expression is of type character varying
HINT: You will need to rewrite or cast the expression.
An INSERT INTO table1 SELECT * FROM table2 depends entirely on the order of the columns, which is part of the table definition. It will line each column of table1 up with the column of table2 in the same ordinal position, regardless of names.
The problem you have here is that whatever column from cd sits in the same position as c_d's "region" column has an incompatible type, and an implicit typecast is not available to clear the confusion.
INSERT INTO ... SELECT * statements are stylistically bad form unless the two tables are defined, and will forever be defined, exactly the same way. All it takes is for a single extra column to be added to cd, and you'll start getting errors about extraneous columns.
If it is at all possible, what I would suggest is explicitly calling out the columns within the SELECT statement. You can call a function to change the type within each of the column references (or you could define a new typecast to do this implicitly -- see CREATE CAST), and you can use AS to set the column label to match that of your target column.
If you can't do this for some reason, indicate that in your question.
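For example (a sketch; only ak and region are known from the question, the remaining columns are omitted, and the cast assumes every region value is a valid integer literal):
INSERT INTO c_d (ak, region)
SELECT ak, region::integer -- explicit cast clears the type mismatch
FROM cd
WHERE ak = '22019763';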
Check out the PostgreSQL insert documentation. The syntax is:
INSERT INTO table [ ( column [, ...] ) ]
{ DEFAULT VALUES | VALUES ( { expression | DEFAULT } [, ...] ) | query }
which here would look something like:
INSERT INTO c_d (column1, column2...) select * from cd where ak = '22019763'
This is the syntax you want to use when inserting values from one table into another where the column types and order are not exactly the same.