How to set 'int' datatype for a column with n/a values - PostgreSQL

I have a table in Postgres with a column of age values. The column also contains 'n/a' values.
When I apply the condition age < 15, I get the error below:
[Code: 0, SQL State: 22P02] ERROR: invalid input syntax for integer: "n/a"
I am using the query below to handle the 'n/a' values, but I still get the same error:
ALTER TABLE tb
ADD COLUMN col CHARACTER VARYING;

UPDATE tb
SET col =
  CASE
    WHEN age::int <= 15 THEN 'true'
    ELSE 'false'
  END;
Note that 'age' is stored as text in my table. I have two questions here:
1. How can I set the data type while creating the initial table (in the CREATE TABLE statement)?
2. How can I handle 'n/a' values in the above CASE statement?
Thanks

You should really fix your data model and store numbers in integer columns.
You can work around your current problem by converting the invalid "numbers" to NULL:
UPDATE tb
SET col = CASE
            WHEN nullif(age, 'n/a')::int <= 15 THEN 'true'
            ELSE 'false'
          END;
And it seems col should be a boolean rather than a text column as well.
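For the first question, declare the column as integer in the CREATE TABLE statement and store NULL (rather than the text 'n/a') for unknown ages. A minimal sketch, assuming a table like the one described (the id column is a hypothetical addition):

```sql
CREATE TABLE tb (
    id  serial PRIMARY KEY,  -- hypothetical key column
    age integer,             -- NULL, not 'n/a', marks a missing age
    col boolean              -- true when age <= 15
);
```

With age stored as an integer, a condition such as age < 15 works directly, and NULL ages simply drop out of the comparison.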

Related

Divide a value in JSON using PostgreSQL

I'm relatively new and would like to redenominate some values in my current database. This means going into the jsonb column, selecting a key's value, and dividing it by 1000. I know how to select values, but updating after performing a calculation has failed me. My table name is property_calculation and it has two columns as follows (dynamic_fields is my jsonb column):
ID | dynamic_fields
1  | {"totalBaseValue": 4198571.230720645844841865113039602874778211116790771484375, "surfaceAreaValue": 18.108285497586717127660449477843940258026123046875, "assessedAnnualValue": 1801819.534798908603936834409607936611000776616631213755681528709828853607177734375}
2  | {"totalBaseValue": 7406547.28939837918763178237213651300407946109771728515625, "surfaceAreaValue": 31.94416993248973568597648409195244312286376953125, "assessedAnnualValue": 9121964.022681592442116216621222042691512210677018401838722638785839080810546875}
I would like to update the dynamic_fields.totalBaseValue by dividing it by 1000 and committing it back as the new value. I have tried the following with no success:
update property_calculation
set dynamic_fields = (
    select jsonb_agg(case
                       when jsonb_typeof(elem -> 'totalBaseValue') = 'number'
                       then jsonb_set(elem, array['totalBaseValue'],
                                      to_jsonb((elem ->> 'totalBaseValue')::numeric / 1000))
                       else elem
                     end)
    from jsonb_array_elements(dynamic_fields::jsonb) elem)::json;
I get the following error:
ERROR: cannot extract elements from an object
SQL state: 22023
My jsonb column contains no empty strings or null values.
jsonb_array_elements() raises that error because each dynamic_fields value is a single JSON object, not an array. Operate on the object directly and move the jsonb_typeof() check into the where clause:
update property_calculation
set dynamic_fields = jsonb_set(
        dynamic_fields,
        '{totalBaseValue}',
        to_jsonb((dynamic_fields ->> 'totalBaseValue')::numeric / 1000)
    )
where jsonb_typeof(dynamic_fields -> 'totalBaseValue') = 'number';

Data type error using case statement from Varchar(Max) source

I have a table with data that ranges through these values:
-0.0011086463928222656,
0,
9.318138472735882e-5,
NA
The data type of the source is VARCHAR(MAX).
I am developing the SELECT statement below to insert into another staging table.
SELECT CASE
         WHEN air_wpa = 'NA' THEN 0.0
         WHEN air_wpa = 0.0 THEN 0.00
         ELSE CONVERT(NUMERIC(18, 9), air_wpa)
       END AS air_wpa
FROM Table
I get this error.
Error converting data type varchar to numeric.
Is there a better way to do this?
When you compare air_wpa = 0.0 in the CASE expression, SQL Server's data type precedence rules apply: when a varchar column is compared with a decimal constant, every value in the column (except those matched by an earlier WHEN clause) is converted to decimal to perform the comparison.
To avoid that conversion, use a varchar constant in the comparison. Additionally, if you are on SQL Server 2012 or later, you can use TRY_CONVERT to get NULL (instead of an error) for any value that cannot be converted to NUMERIC(18, 9).
SELECT CASE
         WHEN air_wpa = 'NA' THEN 0.0
         WHEN air_wpa = '0.0' THEN 0.00
         ELSE TRY_CONVERT(NUMERIC(18, 9), air_wpa)
       END AS air_wpa
FROM TheTable
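One caveat not covered in the answer above: SQL Server cannot convert a varchar in scientific notation (such as '9.318138472735882e-5' from the sample data) directly to NUMERIC, so TRY_CONVERT returns NULL for those rows. Routing the value through FLOAT first is a possible workaround; a sketch under that assumption:

```sql
SELECT CASE
         WHEN air_wpa = 'NA' THEN 0.0
         WHEN air_wpa = '0.0' THEN 0.00
         ELSE TRY_CONVERT(NUMERIC(18, 9), TRY_CONVERT(FLOAT, air_wpa))
       END AS air_wpa
FROM TheTable
```

The inner TRY_CONVERT parses the scientific notation as a float; the outer one rounds the result into the fixed-precision NUMERIC(18, 9) value.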

How to change date format of a column based on regex in PostgreSQL11.0

I have a table in PostgreSQL 11.0 with the following column containing dates (column type: character varying):
id | date_col
1  | April2006
2  | May2005
3  | null
4  |
5  | May16,2019
I would like to convert the column to type date.
As there are two different date formats, I am using a CASE statement to alter the column type based on the date pattern.
select *,
case
when date_col ~ '^[A-Za-z]+\d+,\d+' then alter table tbl alter date_col type date using to_date((NULLIF(date_col , 'null')), 'MonthDD,YYYY')
when date_col ~ '^[A-Za-z]+,\d+' then alter table tbl alter date_col type date using to_date((NULLIF(date_col, 'null')), 'MonthYYYY')
else null
end
from tbl
I am getting following error:
[Code: 0, SQL State: 42601] ERROR: syntax error at or near "table"
Position: 93 [Script position: 93 - 98]
The expected output is:
id | date_col
1  | 2006-04-01
2  | 2005-05-01
3  | null
4  |
5  | 2019-05-16
Any help is highly appreciated!!
You definitely can't alter a column one row at a time. Your better bet is to update the existing values so they all use the same format, then issue a single ALTER statement.
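A sketch of that approach, assuming the column contains only the two formats shown plus 'null' and empty strings, and that the default ISO DateStyle is in effect:

```sql
-- clear out the placeholder values first
UPDATE tbl SET date_col = NULL
WHERE date_col IN ('null', '');

-- normalize 'May16,2019'-style values to ISO text
UPDATE tbl SET date_col = to_date(date_col, 'MonthDD,YYYY')::text
WHERE date_col ~ '^[A-Za-z]+\d+,\d+$';

-- normalize 'April2006'-style values to ISO text
UPDATE tbl SET date_col = to_date(date_col, 'MonthYYYY')::text
WHERE date_col ~ '^[A-Za-z]+\d+$';

-- every remaining value is now 'YYYY-MM-DD' or NULL, so one ALTER suffices
ALTER TABLE tbl ALTER COLUMN date_col TYPE date USING date_col::date;
```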

Check if character varying is between range of numbers

I have data in my database and I need to select all rows where one column's value is between 1 and 100.
I'm having problems because I can't use "between 1 and 100": the column is character varying, not integer. But all the data are numbers (and I can't change the column to integer).
Code:
dst_db1.eachRow("Select length_to_fault from diags where length_to_fault between 1 AND 100")
Error: operator does not exist: character varying >= integer
Since your column is supposed to contain numeric values but is defined as text, there will be rows where it does not. You need two validations: that the column actually contains numeric data, and that the value falls within your range. So add the following predicates to your query:
and length_to_fault ~ '^\+?\d+(\.\d*)?$'
and length_to_fault::numeric <# ('[1.0,100.0]')::numrange;
The first is a regexp that ensures the column holds a valid floating-point value. The second ensures the numeric value falls within the specified range.
I understand you cannot change the database, but this looks like a good place for a check constraint, especially if 'n/a' is the only non-numeric value allowed. You may want to talk with your DBA and consider the following constraint:
alter table diags
add constraint length_to_fault_check
check ( lower(length_to_fault) = 'n/a'
or ( length_to_fault ~ '^\+?\d+(\.\d*)?$'
and length_to_fault::numeric <# ('[1.0,100.0]')::numrange
)
);
Then your query need only check that:
lower(lenth_to_fault) != 'n/a'
The PostgreSQL query below will also work:
SELECT length_to_fault FROM diags WHERE regexp_replace(length_to_fault, '[\s+]', '', 'g')::numeric BETWEEN 1 AND 100;
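PostgreSQL does not guarantee the evaluation order of AND-ed predicates, so in the queries above the numeric cast could, in principle, run before the regex filter and still fail on a non-numeric value. Wrapping the logic in a CASE expression forces the validation to happen first; a sketch:

```sql
SELECT length_to_fault
FROM diags
WHERE CASE
        WHEN length_to_fault ~ '^\+?\d+(\.\d*)?$'
        THEN length_to_fault::numeric BETWEEN 1 AND 100
        ELSE false
      END;
```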

How to insert value into uuid column in Postgres?

I have a table with a uuid column, and some of the rows are missing the data. I need to insert data into this uuid column. The data is entered manually, so we are suffixing it with data from another column to differentiate the rows, but this gives me an error.
UPDATE schema.table
SET uuid_column = CONCAT ('f7949f56-8840-5afa-8c6d-3b0f6e7f93e9', '-', id_column)
WHERE id_column = '1234';
Error: [42804] ERROR: column "uuid_column" is of type uuid but expression is of type text
Hint: You will need to rewrite or cast the expression.
Position: 45
I also tried
UPDATE schema.table
SET uuid_column = CONCAT ('f7949f56-8840-5afa-8c6d-3b0f6e7f93e9', '-', id_column)::uuid
WHERE id_column = '1234';
Error: [22P02] ERROR: invalid input syntax for uuid: "f7949f56-8840-5afa-8c6d-3b0f6e7f93e9-1234"
A UUID consists of exactly 16 bytes, which you see displayed in hexadecimal notation.
You cannot have a UUID with fewer or more bytes.
If you really need to store such a suffixed value, I recommend using the type bytea instead.
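If the goal is simply a deterministic, distinct UUID per row, one alternative (a sketch, not part of the original answer) is to hash the combined text down to a valid 16-byte value; md5() returns 32 hex digits, which PostgreSQL accepts as uuid input:

```sql
UPDATE schema.table
SET uuid_column = md5('f7949f56-8840-5afa-8c6d-3b0f6e7f93e9' || '-' || id_column)::uuid
WHERE id_column = '1234';
```

Each distinct id_column suffix then yields a different, reproducible uuid value that fits the column's type.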