Postgres Alter table to convert column type from char to bigint [duplicate] - postgresql

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
how to change column datatype from character to numeric in postgresql 8.4
If I have a field of type varchar (and all the values are null or string representations of numbers) how do I use alter table to convert this column type to bigint?

To convert simply by parsing the string (casting):
alter table the_table alter column the_column type bigint using the_column::bigint
In fact, you can use any expression in terms of the_column instead of the_column::bigint to customise the conversion.
Note that this rewrites the entire table, taking an ACCESS EXCLUSIVE lock that blocks even readers until it completes.

You could create a temporary column of type bigint, and then execute SQL like
UPDATE my_table SET bigint_column=varchar_column::bigint;
Then drop your varchar_column and rename bigint_column. This is somewhat roundabout, but does not require a custom cast in Postgres.
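Spelled out, the whole sequence might look like this (table and column names are placeholders from the description above):

```sql
-- Add a temporary bigint column, copy the data over, then swap names
ALTER TABLE my_table ADD COLUMN bigint_column bigint;
UPDATE my_table SET bigint_column = varchar_column::bigint;
ALTER TABLE my_table DROP COLUMN varchar_column;
ALTER TABLE my_table RENAME COLUMN bigint_column TO varchar_column;
```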

How to convert a string column type to numeric or bigint in postgresql
Design your own custom cast from string to bigint. Something like this:
CREATE OR REPLACE FUNCTION convert_to_bigint(v_input text)
RETURNS BIGINT AS $$
DECLARE
    v_bigint_value BIGINT DEFAULT NULL;
BEGIN
    BEGIN
        v_bigint_value := v_input::BIGINT;
    EXCEPTION WHEN OTHERS THEN
        RAISE NOTICE 'Invalid bigint value: "%". Returning 0.', v_input;
        RETURN 0;
    END;
    RETURN v_bigint_value;
END;
$$ LANGUAGE plpgsql;
Then create a new table fixed_table_with_bigint with the same parameters as the old table except change the string column into the bigint column.
Then insert all the rows from the previous table (using the custom cast convert_to_bigint) into the new table:
insert into fixed_table_with_bigint
select mycolumn1,
convert_to_bigint(your_string_bigint_column),
mycolumn3
from incorrect_table
You may have to modify convert_to_bigint to handle strings that are not numbers, blank strings, NULLs, control characters, and other weirdness.
Then drop the first table and rename the second table to the first table's name.
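That final swap could be sketched as follows, assuming the table names used above:

```sql
-- Replace the old table with the corrected one
DROP TABLE incorrect_table;
ALTER TABLE fixed_table_with_bigint RENAME TO incorrect_table;
```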

Related

Cannot assign/cast PostgreSQL record with JSONB to an HSTORE column

I'm trying to create a trash table, where I can store deleted entities, and have the ability to restore them manually as needed. To make this more complicated, my tables have JSONB, HSTORE, GEOMETRY, and BYTEA columns.
Inserting is pretty simple:
insert into trash (original_table, original_id, content) select 'something', id, to_jsonb(something) from something where ...;
But HSTORE seems to cause problems when I'm trying to read data back from trash:
select t.* from trash, jsonb_to_record(content) as t(id bytea, created_at timestamp, attributes hstore, ...);
(content is a column of type JSONB.)
GEOMETRY, BYTEA, and JSONB columns get assigned/cast just fine. I know that JSONB cannot be automatically cast to HSTORE, so I created this CAST:
CREATE FUNCTION jsonb_to_hstore(j JSONB) RETURNS HSTORE IMMUTABLE STRICT LANGUAGE sql AS $$
SELECT hstore(array_agg(key), array_agg(value)) FROM jsonb_each_text(j)
$$;
CREATE CAST (JSONB AS HSTORE) WITH FUNCTION jsonb_to_hstore AS IMPLICIT;
But it doesn't seem to be used, and I still get this error when including the HSTORE column in t(...):
ERROR: Syntax error near '"' at position 11
Why is my cast not used by AS? Where can I find out more about the usage of the record type? jsonb_to_record documentation mentions that:
As with all functions returning record, the caller must explicitly define the structure of the record with an AS clause.
But I can't find anything more about this.

First values in an auto increment trigger

I am working with the following table in PostgreSQL 10.3:
CREATE TABLE s_etpta.tab1 (
Number VARCHAR(40) NOT NULL,
id VARCHAR(8),
CONSTRAINT i_tab1 PRIMARY KEY(Number)
)
I need to increment the column id by 1 with every insert. I can't alter the table because I'm not the owner, so I have no choice but to increment a varchar column.
The values are of type varchar, left-padded with zeros. How can I specify that I want to start with '00000001' when the table is empty? When the table already has rows, the trigger takes the last value and increments it for the next insert, which is correct; but when the table is empty, the id column stays empty because the trigger has no value to increment.
CREATE OR REPLACE FUNCTION schema."Num" (
)
RETURNS trigger AS
$body$
DECLARE
BEGIN
NEW.id := lpad(CAST(CAST(max (id) AS INTEGER)+1 as varchar),8, '0') from
schema.tab1;
return NEW;
END;
$body$
LANGUAGE 'plpgsql'
VOLATILE
RETURNS NULL ON NULL INPUT
SECURITY INVOKER
COST 100;
Such a trigger design is unsafe, expensive trickery that can easily fail under concurrent write load. Don't use a trigger. Use a serial or IDENTITY column instead:
Auto increment table column
Don't use text (or varchar) for a numeric value.
Don't pad with leading zeros. You can format numbers any way you like for display with to_char():
How to auto increment id with a character
In Postgres 10 or later your table could look like this:
CREATE TABLE s_etpta.tab1 (
number numeric NOT NULL PRIMARY KEY, -- not VARCHAR(40)
id bigint GENERATED ALWAYS AS IDENTITY -- or just int?
);
No trigger.
Seems odd that number is the PK. Would seem like id should be. Maybe you do not need the id column in the table at all?
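With an IDENTITY column there is nothing to trigger: inserts simply omit id and Postgres fills it in. A minimal sketch against the table above:

```sql
INSERT INTO s_etpta.tab1 (number) VALUES (1);
-- id is generated automatically; to display it zero-padded:
SELECT to_char(id, 'FM00000000') AS padded_id FROM s_etpta.tab1;
```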
Gap-less sequence where multiple transactions with multiple tables are involved
If you need to get the underlying sequence in sync:
How to reset postgres' primary key sequence when it falls out of sync?
Postgres manually alter sequence
If you cannot fix your table, this trigger function works with the existing one (unreliably under concurrent write load):
CREATE OR REPLACE FUNCTION schema.tab1_number_inc()
RETURNS trigger AS
$func$
DECLARE
BEGIN
SELECT to_char(COALESCE(max(id)::int + 1, 1), 'FM00000000')
FROM schema.tab1
INTO NEW.id;
RETURN NEW;
END
$func$ LANGUAGE plpgsql;
Trigger:
CREATE TRIGGER tab1_before_insert
BEFORE INSERT ON schema.tab1
FOR EACH ROW EXECUTE PROCEDURE schema.tab1_number_inc();
The FM modifier removes leading blanks from to_char() output:
Remove blank-padding from to_char() output
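For illustration, the difference the FM modifier makes:

```sql
SELECT to_char(42, '00000000');    -- ' 00000042' (leading blank reserved for a sign)
SELECT to_char(42, 'FM00000000');  -- '00000042'
```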

RavenDB Sql Replication and Postgres uuid

I have set up Sql Replication using Postgres/Npgsql.
We are using Guids for ids in Ravendb.
Everything is working fine as long as my id column in Postgres is of type varchar, but if I set it to uuid, which should be the correct type to match Guid, it fails.
It also fails for other columns than id.
Postgres log gives me:
operator does not exist: uuid = text at character 34
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Postgres schema looks like this:
CREATE TABLE public.materiels
(
id uuid NOT NULL,
type character varying(50),
nummer integer,
...
CONSTRAINT materiels_pkey PRIMARY KEY (id)
)
Replacing first line with
id character varying(50) NOT NULL
will make it work.
If I set the replication up to use MSSql it works using MSSql's uniqueidentifier data type.
If you want to compare UUID with TEXT, then you need to create operators for that. The one solving your error would look like this:
CREATE FUNCTION uuid_equal_text(uuid, text)
RETURNS boolean
LANGUAGE SQL IMMUTABLE
AS
$body$
SELECT $1 = $2::uuid
$body$;
CREATE OPERATOR =(
PROCEDURE = uuid_equal_text,
LEFTARG = uuid,
RIGHTARG = text);
EDIT: An alternative solution, suggested by the author of the question himself:
CREATE CAST (text AS uuid)
WITH INOUT
AS IMPLICIT;
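Either way, a text value should then be accepted where uuid is expected. A quick sketch (the UUID literal is just a placeholder):

```sql
-- Fails without the operator or implicit cast, succeeds with either in place:
SELECT *
FROM   public.materiels
WHERE  id = '00000000-0000-0000-0000-000000000000'::text;
```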

Altering a Postgres integer column to type boolean

I've recently been optimizing some of our Postgres tables by converting more complex data types to simpler ones where possible. In every case except one so far, this has been fairly straightforward, for instance:
ALTER TABLE products ALTER COLUMN price TYPE integer USING price::integer;
For converting text into custom enumerated data types, this has also been simple enough. I just wrote a PLPGSQL function that would convert text to the enum, then converted the column like so:
ALTER TABLE products ALTER COLUMN color TYPE color_enum USING text_to_color_enum(color);
This syntax fails though, in cases where I have to convert an integer to a boolean. These all fail:
ALTER TABLE products ALTER return_policy TYPE boolean USING return_policy > 0;
ALTER TABLE products ALTER return_policy TYPE boolean USING bool(return_policy);
ALTER TABLE products ALTER COLUMN return_policy TYPE boolean USING bool(return_policy);
ALTER TABLE products ALTER COLUMN return_policy TYPE boolean USING CASE WHEN return_policy <> 0 THEN TRUE ELSE FALSE END;
The error message is always the same:
ERROR: operator does not exist: boolean = integer
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
There are no null values in that column. All values are either zero or positive. SELECT pg_typeof(return_policy) FROM products LIMIT 1; returns integer. Creating a custom cast from integer to boolean fails, because apparently one already exists. The same thing happens in Postgres 9.4 and 9.5. What am I doing wrong here?
Check whether the column has a constraint (such as a DEFAULT); if so, drop it before changing the type, then restore it afterwards:
ALTER TABLE products ALTER COLUMN price DROP DEFAULT;
ALTER TABLE products ALTER price TYPE bool USING CASE WHEN price = 0 THEN FALSE ELSE TRUE END;
ALTER TABLE products ALTER COLUMN price SET DEFAULT FALSE;
One of my partial indexes had the condition WHERE return_policy = 30. (The number was meant to be the length of the return policy in days, but since everything now gets either no return policy or a 30-day one, an int no longer makes sense.) Dropping the index allowed my original SQL code to run correctly.
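A sketch of that workaround, assuming a hypothetical index name (the original name is not given):

```sql
-- The partial index predicate references the integer column, blocking the conversion
DROP INDEX products_return_policy_idx;  -- hypothetical name
ALTER TABLE products ALTER return_policy TYPE boolean USING return_policy > 0;
-- Recreate the index with a boolean predicate if still needed
CREATE INDEX products_return_policy_idx ON products (return_policy) WHERE return_policy;
```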

How to change column datatype from character to numeric in PostgreSQL 8.4

I am using following query:
ALTER TABLE presales ALTER COLUMN code TYPE numeric(10,0);
to change the datatype of a column from character(20) to numeric(10,0) but I am getting the error:
column "code" cannot be cast to type numeric
You can try using USING:
The optional USING clause specifies how to compute the new column value from the old; if omitted, the default conversion is the same as an assignment cast from old data type to new. A USING clause must be provided if there is no implicit or assignment cast from old to new type.
So this might work (depending on your data):
alter table presales alter column code type numeric(10,0) using code::numeric;
-- Or if you prefer standard casting...
alter table presales alter column code type numeric(10,0) using cast(code as numeric);
This will fail if you have anything in code that cannot be cast to numeric; if the USING fails, you'll have to clean up the non-numeric data by hand before changing the column type.
If your VARCHAR column contains empty strings (which, as you may recall, are not the same as NULL in PostgreSQL), you will have to use something along the lines of the following to set a default:
ALTER TABLE presales ALTER COLUMN code TYPE NUMERIC(10,0)
USING COALESCE(NULLIF(code, '')::NUMERIC, 0);
(found with the help of this answer)
Step 1: Add a new column with integer or numeric type, as required.
Step 2: Populate data from the varchar column into the numeric column.
Step 3: Drop the varchar column.
Step 4: Rename the new numeric column to the old varchar column's name.
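The four steps might be sketched as follows (the column name code_num is a placeholder):

```sql
ALTER TABLE presales ADD COLUMN code_num numeric(10,0);          -- Step 1
UPDATE presales SET code_num = NULLIF(trim(code), '')::numeric;  -- Step 2
ALTER TABLE presales DROP COLUMN code;                           -- Step 3
ALTER TABLE presales RENAME COLUMN code_num TO code;             -- Step 4
```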