How to convert JSONB[] to JSONB? - postgresql

I have this table with existing data:
CREATE TABLE table (
col JSONB[]
);
How can I convert col to JSONB now without dropping the column?
I tried:
ALTER TABLE table
ALTER COLUMN col TYPE JSONB USING col::jsonb
But it says cannot cast type jsonb[] to jsonb

This did the trick:
ALTER TABLE table
ALTER COLUMN col TYPE JSONB USING to_jsonb(col);
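For reference, a quick check of what to_jsonb does with a jsonb[] value (the array literal here is made-up sample data):
SELECT to_jsonb(ARRAY['{"a": 1}'::jsonb, '{"b": 2}'::jsonb]);
-- returns a single jsonb value: [{"a": 1}, {"b": 2}]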

Related

alter varchar field to super returns "ERROR: target data type "super" is not supported"

I have a varchar field in super/json format:
select detail
from mytable
limit 1;
{"child_category":"organize","gallery_id":"123456","detail":"[\"789876"]"}
I know that detail is currently varchar because inspecting the table in my client shows it as varchar(32768)
I want to alter this field to be super:
ALTER TABLE mytable ALTER COLUMN detail TYPE super;
Returns:
[0A000] ERROR: target data type "super" is not supported
How can I cast the detail field as a super field?
According to the docs at https://docs.aws.amazon.com/redshift/latest/dg/r_ALTER_TABLE.html, ALTER COLUMN can only change the size of a varchar column, so you cannot alter it to super directly.
As a workaround, you can add a new temporary super column, populate it, drop the original column, and rename the temporary one:
alter table mytable add column temp_super super;
update mytable set temp_super = json_parse(detail);
alter table mytable drop column detail;
alter table mytable rename column temp_super to detail;
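After the swap, a quick way to confirm the conversion is to navigate the new super column with dot notation (assuming, as in the sample row above, the values are JSON objects; the alias t is just for the query):
-- hypothetical check: dot notation only works once detail is super
select t.detail.child_category, t.detail.gallery_id
from mytable t
limit 1;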

How to change a default separator for postgresql arrays?

I want to import csv with Postgres' arrays into a Postgres table.
This is my table:
create table dbo.countries (
id char(2) primary key,
name text not null,
elements text[],
CONSTRAINT const_dbo_countries_unique1 unique (id),
CONSTRAINT const_dbo_countries_unique2 unique (name)
);
and I want to insert into that a csv which looks like this:
AC,ac,{xx yy}
When I run copy dbo.mytable FROM '/home/file.csv' delimiter ',' csv; the array is read as one string: {"xx yy"}.
How do I change the default separator for arrays from , to a space?
You cannot change the array's separator symbol. You can load the data into the table as-is, and then run an update on it:
UPDATE dbo.countries
SET elements = string_to_array(elements[1], ' ')
WHERE strpos(elements[1], ' ') > 0;
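For illustration, this is what string_to_array does to the value that COPY produced (the literal is just the sample row's element):
SELECT string_to_array('xx yy', ' ');
-- returns the two-element array: {xx,yy}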

Alter column default value is not working in postgres

I tried to update a column's default value with the queries below in Postgres, but it doesn't seem to work. Maybe I am missing something. Could you help?
ALTER TABLE tableName ADD COLUMN newColumn INTEGER DEFAULT 0;
ALTER TABLE tableName ALTER COLUMN newColumn DROP DEFAULT;
or
ALTER TABLE tableName ALTER COLUMN newColumn SET DEFAULT NULL;
SELECT * FROM tableName;
Here I still find 0.
The change only applies to new records. After the modification you have to backfill the existing rows with an update like this:
UPDATE tableName SET newColumn = NULL WHERE newColumn = 0;
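You can also confirm that the default itself is gone by checking the catalog (unquoted identifiers are stored lower-cased, hence the lower-case names here):
SELECT column_default
FROM information_schema.columns
WHERE table_name = 'tablename' AND column_name = 'newcolumn';
-- column_default is NULL once the default has been dropped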

Unable to change field type to not null with default value

I have a simple SQL statement, which looks like so:
alter table my_table alter column my_field set data type numeric(12,4) not null default 0;
But I get a syntax error that points at "not". What is wrong with that?
ALTER COLUMN ... TYPE does not accept NOT NULL or DEFAULT in the same clause, so use separate ALTER COLUMN clauses for the type, the default value, and the null behavior:
ALTER TABLE my_table
ALTER COLUMN my_field TYPE numeric(12,4),
ALTER COLUMN my_field SET DEFAULT 0,
ALTER COLUMN my_field SET NOT NULL;
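One caveat: SET NOT NULL fails if the column still contains NULLs, so backfill them first (using 0 here only because it matches the new default):
-- replace existing NULLs before adding the constraint
UPDATE my_table SET my_field = 0 WHERE my_field IS NULL;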

Change column type and set not null

How do you change the column type and also set that column to not null together?
I am trying:
ALTER TABLE mytable ALTER COLUMN col TYPE character varying(15) SET NOT NULL
This returns an error.
What is the right syntax?
This should be correct:
ALTER TABLE mytable
ALTER COLUMN col TYPE character varying(15),
ALTER COLUMN col SET NOT NULL
Also, if you want to REMOVE a NOT NULL constraint in PostgreSQL:
ALTER TABLE mytable
ALTER COLUMN email DROP NOT NULL;
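Note that the TYPE change itself will fail if any existing value is longer than 15 characters; a USING clause can transform such values first. A sketch, assuming col is currently a text type and truncation is acceptable:
ALTER TABLE mytable
ALTER COLUMN col TYPE character varying(15) USING left(col, 15),
ALTER COLUMN col SET NOT NULL;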