Transforming a Postgres table

I have a data table like the one above on the left. It is only a snippet, so there are many more unique IDs.
I would like to write Postgres code to transform the table into something like the one on the right.

Related

Appending the table inside a PSQL schema

My goal is to append all the tables in the schema. Is there a way to loop through the tables in one specific schema and append them to each other to create one bigger table? (Here, all my tables have the same columns.)
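One possible way to do this is a PL/pgSQL loop over the catalog. The sketch below assumes a schema named my_schema and an already-created target table my_schema.combined_table with the same columns as the source tables; both names are placeholders, not from the question:

-- Sketch: append every table in my_schema into my_schema.combined_table.
-- my_schema and combined_table are placeholder names.
DO $$
DECLARE
    tbl text;
BEGIN
    FOR tbl IN
        SELECT tablename
        FROM pg_tables
        WHERE schemaname = 'my_schema'
          AND tablename <> 'combined_table'
    LOOP
        EXECUTE format('INSERT INTO my_schema.combined_table SELECT * FROM my_schema.%I', tbl);
    END LOOP;
END;
$$;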

Cloning a Postgres table, including indexes and data

I am trying to create a clone of a Postgres table using plpgsql.
To date I have been simply truncating table 2 and re-inserting data from table 1.
TRUNCATE TABLE "dbPlan"."tb_plan_next";
INSERT INTO "dbPlan"."tb_plan_next" SELECT * FROM "dbPlan"."tb_plan";
As code this works as expected; however, "dbPlan"."tb_plan" contains around 3 million records, so the copy takes around 20 minutes. This is too long and has a knock-on effect on other processes.
It's important that all constraints, indexes and data are copied exactly to table 2.
I had tried dropping the table and re-creating it, but this did not improve the speed.
DROP TABLE IF EXISTS "dbPlan"."tb_plan_next";
CREATE TABLE "dbPlan"."tb_plan_next" (LIKE "dbPlan"."tb_plan" INCLUDING ALL);
INSERT INTO "dbPlan"."tb_plan_next" SELECT * FROM "dbPlan"."tb_plan";
Is there a better method for achieving this?
I am considering creating the table and then creating indexes as a second step.
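In outline, that two-step idea might look like the sketch below (the final index is a placeholder; the real ones would have to match those defined on "dbPlan"."tb_plan"):

-- Sketch of the two-step approach: copy the structure without indexes,
-- load the data, then build the indexes afterwards.
DROP TABLE IF EXISTS "dbPlan"."tb_plan_next";
CREATE TABLE "dbPlan"."tb_plan_next"
    (LIKE "dbPlan"."tb_plan" INCLUDING DEFAULTS INCLUDING CONSTRAINTS);
INSERT INTO "dbPlan"."tb_plan_next" SELECT * FROM "dbPlan"."tb_plan";
CREATE INDEX ON "dbPlan"."tb_plan_next" (some_column);  -- placeholder index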
PostgreSQL doesn't provide a very elegant way of doing this. You could use pg_dump with -t and --section= to dump the pre-data and post-data for the table. Then you would replay the pre-data to create the table structure and the check constraints, load the data from wherever you get it from, and finally replay the post-data to add the indexes and FK constraints.
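Concretely, that round trip might look roughly like this (a sketch; mydb is a placeholder database name, and the dumped DDL still refers to tb_plan, so it would need editing if the copy is to be called tb_plan_next):

# Dump structure (pre-data) and indexes/FKs (post-data) for the one table.
pg_dump -t '"dbPlan".tb_plan' --section=pre-data  -f pre.sql  mydb
pg_dump -t '"dbPlan".tb_plan' --section=post-data -f post.sql mydb

psql -f pre.sql mydb     # create the table structure and CHECK constraints
# ... load the data here ...
psql -f post.sql mydb    # add the indexes and FK constraints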

Is there any benefit to using one jsonb column on a table rather than bunch of them for better organization?

Is there any benefit to using one big jsonb column in a PostgreSQL table, e.g. a single data column with top-level keys of address and fee, versus several separate columns such as address_data and fee_data?
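For illustration, the two layouts being compared might look like this (a sketch; the table names are made up, while the column and key names come from the question):

-- Option 1: one jsonb column holding everything
CREATE TABLE orders_single (
    id   bigint PRIMARY KEY,
    data jsonb              -- e.g. {"address": {...}, "fee": {...}}
);

-- Option 2: a separate jsonb column per concern
CREATE TABLE orders_split (
    id           bigint PRIMARY KEY,
    address_data jsonb,
    fee_data     jsonb
);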

Handling the output of jsonb_populate_record

I'm a real beginner when it comes to SQL and I'm currently trying to build a database using Postgres. I have a lot of data in JSON files that I want to put into my database, but I am having trouble converting it into tables. The JSON is nested and contains many variables, but the behavior of jsonb_populate_record allows me to ignore the structure I don't want to deal with right now. So far I have:
CREATE TABLE raw (records JSONB);
COPY raw from 'home/myuser/mydocuments/mydata/data.txt';
create type jsonb_type as (time text, id numeric);
create table test as (
    select jsonb_populate_record(null::jsonb_type, raw.records) from raw
);
When running the select statement only (without the create table) the data looks great in the GUI I use (DBeaver). However it does not seem to be an actual table as I cannot run select statements like
select time from test;
or similar. The column in my table 'test' is also called 'jsonb_populate_record(jsonb_type)' in the GUI, so something seems to be going wrong there. I do not know how to fix it. I've read about people using lateral joins with json_populate_record, but due to my limited SQL knowledge I can't understand or replicate what they are doing.
jsonb_populate_record() returns a single column (which is a record).
If you want to get multiple columns, you need to expand the record:
create table test
as
select (jsonb_populate_record(null::jsonb_type, raw.records)).*
from raw;
A "record" is a a data type (that's why you need create type to create one) but one that can contain multiple fields. So if you have a column in a table (or a result) that column in turn contains the fields of that record type. The * then expands the fields in that record.

How to convert multiple tables with same columns into one table in Postgres

I have a Postgres database that contains over 200 tables with the same column names and data types, respectively. I would like to join all of them into one table. How can I achieve this?
I have Postgres 9.4 and pgAdmin setup.
If the tables have identical column names and types, then you can create a parent table and arrange for all of the other tables to inherit from the parent table. After this, queries on the parent table will automatically query all of the child tables.
First create an empty parent table with the same definition as the 200 tables you already have.
Then, use ALTER TABLE on each of the 200 tables to make them inherit from the parent table.
CREATE TABLE myparenttable( LIKE mychildtable1 );
-- Repeat this for each of the child tables
ALTER TABLE mychildtable1 INHERIT myparenttable;
See also: Inheritance in the postgresql manual.
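If repeating the ALTER TABLE 200 times by hand is impractical, the statement can be generated in a loop (a sketch; the schema public and the mychildtable% name pattern are assumptions about how the child tables are named):

-- Sketch: make every matching table inherit from the parent.
DO $$
DECLARE
    child text;
BEGIN
    FOR child IN
        SELECT tablename
        FROM pg_tables
        WHERE schemaname = 'public'
          AND tablename LIKE 'mychildtable%'
    LOOP
        EXECUTE format('ALTER TABLE public.%I INHERIT myparenttable', child);
    END LOOP;
END;
$$;

-- Afterwards, a query on the parent sees the rows of every child:
SELECT count(*) FROM myparenttable;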