Turning cells from a row into a column and marking them as a primary key? (PostgreSQL)

So my table currently looks like this:
Tower_ID|...|Insulator_ID_01|Insulator_ID_02|...|Insulator_ID_12|
Tower_01|...|01_Unique_ID_01|01_Unique_ID_02|...|01_Unique_ID_12|
Tower_02|...|02_Unique_ID_01|02_Unique_ID_02|...|02_Unique_ID_12|
The idea is to have a single table for every insulator that belongs to a tower on this specific line (the towers on this line are one table). But the only way I know how is to have a separate table for each column of insulators. Is it possible to create a single table, with relationships, that stores Insulator_ID_01 through Insulator_ID_12 in one column before moving on to the next row and doing the same?
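One normalized layout that would do this — a sketch only, since all table and column names below are guesses from the question (a towers table called towers_in_line keyed by tower_id is assumed) — is a single insulator table with one row per insulator and a position column recording which of the 12 slots it occupies:

```sql
-- Sketch: table and column names are assumptions from the question.
CREATE TABLE insulator (
    insulator_id text PRIMARY KEY,
    tower_id     text NOT NULL REFERENCES towers_in_line (tower_id),
    position     int  NOT NULL CHECK (position BETWEEN 1 AND 12),
    UNIQUE (tower_id, position)   -- one insulator per slot per tower
);

-- Unpivot the 12 wide columns into rows, keeping the slot number:
INSERT INTO insulator (insulator_id, tower_id, position)
SELECT u.ins_id, t.tower_id, u.pos
FROM towers_in_line AS t,
     LATERAL unnest(ARRAY[t.insulator_id_01, t.insulator_id_02,
                          t.insulator_id_12])   -- list all 12 columns, in order
             WITH ORDINALITY AS u(ins_id, pos);
```

After the migration, the per-tower lookup becomes `SELECT * FROM insulator WHERE tower_id = 'Tower_01'` instead of reading 12 separate columns.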

Related

How to select database columns using list of integers in Anylogic?

I have imported a large database into AnyLogic, with various columns. Rows can be selected using the unique primary key in the database table. Similarly, how can I move through columns using integer indexes?
The attached picture shows the selection query for the circled cell; to get to another cell I need to change the column name in the query again, which is surely not efficient.
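One hedged option: AnyLogic's built-in database (HSQLDB) exposes an information_schema catalog much like PostgreSQL's, so a column's name can be looked up by its integer position and then substituted into the data query. The table name below is a placeholder:

```sql
-- Sketch: look up the column name at a given integer position
-- ('my_table' is a placeholder for the real table name).
SELECT column_name
FROM information_schema.columns
WHERE table_name = 'my_table'
  AND ordinal_position = 3;   -- the integer column index, 1-based
```

The returned name can then be spliced into the cell-selection query in a loop, rather than editing the query by hand for each column.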

Is it possible, when I save data in a PostgreSQL table, to automatically save the record's id in another table without a trigger?

I have 2 tables. The first table is called Person. The second table is called PersonRule. The Person table has many columns, but PersonRule has just 2 columns. The Person table has a column called ruleid, which also exists in the PersonRule table. When I insert data into the Person table, is it possible to automatically create a record in the PersonRule table without a trigger?
And how can I do this in PostgreSQL?
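One trigger-free option is a single statement with a data-modifying CTE: the first INSERT returns the ruleid, which feeds the second INSERT. The column names below are assumptions based on the question:

```sql
-- Sketch: 'name' and the literal values are hypothetical.
WITH new_person AS (
    INSERT INTO person (name, ruleid)
    VALUES ('Alice', 42)
    RETURNING ruleid
)
INSERT INTO personrule (ruleid)
SELECT ruleid FROM new_person;
```

Both inserts run as one statement, so they succeed or fail together; the application only has to issue the combined query.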

How to copy a Specific Partitioned Table

I would like to shift data from a specific partitioned table of a parent to a separate table. Can someone suggest a better way?
If I create a table
CREATE TABLE parent (a ..., b ..., c date) PARTITION BY RANGE (c);
CREATE TABLE dec_partition PARTITION OF parent FOR VALUES FROM ('2021-02-12') TO ('2021-03-12');
Now I want to copy the table dec_partition to a separate table with a single command, in the least time.
NOTE: The table has around 4 million rows with 20 columns (one of the columns is jsonb).
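One single-command sketch (the target table name is made up): `CREATE TABLE ... AS TABLE` copies all rows of one table into a new one.

```sql
-- Copies rows and column definitions, but not indexes or constraints.
CREATE TABLE dec_partition_copy AS
TABLE dec_partition;
```

If indexes and constraints are needed too, the usual two-step alternative is `CREATE TABLE dec_partition_copy (LIKE dec_partition INCLUDING ALL);` followed by `INSERT INTO dec_partition_copy SELECT * FROM dec_partition;`.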

Copy column of a row to another row where condition and then delete it

So I have a table where I'm trying to get rid of some rows.
All these rows contain a letter where they should only contain a numeric value.
Example: [screenshot of the columns]
So I pretty much want to copy the grade_percent value of row 1 to class_rank of row 2 and then delete row 1.
The thing is that I have about 9k rows and there are different marking_period_ids.
I was thinking of doing something like
UPDATE table SET class_rank=(SELECT exam from table WHERE marking_period_id)
but that's where I get lost, as I have no idea how to make this repeat across all the rows straight from a PostgreSQL query.
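A hedged sketch of one approach, assuming the table is called grades and the rows to remove are the ones whose class_rank contains a letter (the table name and the matching rule are guesses from the question): an `UPDATE ... FROM` self-join copies the value across the pair of rows sharing a marking_period_id, and a DELETE then removes the bad rows.

```sql
-- Sketch; table name and matching rule are assumptions from the question.
BEGIN;

-- Copy grade_percent from the row being removed onto the row that stays,
-- matching rows by marking_period_id.
UPDATE grades AS keep
SET    class_rank = doomed.grade_percent
FROM   grades AS doomed
WHERE  doomed.marking_period_id = keep.marking_period_id
  AND  doomed.class_rank ~ '[A-Za-z]'    -- row that contains a letter
  AND  keep.class_rank  !~ '[A-Za-z]';   -- row that stays

-- Then delete the rows that contained letters.
DELETE FROM grades WHERE class_rank ~ '[A-Za-z]';

COMMIT;
```

Wrapping both statements in one transaction means the copy and the delete either both happen or neither does.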

Query time after partition in postgres

I have a table location in a Postgres database with more than 50,000,000 rows, and I decided to partition it!
The parent table has columns id, place, and I want to partition on the place column. With PHP and SQL I get all distinct places (about 300), and for each one:
CREATE TABLE child_place_name (CHECK (place = 'place_name')) INHERITS (location);
and after that, into each child table:
INSERT INTO child_place_name SELECT * FROM location WHERE place = 'place_name';
and that works perfectly!
After that I deleted the rows from the parent table with
DELETE FROM location WHERE tableoid = ('location'::regclass)::oid;
and that affected all the rows stored in the parent table itself!
Then I ran my queries and measured the times, and realized that queries now take 3 or more times longer than before.
I also have problems that may affect speed: first, I can't set a primary key on the id column in the child tables, but I did set an index on place (an index is also set on the place column in the parent table); and I can't set a unique key on the id and place columns together; I got an error like "multiple parameters not allowed" (or something like that).
All I want is the select side of the table; I don't need rules or triggers for inserts into the parent table, because that is another problem. I only want to know what is wrong with this approach! Maybe 300+ tables is too much?
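For comparison, here is a sketch of the same layout using declarative LIST partitioning (available since PostgreSQL 10), which prunes partitions automatically without CHECK constraints, rules, or triggers. Column types and the sample place value are assumptions:

```sql
CREATE TABLE location (
    id    bigint NOT NULL,
    place text   NOT NULL
) PARTITION BY LIST (place);

-- One partition per distinct place:
CREATE TABLE location_london PARTITION OF location
    FOR VALUES IN ('London');

-- A filter on the partition key scans only the matching partition:
-- SELECT * FROM location WHERE place = 'London';
```

With the older INHERITS approach, child tables are skipped only when constraint_exclusion is enabled and the query's WHERE clause matches the CHECK constraints; otherwise every query scans all 300+ children, which would explain the slowdown. Note that with declarative partitioning a primary key must include the partition key, e.g. `PRIMARY KEY (id, place)`.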