Postgres: insert multiple rows into a table with an id from another table, inserting into the other table if the row doesn't exist

I have done a similar task where I can insert a row into a table if the data doesn't exist:

WITH
user_exists AS (
    SELECT id FROM users WHERE username = '%s'
),
user_new AS (
    INSERT INTO users (username)
    SELECT w.username FROM (VALUES ('%s')) w(username)
    WHERE NOT EXISTS
        (SELECT 1 FROM users u WHERE u.username = w.username)
    RETURNING id
)
INSERT INTO feedbacks (static_row, userid)
SELECT
    'static_data',
    (SELECT id FROM user_exists UNION ALL SELECT id FROM user_new) AS userid
The above works well when inserting a new row into the feedbacks table. If the user doesn't exist, it inserts a row into the users table and returns the id, which is then used for the insert into the feedbacks table.
But now my use case is that I have to insert multiple rows into the feedbacks table. Something like this:

user_variable = ['a', 'b', ...]

INSERT INTO feedbacks (static_row, userid)
VALUES
    ('sample_data', (SELECT id FROM users WHERE username = 'a')),
    ('sample_data', (SELECT id FROM users WHERE username = 'b')),
    ('sample_data', (SELECT id FROM users WHERE username = 'c'))

For the above case, how can we insert a new row into the users table if username = 'b' doesn't exist?

Something like this might work for you:

INSERT INTO feedbacks (username)
SELECT DISTINCT uname
FROM (VALUES
    ('sample_data'),
    ('sample_data2'),
    ('sample_data3')
) s(uname)
WHERE NOT EXISTS (SELECT 1 FROM feedbacks WHERE username = uname);
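Putting the pieces together for the original multi-user case: a sketch (assuming the users(id, username) and feedbacks(static_row, userid) columns from the question) that first inserts any missing usernames, then joins the input list against both existing and newly inserted ids. The UNION ALL is needed because rows inserted by a CTE are not visible to reads of users in the same statement:

```sql
WITH input(username) AS (
    VALUES ('a'), ('b'), ('c')
),
new_users AS (
    -- insert only the usernames that do not exist yet
    INSERT INTO users (username)
    SELECT i.username
    FROM input i
    WHERE NOT EXISTS (SELECT 1 FROM users u WHERE u.username = i.username)
    RETURNING id, username
)
INSERT INTO feedbacks (static_row, userid)
SELECT 'sample_data', ids.id
FROM input i
JOIN (SELECT id, username FROM users
      UNION ALL
      SELECT id, username FROM new_users) ids
  ON ids.username = i.username;
```

Under concurrent writers this pattern can still race; on PostgreSQL 9.5+, INSERT ... ON CONFLICT DO NOTHING against a unique username column is the more robust variant.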


Why this SELECT of recently added records doesn't work

I was trying to SELECT records that were just inserted, but it doesn't seem to work.
Example:
create table tt_teste (id bigserial, descricao varchar);

with inseridos as (
    insert into tt_teste (descricao) values ('Test 1') returning id
)
select *
from tt_teste
where id in (select id from inseridos);

I tried to rewrite it another way, but the result is the same:

with inseridos as (
    insert into tt_teste (descricao) values ('Test 2') returning id
)
select *
from inseridos i
join tt_teste t on t.id = i.id;
The result is always empty. Even if I change the WHERE to "where 1=1 or id in (select id from inseridos)", the new records don't show up; they only show up in the next run. I am doing this because I want to SELECT and INSERT into another table with more data coming from a JOIN, but the database can't select the records just inserted. It seems like some kind of concurrency issue.
This is expected behaviour: all sub-statements of a query with a data-modifying CTE see the same snapshot of the tables, so the outer SELECT cannot see the rows the CTE just inserted into tt_teste. Select from the CTE's RETURNING output instead:
with inseridos as (
    insert into tt_teste (descricao) values ('Test 1') returning id
)
select *
from inseridos;

or, if you want some other field you just inserted, then:

with inseridos as (
    insert into tt_teste (descricao) values ('Test 1') returning *
)
select *
from inseridos;
There shouldn't be much more to retrieve, unless you have computed values, triggers filling some fields, or some other automation you didn't mention in your question.
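For the asker's stated end goal (INSERT into another table using the rows just inserted), the RETURNING output can be chained straight into a second data-modifying statement in the same WITH. A sketch, assuming a hypothetical target table tt_log(teste_id, descricao):

```sql
WITH inseridos AS (
    -- the RETURNING * rows become the CTE's output
    INSERT INTO tt_teste (descricao) VALUES ('Test 3') RETURNING *
)
INSERT INTO tt_log (teste_id, descricao)
SELECT i.id, i.descricao
FROM inseridos i;
```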

Generate N Records

Is there a faster way to do something like this? Maybe using generate_series?
INSERT INTO users (email)
VALUES
('user1#domain.com'),
('user2#domain.com'),
('user3#domain.com')
Is it possible to also easily add a relation for a profile?
INSERT INTO profiles (user_id)
SELECT id FROM users WHERE email='user1#domain.com'
INSERT INTO profiles (user_id)
SELECT id FROM users WHERE email='user2#domain.com'
INSERT INTO profiles (user_id)
SELECT id FROM users WHERE email='user3#domain.com'
You may try:
WITH cte AS (
    SELECT generate_series(1, 3, 1) AS num
)
INSERT INTO users (email)
SELECT 'user' || num::text || '#domain.com'
FROM cte;
Modify the second parameter to generate_series if you want to generate more than 3 user emails.
Edit: For your updated requirement:
WITH emails AS (
    SELECT 'user' || generate_series(1, 3) || '#domain.com' AS email
)
INSERT INTO profiles (user_id)
SELECT id FROM users WHERE email IN (SELECT email FROM emails);
INSERT INTO users (email)
SELECT 'user' || generate_series(1, 10) || '#domain.com';

generate_series(1, 10) generates a series of 10 rows, numbered 1 to 10.
As per the updated requirement:

INSERT INTO profiles (user_id)
SELECT id
FROM users
WHERE email IN
    (SELECT 'user' || generate_series(1, 10) || '#domain.com')
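Both steps can also be combined into one statement by chaining the generated ids through RETURNING, which guarantees the profiles rows reference exactly the users just created (a sketch, assuming the users(id, email) and profiles(user_id) columns from the question):

```sql
WITH new_users AS (
    INSERT INTO users (email)
    SELECT 'user' || g || '#domain.com'
    FROM generate_series(1, 3) AS g
    RETURNING id
)
-- the ids of the freshly inserted users feed the second insert
INSERT INTO profiles (user_id)
SELECT id FROM new_users;
```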

Split data from a column and insert to another table in postgresql

I'm trying to split data from the Username column in a table like this:
Table1
ID Username
1 UserA,UserB,UserC
and I want to insert it to another table. the result will be like this:
Table2
ID Username
1 UserA
1 UserB
1 UserC
Is it possible to do this in PostgreSQL?
Thanks in advance.
You can split the value and then unnest it:

insert into table2 (id, username)
select t1.id, ut.username
from table1 t1
cross join unnest(string_to_array(t1.username, ',')) as ut(username);
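Equivalently, the split and the row expansion can be done in a single call with regexp_split_to_table (the pattern here is a plain comma):

```sql
-- each comma-separated element becomes its own row, paired with t1.id
insert into table2 (id, username)
select t1.id, regexp_split_to_table(t1.username, ',')
from table1 t1;
```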

Create new entries for a particular account_id in the same table using postgres

I have a table called account(id, account_id, name, status), already populated with data.
I have to first query the account_id for the row with name 'xyz', and then create new entries for that account_id with names 'kjf' and 'lmn' and status 'fail'.
Can someone help me write a query for this? I tried:
INSERT INTO account (id, account_id, name, status)
SELECT uuid_generate_v4(), account_id, 'kjh', 'fail' FROM account;

This fails with an error because account_id is unique.
With SQL, you can try this:

with
v1 as (select max(id) + 1 as maxid from account),
v2 as (select account_id as newid from account where name = 'xyz')
insert into account (id, account_id, name, status)
select (select maxid from v1), (select newid from v2), 'kjh', 'fail';
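To create both requested rows ('kjf' and 'lmn') in one statement, a sketch assuming ids are generated with uuid_generate_v4() as in the question's own attempt:

```sql
-- one new row per name in the VALUES list, all sharing xyz's account_id
INSERT INTO account (id, account_id, name, status)
SELECT uuid_generate_v4(), a.account_id, n.name, 'fail'
FROM account a
CROSS JOIN (VALUES ('kjf'), ('lmn')) AS n(name)
WHERE a.name = 'xyz';
```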

PostgreSQL DELETE FROM (SELECT * FROM table FETCH FIRST 10 ROWS ONLY)

How do I delete only a few rows in PostgreSQL?
I want to fetch 10 rows to delete in a subquery.
You need to use a WHERE condition as per your requirement, like this:

delete from mytable where id in (1,2,3,4,5,6,7,8,9,10)

or

delete from mytable where id in (select id from mytable where somecondition)

or, if you want to delete the top 10 rows using ctid:

DELETE FROM mytable
WHERE ctid IN (
    SELECT s.ctid
    FROM mytable s
    ORDER BY s.serialId, s.valuetimestamp
    LIMIT 10
)
If you are looking to remove duplicates from your table, then try this:

DELETE FROM mytable
WHERE ctid NOT IN
    (SELECT MAX(s.ctid)
     FROM mytable s
     GROUP BY s.serialId, s.valuetimestamp);
If you have some unique identifier (a serial, let's call it "id") in your table, then just do something like:

DELETE FROM mytable WHERE mytable.id IN (SELECT id FROM mytable WHERE *whatever*)

Optionally add LIMIT 10 to the subquery. Note that PostgreSQL's LIMIT takes a single count; the MySQL-style "LIMIT 0,10" is not valid syntax here.
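As a combined sketch, assuming the table has a serial "id" column: delete the 10 rows with the smallest ids and use RETURNING to inspect what was removed:

```sql
DELETE FROM mytable
WHERE id IN (
    SELECT id FROM mytable
    ORDER BY id      -- oldest rows first, by serial id
    LIMIT 10
)
RETURNING *;         -- shows the deleted rows
```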