As a developer, I have a books table in Postgres with bookId, title, and description, and a book_tracking table in ClickHouse with bookId and viewCount.
Now I want to pull viewCount from ClickHouse, combine it with title and description, and expose the result as a materialized table in Postgres. How can I do that?
example:
POSTGRES TABLE      CLICKHOUSE TABLE     EXPECTED POSTGRES TABLE
books               books_tracking       mv_books
------------------  -------------------  -----------------------------
bookId              bookId               bookId
title               view_count           title
description                              description
                                         view_count (from ClickHouse)
The first solution is to use a Foreign Data Wrapper (FDW), but that looks unstable and hard to set up.
The second solution is to create a regular table and refresh it with a cron job: truncate it, then refill it with data from ClickHouse every 5 minutes (this is the approach I'm currently using; see the sketch below).
Is there a better solution?
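For reference, a minimal sketch of the second approach, assuming the cron job has first exported the ClickHouse counts into a Postgres staging table named book_tracking_staging (a name used here only for illustration):

-- Refresh run by the cron job every 5 minutes.
-- book_tracking_staging is assumed to have been filled from ClickHouse
-- beforehand (e.g. a clickhouse-client export piped into COPY).
BEGIN;
TRUNCATE mv_books;
INSERT INTO mv_books (bookId, title, description, view_count)
SELECT b.bookId,
       b.title,
       b.description,
       COALESCE(t.view_count, 0)
FROM books b
LEFT JOIN book_tracking_staging t ON t.bookId = b.bookId;
COMMIT;

Since TRUNCATE is transactional in Postgres, readers see either the old snapshot or the new one, never an empty table.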
Related
Following a blog post by Rob Conery, I have a set of unique IDs across the tables of my Postgres DB.
Now, using these unique IDs, is there a way to query a row in the DB without knowing which table it is in? Or can those tables be indexed such that if the row is not in the current table, I just increase the index and query the next table?
In short: if you did not prepare for this, then no. You can prepare for it by generating your own UUIDs; please look here. For instance, Postgres has UUIDs that preserve order. Also, UUID v5 has something like namespaces, so you can build a hierarchy. However, that is done by hashing the namespace, and I don't know of a tool to do the opposite inside Postgres.
If you know all possible tables in advance, you could prepare a query that simply UNIONs a search with a tagged type over all tables. For two tables named comments and news you could do something like:
PREPARE type_of_id(uuid) AS
SELECT id, 'comments' AS type
FROM comments
WHERE id = $1
UNION
SELECT id, 'news' AS type
FROM news
WHERE id = $1;
EXECUTE type_of_id('8ecf6bb1-02d1-4c04-8875-f1da62b7f720');
Automatically generating this could probably be done by querying pg_catalog.pg_tables and generating the relevant query on the fly.
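As a rough illustration of that idea, assuming every table of interest lives in the public schema and has a uuid column named id (an assumption, not something the question guarantees), the query text could be assembled like this:

-- Build the UNION query text from the catalog.
-- Assumes every table in the public schema has a uuid column named id.
SELECT string_agg(
         format('SELECT id, %L AS type FROM %I.%I WHERE id = $1',
                tablename, schemaname, tablename),
         ' UNION ALL ')
FROM pg_catalog.pg_tables
WHERE schemaname = 'public';

The resulting text can then be fed to PREPARE as in the example above.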
Is there any way to create a PostGIS table with an existing style from the layer_styles table? Say, for example, I have many styles stored in the layer_styles table. I need to assign one of those styles to the new table I am going to create. Can that be done in PostgreSQL during table creation, using an SQL command?
You need to identify the layer IDs of interest (or the name, or any column you want) and create the new table using this data and structure. However, using the style in this secondary table may not be that easy:
create table public.layer_styles_2 as
(select * from public.layer_styles where id in (2,3));
I have a table in PostgreSQL represented as the following Go struct:
type AppLog struct {
	ID         int       // set to auto increment in DB, also a primary key
	Event      string    // mapped to the "event" column
	CreateTime time.Time // mapped to "create_time"; used as the partition key
}
I configured monthly table partitioning with the above as the base table and an insert trigger that routes each row into the child table for the current month, using the CreateTime value as the partition key.
[the trigger function etc are omitted for brevity]
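For context, this is roughly the shape such a routing trigger usually takes; it is a hypothetical sketch with assumed table and column names (app_logs, app_logs_YYYY_MM, create_time), not the asker's actual function:

-- Hypothetical month-routing trigger; names are assumptions, not the asker's code.
CREATE OR REPLACE FUNCTION app_logs_insert_trigger() RETURNS trigger AS $$
BEGIN
    -- Route the row into the child table for the month of create_time.
    EXECUTE format('INSERT INTO %I SELECT ($1).*',
                   'app_logs_' || to_char(NEW.create_time, 'YYYY_MM'))
    USING NEW;
    -- Returning NULL skips the insert into the parent table, which is also
    -- why a plain INSERT ... RETURNING on the parent can come back empty.
    RETURN NULL;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER app_logs_partition_insert
    BEFORE INSERT ON app_logs
    FOR EACH ROW EXECUTE PROCEDURE app_logs_insert_trigger();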
When I try to insert into the AppLog table, Postgres routes the operation to the appropriate child table, say AppLog_2017-05 (the table for the current month), but the insert fails with the following error:
INSERT INTO "app_logs" ("event", "create_time")
VALUES ('device /dev/sdc is now ready','2017-05-26T15:04:30+05:30')
RETURNING "app_logs"."id"
ERROR: sql: no rows in result set
When the same query is run in the Postgres Shell, it runs fine.
Can someone help me understand how to do inserts with GORM in PostgreSQL when table partitioning is configured behind the scenes? I am not sure whether this is a problem with GORM, the Go PostgreSQL driver, or the database/sql package, or if I am missing something.
Any help will be highly appreciated.
I found the answer. It was not an issue with GORM or go-pq. I had made a mistake in my trigger function when returning the inserted values from the child table.
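As a side note (an addition, not part of the original answer): if the routing trigger returns NULL so the parent row is skipped, INSERT ... RETURNING on the parent yields no rows, which is exactly the "sql: no rows in result set" symptom. One possible workaround at the SQL level, assuming the id column is backed by a sequence named app_logs_id_seq (an assumed name), is to read the sequence value back in the same session:

-- The default on id has already called nextval() by the time the trigger fires,
-- so currval() in the same session returns the id that was just assigned.
INSERT INTO app_logs (event, create_time)
VALUES ('device /dev/sdc is now ready', '2017-05-26T15:04:30+05:30');

SELECT currval('app_logs_id_seq') AS id;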
I am trying to write an insert query into a backup database. I am writing the places and entities tables into this database. The issue is that entities is linked to places via the places.id column. I added a column original_id to the places table to store its original id, so now that I have loaded places into the new database its id column has changed, but I still have the original id stored and can link the entities table to it. I am trying to figure out how to write entities so it picks up the new id.
So far I am at this point:
insert into entities_backup (id, place_id)
select
nextval('public.entities_backup_id_seq'),
(select id from places where original_id = (select place_id from entities) as place_id
from
entities
I know I am missing something because this does not work. I need to grab the id column from places where entities.place_id = places.original_id. Any help would be great.
I think this is what you want
insert into entities_backup (id, place_id)
select nextval('public.entities_backup_id_seq'), places.id
from entities
join places on places.original_id = entities.place_id;
It would be simpler to not have this problem in the first place.
Rather than trying to fix this up after the fact, the better solution is to dump and load places and entities complete with their primary and foreign keys intact. Oracle's EXPORT or a utility such as ora2pg should be able to do it.
Sorry I can't say more. I know Postgres, not Oracle.
After I've migrated a table from another database I cannot see the data in the Postgres table. With \d I can see the schemas and tables, but with a SELECT using schema.table notation I cannot see the data.
The table is in the public schema; I try:
select * from public.ACCOUNTS;
select * from ACCOUNTS;
As I am new to Postgres, excuse my simple questions; I'm sure I'm overlooking the obvious.
BTW, when I tried the 'first steps' guide on another page, where a schema, user and table are created, I did not have this problem.
I have faced this problem when using sequelize-cli migrations. For some reason you have to query the data from the public schema with a quoted table name. Try this:
select * from public."Accounts";
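For context (an addition, not part of the original answer): Postgres folds unquoted identifiers to lower case, so when a migration tool creates a table with a mixed-case name like "Accounts", the name has to be double-quoted in queries. The same rule likely explains the ACCOUNTS case above:

-- Unquoted identifiers are folded to lower case, so this actually looks for
-- a table named "accounts" and fails if the table was created as "Accounts":
SELECT * FROM public.Accounts;

-- Quoting preserves the exact case used when the table was created:
SELECT * FROM public."Accounts";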