My table's metadata is in AWS Glue, with a description and comments on the columns, as shown in the picture below.
I would like to retrieve this metadata through Redshift. Is that possible? Thanks!
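For reference, the sketch below shows where I'd expect this metadata to surface, assuming the Glue database is mapped into Redshift as an external schema via Redshift Spectrum. The schema name, Glue database name, IAM role ARN, and table name are all placeholders, and I'm not certain the Glue descriptions/comments themselves are exposed through these views:

-- Map the Glue catalog database into Redshift (placeholder names and ARN).
CREATE EXTERNAL SCHEMA my_glue_schema
FROM DATA CATALOG
DATABASE 'my_glue_database'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftSpectrumRole';

-- Column metadata for external tables is visible in SVV_EXTERNAL_COLUMNS.
SELECT columnname, external_type
FROM svv_external_columns
WHERE schemaname = 'my_glue_schema'
  AND tablename = 'my_table';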
As a developer, I have a books table in Postgres with bookId, title, and description, and a books_tracking table with bookId and view_count.
Now I want to get view_count from ClickHouse, combine it with title and description, and serve the result as a materialized table in Postgres. How can I do that?
Example:

POSTGRES TABLE    CLICKHOUSE TABLE    EXPECTED POSTGRES TABLE
books             books_tracking      mv_books
-----             --------------      --------
bookId            bookId              bookId
title             view_count          title
description                           description
                                      view_count (from ClickHouse)
The first solution is to set up a Foreign Data Wrapper (FDW), but it looks unstable and is hard to set up.
The second solution is to create a plain table and refresh it from a cron job: truncate it, then refill it with data from ClickHouse every 5 minutes (this is the way I'm using; a sketch follows below).
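For reference, a minimal sketch of the cron-job refresh (the second solution), assuming the ClickHouse table is already visible in Postgres as a foreign table named books_tracking_ft (that name, and the wrapper behind it, are assumptions):

-- Refresh the report table in one transaction. TRUNCATE is transactional
-- in Postgres, so concurrent readers block briefly and then see the new
-- contents rather than an empty table.
BEGIN;
TRUNCATE mv_books;
INSERT INTO mv_books (bookId, title, description, view_count)
SELECT b.bookId, b.title, b.description, t.view_count
FROM books b
JOIN books_tracking_ft t ON t.bookId = b.bookId;
COMMIT;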
Is there any better solution?
Does anyone know the best way to load deltas or CDC, using any tools?
I have a big table with billions of records and want to update or insert, like MERGE in SQL Server or Oracle, but in Amazon Redshift with S3.
Also, we have loads of columns, so we can't compare all of them either.
e.g. TableA with Col1, Col2, Col3, ..., which already has records.
So when inserting new records, I need to check whether that particular record already exists: if it does and is unchanged, skip it; if it doesn't, insert it; and if it has changed, update it. I do have a key id and date columns, but since the table has 200+ columns, checking all of them isn't easy and takes a lot of time.
Many thanks in advance
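For context, the pattern usually suggested for this on Redshift is a staging-table upsert; the sketch below assumes the delta files live in S3, and stage_tablea, the id key, the bucket path, and the IAM role ARN are all placeholders. To avoid comparing 200+ columns, a common trick is to store a single checksum per row (e.g. MD5 or FNV_HASH over the concatenated columns) in both tables and compare only that.

-- 1. Load the delta from S3 into a staging table (placeholder path/ARN).
CREATE TEMP TABLE stage_tablea (LIKE tablea);
COPY stage_tablea
FROM 's3://my-bucket/delta/'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
FORMAT AS PARQUET;

-- 2. Replace matched rows, then insert everything from staging.
BEGIN;
DELETE FROM tablea
USING stage_tablea s
WHERE tablea.id = s.id;

INSERT INTO tablea
SELECT * FROM stage_tablea;
COMMIT;

Newer Redshift releases also ship a native MERGE statement, which can replace the DELETE + INSERT pair.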
I need to convert text data in one table to large object data in another table. The table structure is:
Employee ->
id (character varying(130)), name (character varying(130)), description (text)
EmployeeDetailed ->
detailed_id (character varying(130)), desc_lob (oid)
What query can I run to transfer all the rows from the Employee table to the EmployeeDetailed table, so that detailed_id is populated from Employee's id column and description is converted to a large object whose oid is stored in desc_lob?
Can I use lo_import()? Would it help here?
lo_import() is a client interface command. Instead, you can use an INSERT statement fed by a SELECT, calling lo_from_bytea inside that SELECT:
INSERT INTO EmployeeDetailed (detailed_id, desc_lob)
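-- Passing 0 asks the server to assign a fresh OID for each large object;
-- convert_to() turns the text into bytea in the given encoding.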
SELECT id, lo_from_bytea(0, convert_to(description, 'LATIN1'))
FROM Employee
Change LATIN1 to whatever encoding you need (see this answer).
When I insert a row into a table that already has some rows in Oracle, the new row ends up somewhere in the middle instead of at the bottom of the table. When I then insert another new row, it appears below the row just added.
Why does this happen?
Tables in databases typically represent an unordered collection of data. In Oracle, tables are by default heap-organized tables and do not store data in order.
If ordering is important in your data, consider an index-organized table. For Oracle, more information on that can be found here: Overview of Tables
Sharing your table definition would help confirm that.
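To make that concrete: the only reliable way to get rows back in a particular order is an explicit ORDER BY. The table and column names here are placeholders:

SELECT *
FROM my_table
ORDER BY created_at;  -- whatever column defines the order you expect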
I have a requirement to query all the tables in a schema to get the max insert date of each individual table, and I need to circulate this as a report to another team.
Please suggest an approach to writing this query.
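One possible approach, assuming this is Oracle (like the previous question) and that every table carries a common audit column such as insert_date (both are assumptions), is to generate one MAX() query per table from the data dictionary and then run the generated statements:

-- Generates one SELECT per table in the current schema; insert_date is
-- an assumed audit column name.
SELECT 'SELECT ''' || table_name || ''' AS table_name, '
       || 'MAX(insert_date) AS max_insert_date '
       || 'FROM ' || table_name || ';' AS generated_sql
FROM user_tables
ORDER BY table_name;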