Implement Oracle external-table-like functionality in Azure managed PostgreSQL

Currently we are using Oracle 19c external table functionality on-prem, whereby CSV files are loaded to a specific location on the DB server and automatically become readable through an Oracle external table. The file location is specified as part of the table DDL.
We have a requirement to migrate to Azure managed PostgreSQL. From the PostgreSQL documentation, functionality similar to Oracle external tables can be achieved in standalone PostgreSQL using "foreign tables" with the help of the file_fdw extension. But in Azure managed PostgreSQL we cannot use this, since we do not have access to the DB file system.
One option I came across was Azure Data Factory, but that looks like an expensive option. Expected volume is about 1 million record inserts per day.
Could anyone advise possible alternatives? One option I was thinking of was a scheduled shell script running on an Azure VM which loads the files into PostgreSQL using psql commands like \copy. Would that be a good option for the volume to be supported?
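To make my idea concrete, something like the minimal sketch below is what I had in mind; the paths, table name, and connection string are placeholders only (the password would come from PGPASSWORD or ~/.pgpass).

#!/bin/bash
# Hypothetical cron-driven loader: pick up CSVs from a drop directory and
# stream each one into PostgreSQL with client-side \copy, which needs no
# access to the server's file system.
set -euo pipefail

CONN="host=myserver.postgres.database.azure.com port=5432 dbname=mydb user=loader sslmode=require"

for f in /data/incoming/*.csv; do
    [ -e "$f" ] || continue                # nothing matched the glob
    psql "$CONN" -v ON_ERROR_STOP=1 \
         -c "\copy staging_table FROM '$f' WITH (FORMAT csv, HEADER true)"
    mv "$f" /data/archive/                 # keep processed files out of the next run
done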
Regards
Jacob

One last option that could be simple to implement in the migration is EnterpriseDB (EDB), which avoids vendor lock-in and is also free of cost.
Check the video link below for the migration procedure steps.
https://www.youtube.com/watch?v=V_AQs8Qelfc

Related

Automate data loading to Google Sheet from PostgreSQL database

I would like to create an automated data pull from our PostgreSQL database into a Google Sheet. I've tried the JDBC service, but it doesn't work, maybe due to incorrect variables/config. Has anyone already tried doing this? I'd also like to schedule the extraction to run every hour.
According to the documentation, only Google Cloud SQL MySQL, MySQL, Microsoft SQL Server, and Oracle databases are supported by Apps Script's JDBC service. You may have to either move to a new database or develop your own API services to handle the connection.
As for scheduling by the hour, you can use Apps Script's installable triggers.

How to take backup of Tableau Server Repository (PostgreSQL)

We are using version 2018.3 of Tableau Server. Server stats like user logins and other metrics are logged into a PostgreSQL DB, and that data is cleared regularly after one week.
Is there any API available in Tableau to connect to the DB and back up the data somewhere like HDFS, or any place on a Linux server?
Kindly let me know if there are any ways other than an API as well.
Thanks.
You can enable access to the underlying PostgreSQL repository database with the tsm command. Here is a link to the documentation for your (older) version of Tableau:
https://help.tableau.com/v2018.3/server/en-us/cli_data-access.htm#repository-access-enable
It would be good security practice to limit access to only the (whitelisted) machines that need it, create or use an existing read-only account to access the repository, and ideally disable access when your admin programs are complete (i.e., enable access, do your query, disable access).
This way you can have any SQL client code you wish query the repository, create a mirror, create reports, run auditing procedures - whatever you like.
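As a rough sketch of that enable/query/disable flow (the host, password, output path, and the http_requests example table are placeholders; by default the repository database is named workgroup and listens on port 8060):

# Enable external access to the repository with the built-in read-only user
tsm data-access repository-access enable \
    --repository-username readonly --repository-password 'S3cr3t!'

# Pull a table out before the weekly purge; any SQL client would do
psql "host=tableau.example.com port=8060 dbname=workgroup user=readonly" \
    -c "\copy http_requests TO '/backup/http_requests.csv' WITH (FORMAT csv, HEADER true)"

# Close the door again when the job is done
tsm data-access repository-access disable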
Personally, before writing significant custom code, I'd first see if the info you want is already available another way: in one of the built-in admin views, via the REST API, using the public-domain LogShark or TabMon tools, with the Server Management Add-on (for more recent versions of Tableau), or possibly the new Data Catalog.
I know at least one server admin who somehow clones the whole Postgres repository database periodically so he can analyze stats offline. Not sure what approach he uses to clone. So you have several options.
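If cloning appeals to you, one plausible way to do it (a sketch, assuming repository access is enabled as above and the readonly account has sufficient privileges) is a scheduled pg_dump:

# Dump the whole repository in custom format for offline analysis;
# restore it into a scratch PostgreSQL instance with pg_restore.
pg_dump -h tableau.example.com -p 8060 -U readonly -Fc workgroup \
    > workgroup_$(date +%Y%m%d).dump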

TSQL: How do I move data between SQL Server instances?

I have two SQL Server instances, each with an identical schema. One is running in SQL Azure, the other is a standard SQL Server 2008 instance. I need to copy the data from the Azure database to my local instance.
Essentially I want to do this:
INSERT INTO LOCAL_TABLE (col1, col2)
SELECT col1, col2
FROM AZURE_TABLE;
How would I do that?
To move data between SQL Servers when one of them is SQL Azure, you have a couple of options:
SQL Azure Data Sync
SSIS
Writing your own code that moves the data, most probably using the SqlBulkCopy class.
If you would like to just copy all the data, you could also use the SQL Azure Migration Wizard - you can omit the option for copying the schema and let it just copy the data.
EDIT
As noted in the original answer from Matthew PK, you could link to your SQL Azure server from your on-prem server, but this is only an option when you just want to do some ad-hoc testing. I would not use this approach in production for constantly syncing data.
You could accomplish that in a single statement using linked servers.
http://msdn.microsoft.com/en-us/library/ms188279(v=sql.100).aspx
EDIT: Here is a link which appears to explain how to link to SQL Azure:
http://blogs.msdn.com/b/sqlcat/archive/2011/03/08/linked-servers-to-sql-azure.aspx
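For illustration, here is a hedged sketch of that approach; every server name, database name, and credential below is a placeholder, and the post above covers the real setup and its caveats:

# Create a linked server pointing at SQL Azure, then copy rows across in a
# single INSERT ... SELECT over the four-part name.
sqlcmd -S localhost -E -Q "
EXEC sp_addlinkedserver @server = N'AZURE', @srvproduct = N'',
     @provider = N'SQLNCLI', @datasrc = N'myserver.database.windows.net',
     @catalog = N'mydb';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'AZURE', @useself = N'FALSE',
     @rmtuser = N'loader@myserver', @rmtpassword = N'placeholder';
INSERT INTO LOCAL_TABLE (col1, col2)
SELECT col1, col2 FROM AZURE.mydb.dbo.AZURE_TABLE;
"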
EDIT: Here is a write-up on connecting to Azure with SSMS
http://www.silverlighthack.com/post/2009/11/11/Connecting-to-SQL-Azure-with-SQL-Server-Management-Studio-2008-R2.aspx
Otherwise I believe you need to do it in two statements.
Linked Server is not officially supported. However, here are a couple of resources that are supported and would help you do what you are looking for:
1) Check out the SQL Azure DAC Examples: http://sqldacexamples.codeplex.com/
2) The other option is SQL Azure Data Sync.
Use a product like "SQL Data Compare" from Redgate. I am not an Azure user, but I am guessing it would work; I have used it for a few years and it's pretty solid.
Is this a one-time copy or ongoing?
If one-time, then use the SQL Azure Migration Wizard (from CodePlex).
If ongoing, then use SQL Azure Data Sync.
You can also verify that the schema is compliant using SQL Server Data Tools in Visual Studio: just set the target to SQL Azure, or to SQL Server 2012 or 2008, then build, and any schema errors will show up.

How do I setup DB2 Express-C Data Federation for a Sybase data source?

I wish to make fields in a remote public Sybase database outlined at http://www.informatics.jax.org/software.shtml#sql appear locally in our DB2 project's schema. To do this, I was going to use data federation; however, I can't seem to install the data source library (the Sybase-specific file libdb2ctlib.so for Linux), because only DB2 and Informix work OOTB with DB2 Express-C v9.5 (which is the version we're currently running; I also tried the latest v9.7).
From unclear IBM documentation and forum posts, the best I can gather is that we need to spend $675 on http://www-01.ibm.com/software/data/infosphere/federation-server/ to get support for Sybase, but budget-wise that's a bit out of the question.
So is there a free method using previous tool versions (it seems DB2 Information Integrator was rebranded as InfoSphere Federation Server) to set up DB2 data wrappers for Sybase? Alternatively, is there another non-MySQL approach we can use, such as switching our local DBMS from DB2 to PostgreSQL? Does the latter support data integration/federation?
DB2 Express-C does not allow federated links to any remote database, not even other DB2 databases. You are correct that InfoSphere Federation Server is required to federate DB2 to a Sybase data source. I don't know if PostgreSQL supports federated links to Sybase.
Derek, there are several ways in which one can create a federated database. One is by using the federated database capability that is built into DB2 Express-C. However, DB2 Express-C can only federate data from specific data sources, i.e., other DB2 databases and industry-standard web services. To add Sybase to this list you must purchase the IBM Federation Server product.
The other way is to leverage DB2's capability to create User Defined Functions in DB2 Express-C that use the OLE DB API to access other data sources. Because OLE DB is a Windows-based technology, only DB2 servers running on Windows can do this. You create a table UDF that you can then use anywhere you would expect a table result set, e.g. in a view definition. For example, you could define a view that uses your UDF to materialize the results. These results would come from a query (via OLE DB) of your Sybase data (or any other OLE DB-compliant data source).
You can find more information here http://publib.boulder.ibm.com/infocenter/idm/v2r2/index.jsp?topic=/com.ibm.datatools.routines.doc/topics/coledb_cont.html
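To make that concrete, here is a rough sketch of such an OLE DB table function, registered through the DB2 command line processor. The provider string, host, query, and column list are placeholder guesses, and as noted above this only works when the DB2 server runs on Windows:

# Register a table UDF whose EXTERNAL NAME carries '!query!connection-string',
# then wrap it in a view so it can be queried like a local table.
db2 "CREATE FUNCTION sybase_markers ()
     RETURNS TABLE (id INTEGER, symbol VARCHAR(25))
     LANGUAGE OLEDB
     EXTERNAL NAME '!select id, symbol from markers!Provider=ASEOLEDB;Data Source=myhost:5000;Catalog=mgd'"

db2 "CREATE VIEW markers AS SELECT id, symbol FROM TABLE(sybase_markers()) AS t"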

How to back up Oracle 10g

How can I back up Oracle 10g, like backup and restore in SQL Server?
I want to back up tables and data.
Thanks in advance.
Oracle has a comprehensive backup and recovery suite which is formally called Recovery Manager but is universally known as RMAN. Find out more.
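For example, a minimal full backup from the shell might look like this sketch; it assumes the environment (ORACLE_SID and so on) is already set and the database runs in ARCHIVELOG mode:

# Full database backup plus archived redo logs, then list what was taken
rman target / <<'EOF'
BACKUP DATABASE PLUS ARCHIVELOG;
LIST BACKUP SUMMARY;
EOF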
The typical term for this is 'migrate', not 'backup'. You'll then find results in your favorite search engine for tools like:
SQL Server Migration Assistant for Oracle
If you only want table structures and data (no code), you can use a SQL Server Integration Services (SSIS) package for this.
Read up here on the various methods and their associated speeds:
http://sqlcat.com/technicalnotes/archive/2008/08/09/moving-large-amounts-of-data-between-oracle-and-sql-server-findings-and-observations.aspx
Oracle Data Pump export (expdp) is also a good utility for taking a logical backup of your schema. Data Pump has very nice features and a lot of flexibility. You can follow the link below for further reading.
http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm
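As an illustration, a single-schema export might look like the sketch below; the credentials, schema name, and file names are placeholders, and DATA_PUMP_DIR must map to a directory the database's OS user can write to. The matching impdp utility restores the dump.

# Logical backup of one schema with Data Pump export
expdp system/password SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR \
      DUMPFILE=scott.dmp LOGFILE=scott_exp.log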