How to take a backup of existing Pact contracts from Pact Broker?

Is there a way to download all pacts and their history from the Pact Broker? I want to download all the pacts with their full history and then upload them to a new Pact Broker server. Is there a way to do that? I didn't see any documentation on this.
Thanks

If you use PactFlow, then contact them (hello@pactflow.io) directly. Otherwise, the Pact Broker keeps everything in its PostgreSQL server, so you can dump that database or use a tool like pgloader to migrate it.
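A minimal sketch of the dump-and-restore route, assuming the broker's database is named pact_broker and the hosts and role shown here (all placeholders) are reachable from where you run the commands:

    # Dump the old broker's database in custom format (compact, and
    # pg_restore can parallelize it on the way back in).
    pg_dump --format=custom --host=old-db.example.com \
      --username=pactbroker --dbname=pact_broker \
      --file=pact_broker.dump

    # Create an empty database on the new server, then restore into it.
    createdb --host=new-db.example.com --username=pactbroker pact_broker
    pg_restore --host=new-db.example.com --username=pactbroker \
      --dbname=pact_broker --no-owner pact_broker.dump

Point the new Pact Broker instance at the restored database; pacts, versions, and verification results all live in that one database, so the history comes with it.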

Related

How to make Metabase write scheduled files to SFTP instead of e-mail

I use Metabase to generate dashboards and reports. I need to generate files with the scheduler, and instead of sending them by e-mail, I need to make them available over SFTP. Do you have any suggestions on how to automate this process?
I use PostgreSQL as the database source.
I can also try other open source tools if needed.
I haven't found much information on how to do this yet.
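One possible approach, sketched below and untested: skip the e-mail scheduler entirely and drive the export from cron, pulling a saved question as CSV through Metabase's HTTP API and pushing the file over SFTP. The URL, card ID, credentials, and paths are all placeholders, and the snippet assumes curl and jq are installed:

    # Log in to the Metabase API and capture the session token.
    SESSION=$(curl -s -X POST "https://metabase.example.com/api/session" \
      -H "Content-Type: application/json" \
      -d '{"username":"bot@example.com","password":"secret"}' | jq -r .id)

    # Export saved question (card) 42 as CSV.
    curl -s -X POST "https://metabase.example.com/api/card/42/query/csv" \
      -H "X-Metabase-Session: $SESSION" -o report.csv

    # Push the file to the SFTP server (assumes key-based auth);
    # run this whole script from cron to replace the e-mail schedule.
    echo "put report.csv /incoming/report.csv" | sftp -b - user@sftp.example.com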

How to take a backup of the Tableau Server Repository (PostgreSQL)

We are using version 2018.3 of Tableau Server. Server stats such as user logins are being logged into the PostgreSQL DB, and the same data is cleared regularly after one week.
Is there any API available in Tableau to connect to the DB and back up the data somewhere like HDFS, or anywhere on a Linux server?
Kindly let me know if there is any way other than an API as well.
Thanks.
You can enable access to the underlying PostgreSQL repository database with the tsm command. Here is a link to the documentation for your (older) version of Tableau:
https://help.tableau.com/v2018.3/server/en-us/cli_data-access.htm#repository-access-enable
It would be good security practice to limit access to only the (whitelisted) machines that need it, to create or use an existing read-only account for the repository, and ideally to disable access when your admin programs are complete (i.e., enable access, run your query, disable access).
This way any SQL client code you wish can query the repository, create a mirror, create reports, or run auditing procedures, whatever you like; a sketch of that cycle follows.
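A sketch of the enable/query/disable cycle, assuming the documented readonly account and the repository's default port 8060 (verify both against the docs for your version; host and password are placeholders):

    # Enable external access to the repository with a read-only account
    # (run this on the Tableau Server node).
    tsm data-access repository-access enable \
      --repository-username readonly --repository-password 'S3cret!'

    # The repository is the PostgreSQL database "workgroup" on port 8060.
    # Export a table to CSV as a simple backup, e.g. historical_events:
    psql --host=tableau.example.com --port=8060 --username=readonly \
      --dbname=workgroup \
      --command="\copy historical_events TO 'historical_events.csv' CSV HEADER"

    # Disable external access again once the backup is done.
    tsm data-access repository-access disable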
Personally, before writing significant custom code, I'd first check whether the info you want is already available another way: in one of the built-in admin views, via the REST API, using the public domain LogShark or TabMon systems, or, for more recent versions of Tableau, with the Server Management Add-on or possibly the new Data Catalog.
I know at least one server admin who clones the whole Postgres repository database periodically so he can analyze stats offline; I'm not sure what approach he uses to clone it. So you have several options.

Data cataloging for SQL Server using Amundsen (Lyft)

I would like to use an open source data cataloging tool in my company and was evaluating Amundsen for this. I have a lot of SQL Server instances on-premises and would like to catalog all of them in Amundsen. Currently, for a POC, I am using Docker containers for Amundsen on my local machine.
I could not find any help on cataloging my SQL Server tables. Can anyone please help me with how to do this in Amundsen (Lyft)?
I ran into the same problem.
I found that the SQL Server extractor had a few bugs, so I made a pull request to fix them and added a sample script to the databuilder:
https://github.com/lyft/amundsendatabuilder/pull/272
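With that PR applied (or working from its branch), the databuilder can be run locally against your SQL Server. The script path below is an assumption based on the PR, so check it against the actual diff:

    # Grab the databuilder with the MSSQL extractor fixes and install it.
    git clone https://github.com/lyft/amundsendatabuilder.git
    cd amundsendatabuilder
    pip install -e .

    # Run the sample loader; the script name is assumed from the PR -
    # verify it, and set your SQL Server connection string inside it.
    python example/scripts/sample_mssql_loader.py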

Is there any way other than MLab/ObjectRocket to migrate an app from Parse?

I have created an AWS EC2 instance with the RocksDB storage engine, and now I am trying to migrate my Parse application to this instance as instructed here:
https://gyminutesapp.wordpress.com/2016/04/28/migrating-parse-mongodb-to-mongorocks-on-aws
Is it compulsory to do it via MLab/ObjectRocket, or is there another way?
Can anyone help me with the further steps? How do I connect to the Parse server and migrate the data?
You can move to any MongoDB database. You can set up any server, install MongoDB, allow remote access, and push your data from parse.com to your own MongoDB database. This is the first step in the Parse migration process.
The other steps to take care of are below (see the sketch after the list):
1. Host the open source parse-server and configure it to connect to your database.
2. Fix your cloud code; minor changes might be required.
3. Replace the cloud modules you are using with their npm module counterparts.
4. Deploy!
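A minimal sketch of step 1, assuming MongoDB is already running and reachable at the URI shown (the host, appId, and masterKey are placeholders that must match what your client apps were configured with):

    # Install the open source Parse Server CLI.
    npm install -g parse-server

    # Start it against your own MongoDB instance.
    parse-server --appId MY_APP_ID --masterKey MY_MASTER_KEY \
      --databaseURI mongodb://user:pass@my-ec2-host:27017/parse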

How to transfer a Mongo database from one remote server to another

I need to transition several databases from one remote, cloud-based server/service (modulus.io) to another (Compose.io). As far as I'm aware, I don't have console access on the target server, which seems to be required for using mongocopy or mongorestore. I have all of the credentials. How do I do this? What command should I use, or is there a tool designed for this purpose?
I'm currently trying to use mongodump to move the database to my local machine, and then mongorestore it to the target machine. This is going very slowly; even for a modestly sized database (<2 GB), it looks like it will take most of a day to download.
Thanks
In the Compose.io Web UI, create a new DB and click "import". There you can choose the source DB to import. Works every time! :)
I don't think this feature is available on the free tier.
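If the importer isn't available on your plan, a direct dump-and-restore pipe between the two remote servers avoids the slow download-then-upload round trip. A sketch, assuming both services accept remote connections and your tools are recent enough to support --archive (hosts and credentials are placeholders):

    # Stream the dump straight from the source into the target,
    # never writing a local copy to disk (--archive needs
    # mongodump/mongorestore 3.2+).
    mongodump --host source.modulus.example --port 27017 \
      --username srcUser --password 'srcPass' --db mydb --archive \
    | mongorestore --host target.compose.example --port 27017 \
      --username dstUser --password 'dstPass' --archive --drop

The --drop flag replaces any existing collections on the target, so leave it off if you are merging into a database that already has data.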