Snowflake connectivity with Git - GitHub

Is there any way we can connect Snowflake with Git for version control? With that, we could maintain versions of our merge statements and any other SQL scripts in Git.

DBeaver has git integration and is the best solution my team has found for version control with Snowflake. It's not perfect, but it lets you run your scripts against Snowflake and then push your SQL code to a git repository through the app UI or the command line.

Yes! One way to do this is to store your Snowflake SQL code in one or more files with the .sql extension (e.g. filename.sql). You can add those files to a Git repo and track them there accordingly.
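As a minimal sketch of how those tracked files could then be replayed against Snowflake, assuming the snowflake-connector-python package (the folder layout, object names, and env vars below are illustrative, not part of the original answer):

```python
# deploy_sql.py - replay version-controlled .sql scripts against Snowflake.
# Assumes: pip install snowflake-connector-python; credentials in env vars.
import os
from pathlib import Path

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="COMPUTE_WH",   # illustrative warehouse/database names
    database="ANALYTICS",
)

# Run every .sql file in the repo's scripts/ folder, in name order.
for script in sorted(Path("scripts").glob("*.sql")):
    for statement in script.read_text().split(";"):  # naive split; fine for simple scripts
        if statement.strip():
            conn.cursor().execute(statement)
    print(f"applied {script}")

conn.close()
```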

This is an age-old question when dealing with databases and how one should go about versioning them. Unfortunately, no database I'm aware of integrates directly with any VCS.
My team has settled on using dbt. This essentially turns the database into a series of text files that are easily integrated with git. The short of it is that you edit your models as local text files and then use dbt run to push those models into Snowflake itself. This is quite nice, as you can configure separate environments such as dev and prod.
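For scripting that loop, dbt-core 1.5+ exposes a programmatic runner; a minimal sketch (the target names are assumptions that would be defined in your profiles.yml):

```python
# run_dbt.py - invoke dbt programmatically (requires dbt-core >= 1.5).
from dbt.cli.main import dbtRunner

runner = dbtRunner()

# "dev" and "prod" targets are assumed to exist in profiles.yml.
result = runner.invoke(["run", "--target", "dev"])
if not result.success:
    raise SystemExit("dbt run failed")
```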

Other answers cover using an IDE as a go-between for git and Snowflake. These projects could also be useful:
- https://medium.com/snowflake/snowflake-vs-code-sql-tools-and-github-7eab915e10cb
  use VS Code as the IDE with a useful Snowflake extension
- https://github.com/Snowflake-Labs/schemachange
  manage schema changes as scripts in git and deploy them with CI/CD (see the sketch after this list)
- https://github.com/Snowflake-Labs/sfsnowsightextensions#get-sfworksheets
  the missing feature of Snowsight: exporting worksheets
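As a hedged sketch of driving schemachange from CI/CD, using its versioned-script naming convention (check the project's README for the exact flags; the account and object names here are illustrative):

```python
# apply_changes.py - call the schemachange CLI from Python.
# Assumes: pip install schemachange; SNOWFLAKE_PASSWORD set in the environment.
# Migration folder layout (schemachange's versioned-script convention):
#   migrations/
#     V1.0.0__create_customers.sql
#     V1.1.0__add_merge_proc.sql
import subprocess

subprocess.run(
    [
        "schemachange",
        "-f", "migrations",   # root folder of change scripts
        "-a", "my_account",   # illustrative account/user/role/etc.
        "-u", "deploy_user",
        "-r", "DEPLOY_ROLE",
        "-w", "COMPUTE_WH",
        "-d", "ANALYTICS",
    ],
    check=True,  # raise if the deployment fails
)
```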

There is now a VS Code extension for Snowflake. I'm able to connect VS Code to our repo (Azure DevOps in my case) and to Snowflake. It has some nice features too, like being able to easily cycle through past queries (including query results), and it gives the same level of detail as (or more than) the Snowflake UI.

Related

Merge multiple projects into a single project in TFS/ADO

I was exploring the following GitHub page to understand the migration and merging of projects from Azure DevOps Server to Azure DevOps Services:
https://github.com/nkdAgility/azure-devops-migration-tools
I see the following feature mentioned in the documentation, but unfortunately I could not find any relevant documentation for it. Please help with this.
Merge many projects into a single project
You do it through the configuration file that is part of the tool. It requires tinkering and trial and error to get working, but it is very useful. In the config file you state a source and a target, amend the parameters you want, and then run the tool. It's not very well documented, but it is still a powerful tool.

Sending a file to multiple servers

I'm working on a web project (built with the .NET Framework) on a remote Windows server, and this project is connected to a database via SQL Server Management Studio. The same web project exists on multiple other remote Windows servers, linked to the same database. When I change a page's code in my project, or add/remove a table or stored procedure in my database, is there a way (or existing software) that will allow me to deploy the changes I made to all the other servers (or to choose multiple servers, if I don't want to deploy the changes to all of them)?
If it were me, I would stand up a git server somewhere (cloud or local VM), make a branch called something like Prod or Stable, and create a script (PowerShell if the servers are Windows, bash on anything else) on a nightly or hourly job to pull from that branch. Only push to that branch after testing thoroughly. If your code requires compilation, you can either compile once before committing (in which case you're probably going to commit to a releases branch) or compile on each endpoint after the pull. I would have the script that does the pull also compile and restart the service (only if there was something new in the pull).
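A minimal sketch of such a pull-and-redeploy job, written in Python rather than PowerShell/bash for illustration (the repo path, branch name, and restart command are assumptions):

```python
# pull_deploy.py - scheduled job: pull the Stable branch, restart on change.
# Repo path, branch, and restart command below are illustrative assumptions.
import subprocess

REPO = r"C:\apps\mysite"

def git(*args: str) -> str:
    return subprocess.run(
        ["git", "-C", REPO, *args],
        check=True, capture_output=True, text=True,
    ).stdout.strip()

before = git("rev-parse", "HEAD")
git("pull", "origin", "Stable")
after = git("rev-parse", "HEAD")

# Only rebuild/restart if the pull actually brought in new commits.
if before != after:
    subprocess.run(["iisreset"], check=True)  # or restart your service/app pool
    print(f"deployed {after[:8]}")
else:
    print("already up to date")
```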
You can probably achieve this by combining two things:
1. Create a separate publishing profile for each server.
2. Use git/VSTS branches to keep the code separate (as suggested by #memtha).
Let's say you have a total of 6 servers and two branches, A and B. You'll have to create 6 publishing profiles, and then you can choose which branch to deploy where, e.g. you can deploy branch B on servers 1, 3, and 4.
For the codebase you could use Git hooks:
https://gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa
And for the database, maybe you could use migrations or something similar. You will need to provide more info about your database: do you store it across multiple servers, etc.?
If the same web project is connecting to the same database and the database changes, I suspect you would need to update all the web apps, both to ensure the database changes don't break any of them and to keep them all updated so none is left behind.
You should look at using Azure DevOps to build and deploy your apps and update the database.
If you use Entity Framework, you can run the migrations on startup and have the application update the database when deployed, either manually or automatically using DevOps.
To keep the software updated on multiple servers you could use Git with hooks; a post-receive hook is what you need.
The idea is to use one server as your remote repository and configure its post-receive hook to update the codebase on that server and on the others.
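A post-receive hook can be any executable; here is a minimal Python sketch saved as hooks/post-receive in the bare repo and made executable (the work-tree path, branch, and peer host names are assumptions):

```python
#!/usr/bin/env python3
# hooks/post-receive - deploy the pushed code locally and fan out to peers.
# The work-tree path and the peer host list are illustrative assumptions.
import subprocess

WORK_TREE = "/var/www/mysite"
GIT_DIR = "/srv/git/mysite.git"
PEERS = ["app2.example.com", "app3.example.com"]

# Check out the freshly pushed code into the local deployment directory.
subprocess.run(
    ["git", f"--work-tree={WORK_TREE}", f"--git-dir={GIT_DIR}",
     "checkout", "-f", "master"],
    check=True,
)

# Trigger a pull on each peer server (assumes SSH access and a clone there).
for host in PEERS:
    subprocess.run(
        ["ssh", host, f"git -C {WORK_TREE} pull origin master"],
        check=True,
    )
```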

Is there any way to implement CI/CD for on-premises PostgreSQL using Azure DevOps Pipelines?

I want to create a one-click deployment on Azure Pipelines to move PostgreSQL changes from the dev to the QA environment, similar to what we implement using a SQL Server Database Project, where a PowerShell script deploys the changes to the remote server.
I have tried the pg_dump and psql commands, which create a dump file and restore it on the remote server, but this does not perform diffing (i.e. comparing database changes between source and destination and replicating only the missing changes).
You've stumbled upon one of the features lacking in the Postgres ecosystem. One of the more elegant ways to handle migrations using Postgres' own tooling is to package your migrations as a Postgres extension. This requires you to generate the deployment scripts yourself, but it is a neat way of applying and packaging the deployments.
There are a number of commercial tools that will assist with this process, such as Datical, Liquibase, and Flyway. Note that some of these still require you to generate the change statements yourself, while others attempt to create them for you.
Generating change statements is a whole different animal, and I recommend you look at schema-diffing tools for Postgres to find what best suits your needs.
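For the versioned-migrations approach those tools implement, a minimal hand-rolled sketch looks something like this (assuming psycopg2 and a migrations/ folder of numbered .sql files; all names here are illustrative):

```python
# migrate.py - apply numbered .sql migrations to Postgres exactly once.
# Assumes: pip install psycopg2-binary; DSN and folder name are illustrative.
from pathlib import Path

import psycopg2

conn = psycopg2.connect("host=qa-db dbname=app user=deploy")

# The connection context manager commits the transaction on success.
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS schema_version (
            filename text PRIMARY KEY,
            applied_at timestamptz NOT NULL DEFAULT now()
        )
    """)
    cur.execute("SELECT filename FROM schema_version")
    applied = {row[0] for row in cur.fetchall()}

    # Apply any migration file not yet recorded, in name order.
    for script in sorted(Path("migrations").glob("*.sql")):
        if script.name not in applied:
            cur.execute(script.read_text())
            cur.execute(
                "INSERT INTO schema_version (filename) VALUES (%s)",
                (script.name,),
            )
            print(f"applied {script.name}")
```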

Saving Redshift Metadata

We are using Redshift as our EDW and we have quite a few tables and views there. At the moment we keep all the DDLs in our organisation's knowledge centre, but this is basically copy and paste and not very smart. Is there any other option that is quicker and better?
Thanks
Not very sure what you meant by "copy and paste", but you can try to put all the scripts in a GitHub/SVN repository and make sure that all the DDLs actually get fired using the scripts from the repo.
We did this using git and Jenkins (and a little bit of shell scripting to do the code check-ins and checkouts). We blocked all users from running DDL statements, and the Jenkins job would just pull the latest scripts from the repo and deploy them automatically from the RC (release candidate) branch.
If you need to export the DDL scripts out of the system, you can use the script provided by the AWS folks:
https://github.com/awslabs/amazon-redshift-utils/blob/master/src/AdminViews/v_generate_tbl_ddl.sql
If you want to automate the check-in process to a code repository, you can build a Python wrapper around this code.
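A hedged sketch of such a wrapper: it assumes the v_generate_tbl_ddl view from the linked script is already installed (the AWS script creates it as admin.v_generate_tbl_ddl) and that psycopg2 can reach the cluster over the Postgres protocol; all connection and schema names are illustrative:

```python
# export_ddl.py - dump per-table DDL from Redshift and check it into git.
# Assumes the admin.v_generate_tbl_ddl view from the AWS script is installed.
import os
import subprocess
from pathlib import Path

import psycopg2

conn = psycopg2.connect(
    host="my-cluster", port=5439, dbname="edw",
    user="admin", password=os.environ["REDSHIFT_PASSWORD"],
)
out = Path("ddl")
out.mkdir(exist_ok=True)

# The view emits one row per DDL line, ordered by seq within each table.
with conn.cursor() as cur:
    cur.execute("""
        SELECT tablename, ddl
        FROM admin.v_generate_tbl_ddl
        WHERE schemaname = 'public'
        ORDER BY tablename, seq
    """)
    files = {}
    for tablename, ddl_line in cur.fetchall():
        files.setdefault(tablename, []).append(ddl_line)

for tablename, lines in files.items():
    (out / f"{tablename}.sql").write_text("\n".join(lines) + "\n")

# Commit whatever changed since the last export (no-op if nothing changed).
subprocess.run(["git", "add", "ddl"], check=True)
subprocess.run(["git", "commit", "-m", "refresh Redshift DDL"], check=False)
```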

SSAS Versioning Without Source Control

What is the best way to manage and combine different versions of SSAS solutions without using version control?
Currently, we have a network drive where the "master" copy is stored, so individual developers work with a local copy, but we recently ran into a problem when adding changes to the "master" copy.
Any suggestions? Microsoft appears to have source control for SSIS, and SSRS is easy enough to migrate by just copy/pasting the .rdl files, but there seems to be no easy way to accomplish this with SSAS packages.
What version are you talking about? I just recently added an entire SSAS project into source control. There was no issue at all.
We must somehow be talking about two different things.
You could try creating a single master XMLA script that holds the entire cube definition; this script would live on the network drive.
The XMLA script can be generated using the Analysis Services Deployment Wizard. You would then have to rely on a diff tool to manually merge the changes from each developer into the master file. This would be extremely cumbersome and error-prone.
I would recommend just storing the project in source control. As previously mentioned, SSAS will work with any version control provider, since its source files are just XML. For best results, use a source control provider that integrates with Visual Studio.