Export and Import Synapse Triggers - azure-data-factory

I'm setting up a secondary region for my Synapse workspace. Is there a way I can export all the triggers from one workspace to another?

You have three options, as far as I can see:
1. Set up Git and DevOps integration between your two workspaces and then set up a release pipeline to copy everything from one workspace to the other. Here is a link to the documentation. This is the best way if you have a lot to copy and/or want a repeatable way to copy between environments.
2. Build a PowerShell script that reads the triggers from one workspace and creates them in the second one. Try Get-AzSynapseTrigger to read from the source workspace and Set-AzSynapseTrigger to create the triggers in the new environment; see the sketch after this list.
3. If you only have a few triggers, simply copying them by hand is the simplest, though programmatically disappointing, solution.
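
For option 2, a minimal sketch could look like this, assuming the Az.Synapse module is installed and you are signed in with Connect-AzAccount. The workspace names are placeholders, and the exact JSON shape that -DefinitionFile expects may need adjusting for your trigger types:

$source = 'primary-workspace'
$target = 'secondary-workspace'

foreach ($trigger in Get-AzSynapseTrigger -WorkspaceName $source) {
    # Serialize each trigger definition to a temp file so it can be
    # re-created from a definition file in the target workspace.
    $file = Join-Path $env:TEMP "$($trigger.Name).json"
    @{ name = $trigger.Name; properties = $trigger.Properties } |
        ConvertTo-Json -Depth 20 | Set-Content $file

    # Create (or update) the trigger in the secondary workspace.
    Set-AzSynapseTrigger -WorkspaceName $target -Name $trigger.Name -DefinitionFile $file
}

Note that triggers reference pipelines by name, so the pipelines need to exist in the target workspace first, and newly created triggers stay stopped until you start them (Start-AzSynapseTrigger).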

Related

How to backup the data on Azure Devops?

I would like to schedule (for my company) a backup of our most important data in Azure DevOps, for various reasons: security, urgent recovery, viruses, migration, etc.
I can back up the repositories and the wikis (they're under Git, so they're easy to download), but how can I back up the "Boards" section (backlogs, work items, etc.) and the build pipeline definitions?
In current Azure DevOps, there is no out-of-the-box solution for this. You could manually save the project data in the following ways:
Source code and custom build templates: You can download your files as a zip file. Open the ... (Repository actions) menu for the repository, file, or folder and choose Download as Zip. You can also choose Download from the right side of the screen to download either all of the files in the currently selected folder, or the currently selected file.
This process doesn't save any change history or links to other artifacts.
If you use Git, clone your repositories to retain the full project history and all the branches.
Build data: To save logs and data in your drop build folders, see View build results.
Work item tracking data: Create a work item query and open it using Excel, then save the Excel spreadsheet.
This process doesn't save any attachments, change history, or links to other artifacts.
Build/release definitions: you can export the JSON file for each definition and then import it when restoring.
There is a related user voice suggestion that you can monitor and vote for: https://developercommunity.visualstudio.com/content/idea/365441/provide-a-backup-service-for-visual-studio-team-se.html.
Here are some tickets (ticket1, ticket2) with the same issue that you can refer to.
If you want to create scheduled tasks, you can write a script using the Azure CLI with the Azure DevOps extension.
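
A hedged sketch of such a script (assuming PowerShell 7 and the azure-devops extension, az extension add --name azure-devops). The organization, project, and backup path are placeholders; authenticate first with az login or a PAT via az devops login, and make sure Git has credentials for the clones:

$org     = 'https://dev.azure.com/MyOrg'
$project = 'MyProject'
$backup  = "C:\backups\$(Get-Date -Format 'yyyy-MM-dd')"
New-Item -ItemType Directory -Force -Path $backup, (Join-Path $backup 'pipelines') | Out-Null

# Mirror-clone every repository to keep the full history and all branches.
$repos = az repos list --organization $org --project $project | ConvertFrom-Json
foreach ($repo in $repos) {
    git clone --mirror $repo.remoteUrl (Join-Path $backup "repos/$($repo.name).git")
}

# Export each build pipeline definition as JSON so it can be re-imported later.
$pipelines = az pipelines list --organization $org --project $project | ConvertFrom-Json
foreach ($p in $pipelines) {
    az pipelines show --organization $org --project $project --id $p.id |
        Set-Content (Join-Path $backup "pipelines/$($p.name).json")
}

# Dump work items via a WIQL query; as with the Excel export, attachments
# and change history are not included.
az boards query --organization $org --project $project --wiql `
    "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems" |
    Set-Content (Join-Path $backup 'workitems.json')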
As you said, for the repositories it's quite easy, as they are Git repositories.
I wrote such a script, which we could improve to also back up the work items, backlog, etc.
It's open source; let me know what you would like to back up first and I'll improve it.
GitHub: azure-devops-repository-backup

How to maintain a hundred different Terraform configs?

We created a Terraform template that we will deploy many dozens of times in different workspaces in the future. Each workspace will have a separate configuration file.
Now we would like to automate this procedure and are thinking about keeping the configuration files in a Git repo.
Does anyone have a best practice for storing the configuration files in a Git repo and triggering a CI/CD workflow (Azure DevOps)?
In general, we only want to apply changes for workspaces whose configuration has changed.
The terraform plan and apply commands have an option for you to pass in the tfvars file you want to use. So something like this:
terraform apply --var-file=workspace.tfvars
So in the pipeline you would grab your Terraform template artifacts and your config files. I would then set a TF_WORKSPACE variable to force your workspace, and I would also make your tfvars files match the workspace name so you can reuse the variable in your apply command. This forces your workspace and configuration file to match.
To trigger this only when those files have changed, you would need a path trigger that fires on changes to those paths.
I don't see any harm in running Terraform every time, regardless of whether changes occurred. The worst possible outcome would be that someone made a change outside of Terraform and it gets undone.
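
To make the "only apply changed workspaces" part concrete, here is a rough sketch of the script step, assuming each workspace has a matching <workspace>.tfvars file in a config/ folder and the checkout has enough history to diff. The paths and the diff range are assumptions; adjust them to your layout and trigger setup:

# Find the tfvars files that changed in the triggering commit.
$changed = git diff --name-only HEAD~1 HEAD -- 'config/*.tfvars'
foreach ($file in $changed) {
    # tfvars file names match workspace names, as suggested above.
    $workspace = [IO.Path]::GetFileNameWithoutExtension($file)

    # TF_WORKSPACE forces Terraform into the matching workspace, so the
    # workspace and the configuration file always line up.
    $env:TF_WORKSPACE = $workspace
    terraform init
    terraform apply -auto-approve --var-file=$file
}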

Reusing PowerShell Scripts in Azure DevOps

I have a PowerShell script that I want to re-use across multiple build pipelines. My question is, is there a way I can "store" or "save" my PowerShell script at the project or organization scope so that I can use it in my other build pipelines? If so, how? I can't seem to find a way to do this. It would be super handy though.
It is now possible to check out multiple repositories in one YAML pipeline. You could place your script in one repository and check it out in a pipeline of any other repository. You could then reference the script directly from the pipeline workspace.
More info here.
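
With multiple checkout steps, each repository typically lands in a subfolder of $(Build.SourcesDirectory) named after the repo, so a PowerShell step can call the shared script directly. The repository and script names here are hypothetical:

# 'shared-scripts' is the checked-out repo that holds the shared script.
& (Join-Path $env:BUILD_SOURCESDIRECTORY 'shared-scripts/Invoke-CommonBuild.ps1')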
Depending on how big these scripts are, you can create task groups that contain PowerShell tasks with the script as inline PowerShell. But this only works at project scope.
Another approach I'd try would be to create a repo containing your PowerShell scripts, add this repo as a submodule in the repository you are trying to build, and then call the scripts from the submodule folder. But this only works with Git repos.
Or you could create a custom build task that contains your script.
From what I have seen, no.
A few different options I have explored are:
If using a non-hosted agent, saving the file onto the build server. Admittedly this doesn't scale well, but it is better than copy/pasting the script all over. I was able to put these scripts into version control and deploy them via their own pipeline, so that might be a solution for scaling (if necessary)
Cloning another repository that has these shared scripts during the process.
I've been asking for this feature for a bit, but it seems the Azure DevOps team has higher priorities.
How about putting the PowerShell in a NuGet package and installing that in dependent projects?
I just discovered YAML templates (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azdevops#step-re-use).
I think it may help you in this case (depending on how large your file is): you can put an inline PowerShell script in that template YAML and reuse it in your main YAML.
Documentation is pretty straightforward.

Share the same PowerShell script file between multiple repos/builds

We are using VSTS for CI and CD in my team. We have over 40 repositories, which are separate projects, but all of them have to run the same PowerShell script in one of their build steps.
The PowerShell file is too big to be kept as an inline script, so we need to save it in a file. Obviously, I have a copy of the PowerShell file in each repository.
Problem:
Now whenever I need to update the script, I end up updating it in every repository, which is over 40 at the moment.
I think there should be a better approach. Is there any way I can put my script in one single repo (a repo dedicated to holding the script) and then use it within each build, so that when I need to update it, I only need to update it once?
There are a few options.
My general recommendation is to publish the script as a package (NuGet or otherwise) and restore it during your application builds. This allows consumers to stay "pinned" to a known-good, known-working version, and update on a schedule that works for them.
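
For a PowerShell script specifically, the package option can be realized as a module published to an Azure Artifacts feed. This is a sketch under stated assumptions: the feed URL, module name, and version are placeholders, and PAT-based authentication (the -Credential parameters) is omitted for brevity:

# Register the Azure Artifacts feed as a PowerShell repository (one-time).
Register-PSRepository -Name MyFeed -InstallationPolicy Trusted `
    -SourceLocation 'https://pkgs.dev.azure.com/MyOrg/_packaging/MyFeed/nuget/v2' `
    -PublishLocation 'https://pkgs.dev.azure.com/MyOrg/_packaging/MyFeed/nuget/v2'

# Publish the shared script packaged as a module.
Publish-Module -Path ./SharedBuildSteps -Repository MyFeed -NuGetApiKey 'AzureDevOps'

# In each consuming build, pin to a known-good version and import it.
Save-Module -Name SharedBuildSteps -RequiredVersion 1.2.0 -Repository MyFeed -Path ./tools
Import-Module ./tools/SharedBuildSteps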
Another option is to add a submodule to each repository that requires the script dependency, then initialize the submodule during the build process.
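
The submodule option boils down to two Git commands plus a call into the submodule folder; the repository URL and script path below are made up:

# One-time setup in each consuming repository.
git submodule add https://dev.azure.com/MyOrg/MyProject/_git/shared-scripts tools/shared-scripts

# In the build, before the script is needed.
git submodule update --init --recursive
& ./tools/shared-scripts/Invoke-CommonBuild.ps1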
A third option is to turn the shared script into a VSTS build task or extension. This is extensively documented and easily located so I won't belabor the point by including instructions for doing that here.
You can add a Git repository to store your PowerShell file.
Then add a build step that gets the file from that repository during the build and uses it.
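
A minimal sketch of such a build step, assuming the shared-scripts repository lives in the same project and SYSTEM_ACCESSTOKEN is mapped into the step; the repo URL and script name are assumptions:

# Clone the shared-scripts repo into the agent's temp folder and run the script.
$dest = Join-Path $env:AGENT_TEMPDIRECTORY 'shared-scripts'
git -c http.extraheader="AUTHORIZATION: bearer $env:SYSTEM_ACCESSTOKEN" `
    clone --depth 1 https://dev.azure.com/MyOrg/MyProject/_git/shared-scripts $dest
& (Join-Path $dest 'Invoke-SharedStep.ps1')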

TYPO3: Publish workspace edit via scheduler?

I managed to run the scheduler with a cron task.
I managed to auto publish a whole workspace on the publish date with the scheduler.
But I don't want the whole workspace to be published by the scheduler task, only a single edit.
I tried to give the edit in the workspace a publish date, but that didn't work out.
Is this even possible?
TYPO3 version: 4.5.x
You could create another workspace for this single edit and publish that separate workspace, which contains only the one edit.
If you give some more details, perhaps it is possible to do it without workspaces. For example, you can create two content elements with different start and stop dates.