We have a requirement to implement CI/CD for the PaaS components below via VSTS (Azure DevOps) build and release pipelines:
1) Azure Automation
2) Azure SQL Data Warehouse
3) Azure Data Factory (ADF) v1
4) Azure Data Factory (ADF) v2
5) Azure Key Vault
6) Azure Storage
Any documentation pertaining to any of the above components would be very helpful.
I am looking specifically for documentation on the build and release pipelines.
Any help is much appreciated.
It's not really a good idea to post such a broad question, but have you tried the official documentation?
Azure DevOps
Azure Pipelines
Azure Key Vault
Azure Storage
Azure SQL Data Warehouse
Azure Data Factory v1
Azure Data Factory v2
Also try the chat service for Azure DevOps at aka.ms/devchat. They are usually helpful. :)
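To give a concrete starting point: several of those components (SQL Data Warehouse, Key Vault, Storage) can be provisioned from a pipeline as ARM templates, with the Key Vault task feeding secrets into later steps. Here is a minimal YAML sketch, assuming a service connection named my-azure-connection, a vault my-keyvault, a resource group my-rg and template files under templates/; all of those names are placeholders, and the same tasks are available in the classic build/release designer.

```yaml
# Sketch only, not a full solution: deploys an ARM template describing the PaaS
# resources (Storage, SQL Data Warehouse, Key Vault) and pulls Key Vault secrets.
# 'my-azure-connection', 'my-keyvault', 'my-rg' and the template paths are placeholders.
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Make Key Vault secrets available to later steps as $(secretName) variables
  - task: AzureKeyVault@2
    inputs:
      azureSubscription: 'my-azure-connection'   # Azure RM service connection (placeholder)
      KeyVaultName: 'my-keyvault'
      SecretsFilter: '*'

  # Deploy the ARM template that describes the PaaS resources
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-azure-connection'
      subscriptionId: '$(subscriptionId)'
      action: 'Create Or Update Resource Group'
      resourceGroupName: 'my-rg'
      location: 'West Europe'
      templateLocation: 'Linked artifact'
      csmFile: 'templates/azuredeploy.json'
      csmParametersFile: 'templates/azuredeploy.parameters.json'
      deploymentMode: 'Incremental'
```

Azure Data Factory and Azure Automation typically need their own deployment steps on top of this (for example ARM templates exported from the ADF UI, or PowerShell to import runbooks), so treat the above purely as a skeleton.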
Related
I have a Hybris repo on Azure DevOps and want to deploy it to SAP Commerce Cloud (CCv2).
I'm new to Azure DevOps, so I'm not sure how to proceed or how to connect from Azure DevOps to CCv2.
I have already created "Staging" and "Prod" environments on CCv2, and for now I want to deploy only to Staging.
If someone has built this kind of pipeline before, could you explain the steps one by one?
For example, how do I connect from Azure DevOps to CCv2? With commands, security files, etc.?
Thanks!
I am looking for the best way to create a CI/CD pipeline for GCP BigQuery using Azure DevOps.
I need repositories in Azure DevOps with automatic CI and CD to our Dev/QA and Prod environments in GCP. Also, how do I set up automatic builds in Azure DevOps for the master branch? Is there a way to integrate BigQuery and Azure DevOps repositories?
We have to use Azure DevOps for project, repository and pipeline management, so I am looking for ways to build and deploy to GCP BigQuery from Azure DevOps. Any insights would be helpful.
First of all, congratulations on your first post.
Yes, you can definitely integrate Azure DevOps with BigQuery.
A good read here.
But from your question I could not tell what you are trying to deploy. Is it client code that connects to BigQuery to run queries and fetch data, or is it a Dataflow job that starts by ingesting data?
There are different ways to do each of those; one option for the first case is sketched below.
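If the repository mostly holds SQL (views, table definitions, transformation queries), one common pattern is to keep the .sql files in the Azure DevOps repo and have a pipeline step apply them with the bq CLI. A rough sketch, assuming a service-account key uploaded as a secure file called gcp-sa-key.json, a project my-gcp-project and SQL files under sql/; all of these names are placeholders, and the Google Cloud SDK is assumed to be available on the agent.

```yaml
# Sketch only: apply SQL files from the repo to BigQuery on every push to master.
# 'gcp-sa-key.json', 'my-gcp-project' and the sql/ folder are placeholder names,
# and the Google Cloud SDK (gcloud/bq) must be installed on the agent.
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Service-account key uploaded under Pipelines > Library > Secure files
  - task: DownloadSecureFile@1
    name: gcpKey
    inputs:
      secureFile: 'gcp-sa-key.json'

  - script: |
      # Authenticate the agent against GCP with the service account
      gcloud auth activate-service-account --key-file="$(gcpKey.secureFilePath)"
      gcloud config set project my-gcp-project

      # Apply every SQL file in the repo; the files are assumed to use fully
      # qualified table names (project.dataset.table) so no default dataset is needed
      for f in sql/*.sql; do
        echo "Applying $f"
        bq query --use_legacy_sql=false < "$f"
      done
    displayName: 'Deploy SQL to BigQuery'
```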
I would like to have some organisation-specific configuration settings in Azure DevOps that can be consumed by Azure Pipelines. So far I have only seen the "Library", which is project-specific.
Is there anything I can use at the organisation level?
Thank you
I am afraid there is no out-of-the-box option in Azure DevOps to define variable groups at the organization level; variable groups are scoped to the team project.
There is already a suggestion for this in the official Azure DevOps feature suggestion forum: Library of Variable and Task groups shared between all projects. You can comment on it and vote for it there. The PM and product group review these suggestions regularly and consider them for the roadmap.
As a workaround, you can use the Azure Key Vault task to pull data from Azure Key Vault. You can have as many vaults as you like and use them in one or all projects; a minimal example is shown below.
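For reference, this is all it takes in a YAML pipeline; shared-connection and org-shared-kv are placeholder names for a service connection and a vault that every project is allowed to use:

```yaml
# Sketch only: read organization-wide settings from a shared Key Vault.
# 'shared-connection' and 'org-shared-kv' are placeholder names.
steps:
  - task: AzureKeyVault@2
    inputs:
      azureSubscription: 'shared-connection'  # service connection with read access to the vault
      KeyVaultName: 'org-shared-kv'
      SecretsFilter: '*'                      # or a comma-separated list of secret names

  # Each secret is now a pipeline variable (masked in logs), e.g. a secret named 'ApiBaseUrl'
  - script: echo "Deploying against $(ApiBaseUrl)"
    displayName: 'Consume shared setting'
```

Changing a value in the vault then updates every pipeline that reads it, which is about as close to an organization-level library as you can get today.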
I am trying to figure out how to import my external test results into Azure DevOps.
This article describes how to publish the results within the same pipeline, but that's not going to help me. We do use Azure Pipelines to build and deploy our solution, but the testing runs after that (in a deployed environment), so outside of the pipeline. We can collect the results (in a format Azure DevOps understands) and would like to feed them back into the Azure DevOps release that did the deployment.
All tips are welcome.
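One pattern that works with only built-in pieces: have the test environment drop its result files (JUnit/NUnit/VSTest XML) somewhere a pipeline can reach, then run a small follow-up stage or pipeline that publishes them with the PublishTestResults task. A sketch with placeholder paths follows; tying the run to the exact release that did the deployment is the harder part, and for that the Test Runs REST API is the usual route.

```yaml
# Sketch only: publish externally produced test results back into Azure DevOps.
# Assumes the external test environment has copied its XML result files into a
# 'results' folder that this stage can reach (e.g. via an artifact or a file share).
steps:
  - task: PublishTestResults@2
    inputs:
      testResultsFormat: 'JUnit'              # or 'NUnit', 'VSTest', 'xUnit'
      testResultsFiles: 'results/**/*.xml'
      searchFolder: '$(System.DefaultWorkingDirectory)'
      mergeTestResults: true
      testRunTitle: 'External environment tests'
```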
How do I migrate on-premises Jenkins jobs to Azure DevOps? Are any plugins available?
I know how to create an Azure DevOps pipeline manually.
There is some available guidance from Microsoft on migrating from Jenkins to YAML builds in Azure Pipelines, but there are no plug-ins or tools that will automate this for you at the moment.
There is, however, a way to trigger Jenkins builds from Azure Pipelines using the Jenkins Integration extension. This may be a good option as you transition.
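Once a Jenkins service connection is configured, an existing Jenkins job can be queued from an Azure Pipelines YAML build with the Jenkins Queue Job task; a sketch, where my-jenkins-connection and nightly-build are placeholder names:

```yaml
# Sketch only: queue an existing Jenkins job from an Azure Pipelines build
# while jobs are gradually migrated. 'my-jenkins-connection' is a Jenkins
# service connection; 'nightly-build' is the name of the Jenkins job.
steps:
  - task: JenkinsQueueJob@2
    inputs:
      serverEndpoint: 'my-jenkins-connection'
      jobName: 'nightly-build'
      captureConsole: true       # stream the Jenkins console output into the pipeline log
      capturePipeline: true
```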
No, there are no plugins available for that.