Deploy SSIS Changes Only through Azure DevOps Pipeline

Recently I created a pipeline in Azure DevOps to deploy SSIS packages automatically. I was able to complete it successfully, but I'm worried whether the complete solution will be deployed every time or just the package that was modified most recently (the way we do it through Visual Studio).
Or do we have an option to deploy only the changes rather than the whole solution?
When I say changes, I mean only the modified objects.
My solution has multiple packages, but I usually change only one package at a time.
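For context: with the project deployment model, deploying an .ispac always pushes the entire project to the SSIS catalog, not just the packages that changed, so a pipeline built around the .ispac redeploys the whole solution every time. (Since SQL Server 2016 the catalog also supports incremental deployment of individual .dtsx packages.) A minimal sketch of a silent catalog deployment step, assuming the agent has ISDeploymentWizard.exe installed and using hypothetical server, folder, and project names:

steps:
- script: >
    ISDeploymentWizard.exe /Silent
    /SourcePath:"$(Pipeline.Workspace)\drop\MyProject.ispac"
    /DestinationServer:"MySqlServer"
    /DestinationPath:"/SSISDB/MyFolder/MyProject"
  displayName: Deploy SSIS project (whole .ispac)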

Related

Pipeline on Azure DevOps is failing

A month ago we had a (large) solution on .NET Framework 4.7.2. It was building fine in an Azure DevOps pipeline.
We have now ported the solution to .NET 5.
Everything works in Visual Studio, but on Azure DevOps the pipeline is failing.
We had to change our Microsoft.Interop.Word (and Excel, and Outlook) references to COM references, because .NET 5 is multiplatform and the interop assemblies are not.
Because we removed the NuGet packages and changed to COM references, the pipeline is failing.
Does anyone know how to handle this specific problem?
We can't remove Interop.Excel and the others from our projects because they depend on them.
It feels like we have tried everything to make it work again on Azure DevOps.
Have you considered a self-hosted agent? You have a requirement on an external library (the Microsoft Office interop assemblies in this case), and I don't think the Microsoft-hosted Azure DevOps pipeline agents support that currently.
With a self-hosted agent, you install the PIAs and point your library/COM references at those paths.
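As a rough sketch (the pool name is hypothetical, and the agent machine is assumed to have Visual Studio and the Office PIAs installed), the pipeline would target the self-hosted pool and build with Visual Studio's MSBuild, since COM reference resolution is not supported by the .NET Core MSBuild that dotnet build uses:

pool:
  name: MyOnPremPool   # hypothetical self-hosted pool; the agent has Office PIAs and Visual Studio installed

steps:
- task: VSBuild@1      # full MSBuild can resolve COMReference items; the dotnet CLI cannot (error MSB4803)
  inputs:
    solution: '**/*.sln'
    configuration: 'Release'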
There may be a problem with assembly references in the code. Your code may build in your local environment because it can resolve all the referenced assemblies, but when you check the code in, the pipeline cannot fetch those assemblies through NuGet package restore. If you are referencing an assembly from your local machine, make sure you add its NuGet package reference to the packages.config file so that nuget restore can restore the package.

Confusion on Azure DevOps pipelines

I've recently been working on switching from on-premises TFS to Azure DevOps and trying to learn more about the different pipelines, and I think I may have had my build pipeline do too much.
Currently I have my build pipeline do the following:
Get Source code from Repo
Run database scripts/deploy dacpacs
Copy files over to virtual machines that have web application set up already
Run unit/integration tests
Publish the test results
I repeat these steps, with small variations, multiple times: once for the develop branch and once each for the current and previous release branches.
But if I want to take advantage of the Releases and Deployments areas, what would that really get me?
It looks like it would make it easier to say: yes, this code did make it out to this dev/beta environment.
I'm working with ColdFusion code that includes some .NET web services within the repo. Would I have to make an artifact that zips up the repo and then deploys it, or is there a better way to take advantage of the release pipeline?
It's not necessary to make an artifact that zips up the repo and then deploys it. There are several types of tools you might use in your application lifecycle process to produce or store artifacts. For example, you might use version control systems such as Git or TFVC to store your artifacts. You can configure Azure Pipelines to deploy artifacts from multiple sources. Check the following link for more details:
https://learn.microsoft.com/en-us/azure/devops/pipelines/release/artifacts?view=azure-devops#sources
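For instance, if you did want a zipped copy of the repo as a build output, here is a minimal sketch (the artifact name is arbitrary) that publishes the checked-out sources as a pipeline artifact for a release pipeline to consume:

steps:
- publish: '$(Build.SourcesDirectory)'   # YAML shorthand for the Publish Pipeline Artifact task
  artifact: site                         # arbitrary name the release pipeline will reference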

How can I configure an Azure DevOps Release Pipeline to package PowerShell scripts?

I'm new to Azure DevOps and I'm trying to understand how to package a release of a PowerShell script project I'm working on.
I'm previously familiar with GitHub and the manual process for drafting a new release of my project repo. I'm now experimenting with Azure DevOps, and what I want to achieve is similar output to GitHub, where my repo of PowerShell scripts is packaged into a zip file which I can publish as a release.
I'm not familiar with the pipeline process in Azure DevOps or YAML, being new to proper release-cycle tools. Previously I've just created scripts and shared them as they are, or dropped them into a GitHub repo and manually packaged a release. I'm not likely to be turning out large numbers of builds, so I've never had to come at this from an automated standpoint, which seems to be the way Azure is driving me, unless I'm missing something.
It's pretty simple. I prefer to do this using the old-fashioned GUI (hint: there is a link when starting a new Build Pipeline that says Use the classic editor), and then convert to YAML after I get my Build Pipeline working.
1) Create your standard Build Pipeline.
2) Add the step to zip your files (the Archive Files task).
3) Set the properties on that Archive step: specify the source to zip and the target where you want the zip file to end up.
4) Lastly, convert that single step to a YAML step by clicking the View YAML link in the upper-right corner.
There are a lot of steps I am leaving out, but I hope this points you in the right direction.
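Once converted, the YAML for steps 2 through 4 might look roughly like this (the source folder and artifact names are placeholders):

steps:
- task: ArchiveFiles@2                   # zip the PowerShell scripts
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/scripts-$(Build.BuildId).zip'
- task: PublishBuildArtifacts@1          # make the zip available to a release pipeline
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: drop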

Deploy SSDT package via VSTS and cannot Drop objects?

I am successfully deploying an SSDT package in my VSTS release. When publishing directly from Visual Studio, there is a flag "drop_objects_not_in_source" under the advanced settings, but I cannot figure out how to introduce this in my CI/CD pipeline. There are a bunch of refactoring tools, but none address this, so I am stuck with objects on the SQL Server that have been deleted from the source.
Any suggestions?
Pass the command line argument /p:DropObjectsNotInSource=true when publishing the database or use a publish profile that contains the setting.
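For example, here is a sketch using SqlPackage.exe directly (the dacpac name and connection string are placeholders); in a release task such as the Azure SQL Database deployment task, the same switch usually goes in the task's additional SqlPackage.exe arguments field:

SqlPackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetConnectionString:"Server=myserver;Database=MyDatabase;Integrated Security=true" /p:DropObjectsNotInSource=True

Be aware that this setting also drops users and permissions that are not in the source, so test it against a non-production copy first.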

Automated Building and Release Management VS2012

Trying to make my life easier: currently we have 4 developers working in Visual Studio 2012, and we are using TFS 2012 for source control. The project we work on is a multi-tenant web application (single source directory with multiple databases) that is a mixture of legacy ASP and VB6 COM components coupled with new C# code. We use TFS for source control and for managing user stories and bugs. Because of the way our site works, it cannot be run or debugged locally, only on the server.
Source control is currently set up with a separate branch for each developer, whose working directory is mapped to a shared network path on the dev server with a website pointed to it in IIS (Dev01-Dev05, etc.). The developers work on projects in their branch, test them using their dev website, then check changes into their own branch and merge those into the trunk. The trunk's workspace is mapped to the main dev website so that the developers can test their changes against the other customers' dev domains, to cover customizations and variances in functionality based on the specific databases they are connected to.
Very long explanation, but basically each dev has a branch and a site, which are then merged into the trunk with its own site.
In order to deploy to our staging server:
I compile the trunk's website via a .bat file on the server.
I run a Windows app I built to query TFS for changesets associated with specific work items in a certain status, and copy all the files for those changesets from the publish folder to a deployment folder.
I run another .bat file on the server to use Red Gate's Deployment Manager to create a package from those new files.
I go to the Deployment Manager site on our network to create and deploy that release (I haven't been able to get the command-line tools to work for this, so I have to do it manually).
I run any SQL scripts that have been saved off in folders that match ticket numbers on each database (10 or so customer databases) to support the release.
I have tried using the TFS automated build tooling and never really got it to build the website correctly. I played around with CruiseControl as well, with little success. Using a mishmash of skunkworks projects to do this is very time-consuming and unreliable at best.
My perfect scenario would be:
Gated check-in: attempt a build/publish every time a developer merges into the trunk; reject and notify the developer if the build fails.
At the end of the day, collect the TFS items in a certain status and deploy the files associated with them to the staging site.
Deploy the SQL scripts for those TFS items across all the customer databases in staging.
Eventually, run automated regression UI tests, and create new work items or email the devs if they fail.
Update the TFS work items to a new state so QA/customers know their items are ready to test in our staging environment.
Send a report of which items were deployed successfully.
How can I get there so that I am not spending hours preparing and deploying releases to staging and eventually production? I'm pretty open to potential solutions; the thing that would be hard to change is the source control we are using. We can't really switch to Subversion or something else, so we are pretty stuck with TFS.
Thanks
I went back in and started trying to get TFS to build/publish my web solution, and I was able to get a build to complete successfully. Adding the MSBuild argument /p:DeployOnBuild=True and setting the MSBuild platform to x86 seemed to do the trick.
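For reference, the equivalent plain MSBuild invocation might look like this (the solution name and configuration are placeholders):

msbuild MySolution.sln /p:DeployOnBuild=True /p:Configuration=Release /p:Platform=x86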
Then I found https://github.com/red-gate/deployment-manager-tfs, which gives you a build process template to do the packaging and deployment using the Red Gate tools. After playing with that for a bit, I finally got it to create, package and deploy my build to our staging environment.
Next up will be modifying the template to run some custom scripts to collect only the correct items to deploy, deploy all the SQL files, and then set the work items to the appropriate statuses after completion.
Really detailed description of your process. Thanks for sharing!
I believe you can set up TFS with gated check-in on a single branch; if you can set that up on the trunk, it would make sure the merges built successfully. That could trigger MSBuild, if you can get that working, or a custom build job.
If you can get that working, then you'd be able to use the trunk code as the artifact to send to Deployment Manager. That avoids having to assemble the files for deployment from the TFS changesets, since you'd be confident that the trunk always builds.
Are you using Deployment Manager to deploy the database from source control as well as the application?
That could be a way to further automate the process. SQL Source Control and SQL CI allow you to source control the structure of a database, keep a database up to date on each check-in, and run database unit tests. They also produce database packages for Deployment Manager, so you can deploy a release that contains both the application and the database.
If you want to send me the command you're using in step 4 to deploy the release using Deployment Manager I can help out with that. The commands I use are:
DeploymentManager.exe --create-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1
DeploymentManager.exe --deploy-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1 --deployto=CI-Environment-Name
That will create a release version 1.1 using the latest available packages for that project. You can optionally specify the package to be used when creating the release with:
--packageversion=<package name>=<version>
--packageversion="application=1.5"