How do I manually deploy a specific TFS changeset using TeamCity?

I have the following setup in TeamCity:
A) CI upon check-in with build, tests and deploy to test environment.
B) Manual deploy
Deployment is based on an MSBuild build task with MsDeployServiceUrl parameters.
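Effectively the MSBuild step runs something along these lines (project name, server URL, site name and credentials are placeholders, not the real values; %deploy.password% is a TeamCity parameter):
msbuild MyWebApp.csproj /p:Configuration=Release /p:DeployOnBuild=true /p:DeployTarget=MSDeployPublish /p:MsDeployServiceUrl=https://testserver:8172/msdeploy.axd /p:DeployIisAppPath="MyWebApp" /p:MSDeployPublishMethod=WMSVC /p:AllowUntrustedCertificate=true /p:UserName=deployuser /p:Password=%deploy.password%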
My VCS is TFS and I'm not using any build scripts.
A is working great; what I can't figure out is B.
Now, what I would like B to do, when triggered manually, is: get the source from TFS at the changeset of the latest pinned build of A, build it (using an MSBuild build step with /p:Configuration=Release) and deploy it to a production server.
My question here is: how do I get the source for a specific TFS changeset based on the build ID of the last pinned build of A? Is that even possible?
I have read the documentation on custom builds and artifacts: http://confluence.jetbrains.net/display/TCD65/Triggering+a+Custom+Build
But I could not figure out how to use it in my scenario, and I am not even sure if this is the road to take for this specific need.
Any ideas on how to do this would be greatly appreciated.

Ok, I figured it out.
You need to set "Artifacts path" in build configuration A to something like:
/**/* => Src
And then in build configuration B, set "Artifact Dependencies" to something like this:
"Artifacts path" = Src/**/*
"Destination path" = Builds/Release
And finally, use this "Destination path" as a prefix when adding the "Build file path" in the MSBuild build step on B.
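For example (the solution name is hypothetical): with the settings above, A's sources end up under Builds/Release on B's agent, so the MSBuild step on B points at something like
Build file path: Builds/Release/Src/MySolution.sln
(or Builds/Release/MySolution.sln, depending on how the artifact paths map; browse the downloaded artifacts once to see the exact layout to put after the prefix).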
Works like a charm every build! :-)

Related

Dependencies between BuildConfigurations in TeamCity when deploying

I'm having a hard time figuring out how to correctly deploy to different environments with TeamCity (in terms of cross-BuildConfiguration dependencies) and hope to get some input on how to configure my SubProjects/BuildConfigurations properly. Let's start with a concrete example: I made this test project, "TeamcityConfigurationTests", to better learn how TeamCity handles dependencies, and its current state shows the result I am looking for.
I have 3 subprojects, Dev, Test and Prod, with all associated tasks for those "environments" as separate build configurations within each subproject. This is to more clearly visualize what is going on and, if anything breaks, to be able to see immediately what is broken (separate Build, UnitTest and DeployToDev build configurations, rather than 3 different steps in one single build configuration).
Ideally, I only want to build my application once, in the Dev.Build step, and let the Dev.UnitTest and Dev.DeployToDev steps grab that artifact to run tests and deploy. That part I have working, by using snapshot and artifact dependencies. But I am having trouble getting the correct artifact when I want to deploy from Dev -> Test or Test -> Prod.
My issue is how to correctly reference the latest successfully DEV-deployed artifact when running Test.DeployToTest, and likewise the latest successfully TEST-deployed artifact when running Prod.DeployToProd. (Essentially I want to promote the artifact to the next environment.)
Now, my issue is: if Test.DeployToTest has a SnapshotDependency on Dev.DeployToDev and an artifact dependency on Dev.Build, and the VCS source has changed since Deploy to Dev last ran, it triggers running all the DEV steps again. And that is not the worst part: the same happens when I run Prod.DeployToProd if the VCS source changed since the initial build on Dev (because of all the snapshot dependencies). Meaning that rather than promoting Test -> Prod, I build and deploy whatever is currently in VCS to Dev, Test AND Prod.
How am I supposed to set this up correctly?
The only other option I am aware of is letting Dev.DeployToDev also publish the same artifact, and only having a (LatestSuccessful) artifact dependency in Test.DeployToTest. I would also have to publish the artifact again in Test.DeployToTest, so that Prod.DeployToProd can have only a (LatestSuccessful) artifact dependency on Test.DeployToTest. (This would be to get rid of the SnapshotDependencies causing previous environments to run build/deploy again in case of VCS changes.) But then I am publishing the artifact 3 times, rather than just once when the application is originally built in DEV, which I would like to avoid. Also, I have cases where no artifact is needed for deploying to Test and Prod, so there is no artifact to depend on (essentially I only need the BuildNumber from the "dependent" environment I want to promote from).
I hope for some input. Thank you
Regards
Frederik
For anyone wondering, I made a JetBrains support ticket and got the following response:
Basically, there are two options to resolve your case:
Option 1: use "Promote" action form the build's Actions top-right menu
(or change the type of the Deploy* configurations to deployment and
use the action from the block on the build results. This is the
preferred way: before deploying you select the build to deploy and
"promote" it to the next environment. There is also an experimental
hidden feature to hide the "Run" button: add
"teamcity.ui.runButton.caption" configuration parameter in the build
configurations to empty value.
Option 2: do not use a snapshot dependency, use only an artifact dependency on the latest successful build. However, when you run the build you cannot be sure that the last successful build you see will be the one deployed: while the build is standing in the queue, another Dev.DeployToDev can finish and then be deployed as the last successful one.
We went with option 1.
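For reference, the setup we ended up with looks roughly like this (artifact paths and names are illustrative only, not our exact values):
Dev.Build
  Artifact paths:        bin/Release/** => app.zip
Dev.DeployToDev (type: deployment)
  Snapshot dependency:   Dev.Build
  Artifact dependency:   Dev.Build, "Build from the same chain", app.zip!** => deploy
Test.DeployToTest (type: deployment)
  Snapshot dependency:   Dev.DeployToDev
  Artifact dependency:   Dev.Build, "Build from the same chain", app.zip!** => deploy
To promote, we open the Dev.DeployToDev build we want in Test, choose Actions -> Promote... and pick Test.DeployToTest (and the same again from Test to Prod). The snapshot dependencies then reuse the original Dev.Build from that chain instead of rebuilding whatever is currently in VCS.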

Deploying from Appveyor to Nuget only on changes in a particular folder

I have a .NET Core project that is auto-built in AppVeyor and deployed to NuGet. By default, every successful build causes a new NuGet release.
However, there are many cases when a new release is meaningless because the library's actual code has not changed:
Readme updated
Unit tests added
Appveyor configuration changed
Other cases
Is it possible to configure the build so that NuGet publishing only runs if there are changes in the actual code (for example, in folder X)?
There are a few options.
Commit filtering. Note that with this approach the whole build, not just the deployment, will be skipped if nothing in folder X changed. You may still need a build without deployment, at least when unit tests are added. As a workaround, consider adding a separate AppVeyor project which builds and deploys only if folder X changed, and keep the current project building every time but not deploying (see the sketch after this list).
Inspect changed files with a script. Please check this sample on how to check those files if you use GitHub. If you see that files in folder X changed, you can set some custom environment variable (let's say you call it deploy_nuget) to true and use it with a conditional deployment.
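A rough appveyor.yml sketch of both options (folder, project and variable names are placeholders, and the git diff check is a simplification of the linked GitHub API sample):

# Option 1: skip the whole build unless folder X changed
only_commits:
  files:
    - src/MyLibrary/

# Option 2: always build, deploy conditionally
environment:
  deploy_nuget: false
install:
  - ps: |
      $changed = git diff --name-only HEAD~1 HEAD
      if ($changed -match '^src/MyLibrary/') {
        Set-AppveyorBuildVariable -Name 'deploy_nuget' -Value 'true'
      }
deploy:
  provider: NuGet
  api_key:
    secure: <encrypted API key>
  on:
    deploy_nuget: true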

Copying files and deploying to Azure without building using Visual Studio Team Services

I'm attempting to deploy a web site to Azure using VSTS. Basically, I commit code to the Git repo and have it set up to run CI, so it begins building as soon as I commit. However, once it hits the release section, it never copies the code to the Azure web app; rather, it gives me this line:
Info: Updating file ({projectname}\error.txt).
It doesn't copy the files I changed, but rather always just copies this file. I checked and there is indeed an error.txt file in my website directory in Azure, but it is always blank.
This build/deploy process isn't "standard" because the build step only downloads from source control; it doesn't build anything, because the website isn't a "web application" but rather just a "web site", meaning it doesn't need to be built.
So my build step is as follows:
Get Sources
Run on Agent - this step is empty
So the idea is that it just downloads everything from source control; that's it.
Then, my release step is as follows:
Artefacts are from build step above
deploy to environment 1 (dev)
Azure app service deploy, using "package or folder" as $(System.DefaultWorkingDirectory)/
Any idea what I might be doing wrong here?
So I actually figured this out and will leave this here in case anyone else needs it.
I admit I'm pretty new to the Azure/VSTS world, so maybe someone else is making my mistake as well.
If you don't need to "build" your project, then don't. I resolved it by simply skipping the build step altogether. What I was really after was to just download the files from source control and deploy them as-is.
In your release editor, you can specify which "artifact" you want to use to release, and one of the options is source control, which is what I did.
This would be useful for websites like mine where you don't need to build them (mine is DNN/DotNetNuke, so you don't build it before deploying).
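For anyone replicating this, the release definition ends up looking roughly like this (the values in angle brackets are placeholders):
Artifact
  Source type:       Git / TFS repository (the source control option, no build definition involved)
  Repository:        <the web site repo>
  Default branch:    <branch>
Environment "dev"
  Task: Azure App Service Deploy
    App Service name:    <the web app>
    Package or folder:   $(System.DefaultWorkingDirectory)/<artifact alias>
The folder under the artifact alias holds the checked-out site files, so the deploy just pushes them up as-is.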

How to trigger a particular build before a commit is made to other builds in vsts

I have a build, say "test", under Builds and Releases which is a common/dependent build for other builds like "test1" and "test2". I want the "test" build to run first, before "test1" and "test2" build on every commit, so that they can get the required dependencies from "test". Can you please let me know how this is achievable in VSTS?
Regards,
Shwetha
You can do it through agent phases; simple steps (a rough layout is sketched after these steps):
If you are using TFVC, change the source mappings in Get sources to add the common/dependent project's mapping.
If you are using Git: select the phase and check the Allow scripts to access OAuth token option. (Optional: select the Options tab and change Build job authorization scope to Project collection if the common/dependent project is in another team project.)
Change the Visual Studio Build task to build just the common/dependent project in this phase.
Add another agent phase to build the other projects (e.g. test1, test2).
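Roughly, the build definition then looks like this (solution paths are placeholders):
Agent phase 1
  Get sources           (with the common project's mapping added, if TFVC)
  Visual Studio Build:  Common/test.sln
Agent phase 2
  Visual Studio Build:  test1.sln
  Visual Studio Build:  test2.sln
Phase 2 only starts after phase 1 has finished, so the common "test" build has always run before test1 and test2 are built.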
You may try using an extension like Build Chain, Queue New Build or Parallel Builds. These extensions allow you to launch and wait for other builds in a gated check-in definition for the test1 and test2 projects.

How to get Jenkins to report after the build script fails?

I use rake to build my project, and one of the steps runs the unit, integration and FitNesse tests. If too many of these fail, I fail the rake script.
That part is working fine.
Unfortunately, after the build fails, Jenkins doesn't publish the HTML reports I generated from the unit, integration and FitNesse tests, making it a tad difficult to track down the failure reason.
Am I missing a configuration step to get the reports published?
Is Jenkins supposed to skip the post-build steps when the build fails?
It seems like it does for most of the plugins I am using.
You have to tell Jenkins which artifacts to archive in a post-build step (there is a check box under the general 'Post-build Actions' heading called 'Archive the artifacts'). Important: the artifact path is determined relative to the workspace directory. Make sure that the option 'Discard all but the last successful/stable artifact to save disk space' is not checked.
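A rough sketch of the relevant freestyle-job settings (the report paths are placeholders; point them at wherever rake writes the HTML):
Post-build Actions
  Archive the artifacts
    Files to archive:           reports/**/*
  Publish HTML reports          (from the HTML Publisher plugin, if installed)
    HTML directory to archive:  reports
    Index page[s]:              index.html
    Report title:               Test Reports
Both are post-build actions, so they still run when the build itself is marked as failed (as long as none of the 'only if successful' advanced options are enabled).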
Finally figured it out; one of those "I could have had a V8" moments...
I'm using a rake file to build, and one of its tasks is failing just before some reporting tasks that need to run in order to have the HTML pushed into the correct area to be published.
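For anyone hitting the same thing, a minimal sketch of one way to guarantee the reporting tasks still run when the tests fail (the task names are hypothetical, not the actual ones from my Rakefile):
task :ci do
  begin
    # may raise and fail the build when too many tests fail
    Rake::Task['test:all'].invoke
  ensure
    # always copy the HTML into the folder Jenkins archives/publishes
    Rake::Task['reports:copy'].invoke
  end
end
The build still fails, but the HTML ends up where Jenkins can publish it in the post-build step.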