What are my PowerShell options for determining a path that changes with every build (TeamCity)?

I'm on a project that uses TeamCity for builds.
I have a VM, and have written a PowerShell script that backs up a few files, opens a ZIP artifact that I manually download from TeamCity, and then copies it to my VM.
I'd like to enhance my script by having it retrieve the ZIP artifact (which always has the same name).
The problem is that the download path contains the build number which is always changing. Aside from requesting the download path for the ZIP artifact, I don't really care what it is.
An example artifact path might be:
http://{server}/repository/download/{project}/{build_number}:id/{project}.zip
There is a "Last Successful Build" page in TeamCity that I might be able to obtain the build number from.
What do you think the best way to approach this issue is?
I'm new to TeamCity, but it could also be that the answer is "TeamCity does this - you don't need a PowerShell script." So direction in that regard would be helpful.
At the moment, my PowerShell script does the trick and only takes about 30 seconds to run (which is much faster than my peers that do all of the file copying manually). I'd be happy with just automating the ZIP download so I can "fire and forget" my script and end up with an updated VM.
This seems like the smallest knowledge gap to fill, and retrieving changing path info at run time with PowerShell seems like a pretty decent skill to have.
I might just use C# within PS to collect this info, but I was hoping for a more PS way to do it.
Thanks in advance for your thoughts and advice!
Update: It turns out some other teams had been using Octopus Deploy (https://octopus.com/) for this sort of thing, so I'm using that for now - though it actually seems more cumbersome than the PS solution overall, since at this point it involves logging into the Octopus server and going through a few steps to kick off a new build manually.
I'm also waiting for the TC administrator to provide a webhook or something to notify Octopus when a new build is available. Once I have that, the Octopus admin says we should be able to get the deployments to happen automagically.
On the bright side, I do have the build process integrated with Microsoft Teams via a webhook plugin that was available for Octopus. Also, the developer of Octopus is looking at making a Microsoft Teams connector to simplify this. It's nice to get a notification that a new build is available right in my team chat.

You can try to get your artefact from this URL:
http://<ServerUrl>/repository/downloadAll/<BuildId>/.lastSuccessful
Where BuildId is the unique identifier of the build configuration.
My implementation of this, in PowerShell:
#
# GetArtefact.ps1
#
Param(
    [Parameter(Mandatory=$false)][string]$TeamcityServer="",
    [Parameter(Mandatory=$false)][string]$BuildConfigurationId="",
    [Parameter(Mandatory=$false)][string]$LocalPathToSave=""
)
Begin
{
    $username = "guest";
    $password = "guest";

    function Execute-HTTPGetCommand() {
        param(
            [string] $target = $null,
            [string] $localPath = $null
        )
        $request = [System.Net.WebRequest]::Create($target)
        $request.PreAuthenticate = $true
        $request.Method = "GET"
        $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
        $response = $request.GetResponse()
        # Copy the response stream to disk as binary data; reading it as
        # text (StreamReader + Out-File) would corrupt a ZIP artifact.
        $responseStream = $response.GetResponseStream()
        $fileStream = [System.IO.File]::Create($localPath)
        $responseStream.CopyTo($fileStream)
        $fileStream.Close()
        $responseStream.Close()
        $response.Close()
    }

    Execute-HTTPGetCommand "http://$TeamcityServer/repository/downloadAll/$BuildConfigurationId/.lastSuccessful" $LocalPathToSave
}
And call this with the appropriate parameters.
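For example (the server name, build configuration ID, and output path below are placeholders):
.\GetArtefact.ps1 -TeamcityServer "teamcity.example.com" -BuildConfigurationId "bt123" -LocalPathToSave "C:\temp\artifacts.zip"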
EDIT: Note that the credentials used here are for the guest account. You should check that the guest account has permission to download artifacts, or specify an appropriate account.

Try constructing the URL to download the build artifact using the TeamCity REST API.
You can get a permanent link using a wide range of criteria, like last successful build or last build tagged with a specific tag, etc.
E.g. to get the last successful build you can use something like:
http://{server}/app/rest/builds/buildType:(id:{build.conf.id}),status:SUCCESS/artifacts/content/{file.name}
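For example, a minimal PowerShell sketch (the guestAuth prefix, build configuration ID, and file name are assumptions; adapt them to your server):
# Download the named artifact from the last successful build via the REST API.
$server = "teamcity.example.com"
$url = "http://$server/guestAuth/app/rest/builds/buildType:(id:MyProject_Build),status:SUCCESS/artifacts/content/MyProject.zip"
Invoke-WebRequest -Uri $url -OutFile "C:\temp\MyProject.zip"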

TeamCity has the capability to publish its artifacts to a built in NuGet feed. You can then use NuGet to install the created package, not caring about where the artifacts are. Once you do that, you can install with nuget.exe by pointing your source to the NuGet feed URL. Read about how to configure the feed at https://confluence.jetbrains.com/display/TCD10/NuGet.
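For example, a sketch using nuget.exe (the package ID here is hypothetical, and the exact feed URL depends on your TeamCity version and authentication scheme; see the docs above):
# Install the latest version of the package that the build publishes to the feed.
& nuget.exe install MyProject.Artifacts -Source "http://teamcity.example.com/httpAuth/app/nuget/v1/FeedService.svc/" -OutputDirectory "C:\deploy"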

Read the file content of the path in the TEAMCITY_BUILD_PROPERTIES_FILE environment variable.
Locate the teamcity.configuration.properties.file row in that file; IIRC the value is backslash-encoded.
Read THAT file, locate the teamcity.serverUrl value, and decode it.
Construct the URL like this:
{serverurl}/httpAuth/repository/download/{buildtypeid}/.lastSuccessful/file.txt
Here's an example (C#):
https://github.com/WideOrbit/buildtools/blob/master/RunTests.csx#L272
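And a rough PowerShell sketch of the same steps (the teamcity.buildType.id property key is an assumption; verify the key names against your own properties files):
# Minimal parser for the Java-properties files TeamCity writes;
# values escape ':' and '\' with backslashes.
function Read-TeamCityProperties([string]$path) {
    $props = @{}
    foreach ($line in Get-Content $path) {
        if ($line -match '^\s*#' -or $line -notmatch '=') { continue }
        $key, $value = $line -split '=', 2
        $props[$key.Trim()] = $value.Replace('\:', ':').Replace('\\', '\')
    }
    return $props
}

$build     = Read-TeamCityProperties $env:TEAMCITY_BUILD_PROPERTIES_FILE
$config    = Read-TeamCityProperties $build['teamcity.configuration.properties.file']
$serverUrl = $config['teamcity.serverUrl']
$url = "$serverUrl/httpAuth/repository/download/$($build['teamcity.buildType.id'])/.lastSuccessful/file.txt"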

Related

Enumerate secret variables in Azure Pipelines

I have a build step in Azure Pipelines that takes the variables from Azure Pipelines and uploads them somewhere equally secret. Currently I have about 50 builds, and each build has anywhere between 5 and 20 variables.
Some are secret and some are not. For the non-secret ones I enumerate all the set variables and off I go; but for the secret ones I need to add them to the build step manually. Further, because I am writing them with the same keys, I need to:
Declare the variable in the group, e.g. MyPrefix.MyVar
Edit the build step to say /specialtool --vars=MyPrefix.MyVar=$(MyPrefix.MyVar), which is rather mundane.
I found that I can get a list of variables using the Azure DevOps API, so I thought I could just modify the next build step as the build is running.
However, if I update the same build definition that is currently running (to dynamically write the command), the change is not sent to the agent (rather, it feels like all arguments for tasks are captured when the whole build is triggered). Any thoughts on how I can dynamically enumerate secret variables to feed to my tool?
You can use VSTS Logging Commands to update a variable's value during the build. This makes the updated variable available in the next build task.
Write-Host "##vso[task.setvariable variable=testvar;]testvalue"
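The updated value then surfaces in subsequent tasks as an environment variable (names are uppercased and dots become underscores), e.g.:
# A later PowerShell task can read the value set by the logging command above.
Write-Host "testvar is now: $env:TESTVAR"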
When you create a TypeScript custom task (Node.js based), you can access all the build variables that are available to the build at that point in time through the getVariables API.
This function returns an array of VariableInfo:
/** Snapshot of a variable at the time when getVariables was called. */
export interface VariableInfo {
    name: string;
    value: string;
    secret: boolean;
}
When you create a PowerShell3 custom task, you can access all the build variables that are available to the build at that point in time through the Get-VstsTaskVariable function, which returns a similar object structure to the Node version:
New-Object -TypeName psobject -Property @{
    Name = $info.Name
    Value = Get-TaskVariable -Name $info.Name
    Secret = $info.Secret
}
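So a sketch of enumerating secrets from inside a PowerShell3 custom task might look like this (assuming the VstsTaskSdk module is packaged under ps_modules, as is usual for custom tasks):
# Get-VstsTaskVariableInfo returns name/value/secret for every variable
# visible to the task at this point in the build.
Import-Module .\ps_modules\VstsTaskSdk
Get-VstsTaskVariableInfo | Where-Object { $_.Secret } | ForEach-Object {
    Write-Verbose "Found secret variable: $($_.Name)"
}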
If you need to support TFS 2015 and the 1.x build agents as well, you can use the (now deprecated) PowerShell handler and enumerate the secrets using a custom PowerShell function I describe here.
Each task SDK (TypeScript and PowerShell) supports a function to set variables as well. Here is an example of setting a variable's value in TypeScript:
tl.setVariable(variable, value, isSecret);
And on PowerShell3:
Set-VstsTaskVariable -name $VariableName -value $Value -Secret $IsSecret
And on PowerShell (deprecated):
Write-Host "##vso[task.setvariable variable=$($VariableName);issecret=$($IsSecret)]$Value"
My suspicion is that you'd want to create a single task that reads the variables and invokes the command you mentioned in your original post to then post these variables to the other secret store. It's not recommended to read all the secrets and either store them in non-secret variables or to somehow pass them along to the next task.
So I have been looking at a solution for this too. It appears the only way to do this at the moment is to write a custom task. Within a custom task you can get hold of secret values dynamically.
An example is the 'vsts-replacetokens-task' (https://github.com/qetza/vsts-replacetokens-task/blob/master/task/index.ts)
Internally it uses the vsts task library (vsts-task-lib/task)
(https://github.com/Microsoft/azure-pipelines-task-lib/blob/master/node/task.ts)
This vsts task library exposes methods like getVariables() and getVariable(), which can provide what you need. Unfortunately it's a bit long-winded, but it's the only way that I can see.

Azure Data Factory pipelines are failing when no files available in the source

Currently we do our data loads from an on-premise Hadoop server to SQL DW [via ADF staged copy and an on-premise DMG server]. We noticed that ADF pipelines fail when there are no files in the on-premise Hadoop server location [we do not expect our upstreams to send files every day, so it is a valid scenario to have ZERO files in that location].
Do you have a solution for this kind of scenario?
Error message given below
Failed execution Copy activity encountered a user error:
ErrorCode=UserErrorFileNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot find the 'HDFS' file.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (404) Not Found.,Source=System,'.
Thanks,
Aravind
This requirement can be solved by using the ADF v2 Get Metadata activity to check for file existence and then skipping the copy activity if the file or folder does not exist:
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity
You can change the File Path Type to Wildcard, add the name of the file and add a "*" at the end of the name or any other place that suits you.
This is a simple way to stop the Pipeline failing when there is no file.
Do you have an input dataset for your pipeline? See if you can skip the input dataset dependency.
Mmmm, this is a tricky one. I'll upvote the question, I think.
Couple of options that I can think of here...
1) I would suggest the best way would be to create a custom activity ahead of the copying to check the source directory first. This could handle the behaviour if there isn't a file present, rather than just throwing an error. You could then code this to be a little more graceful when it returns and not block the downstream ADF activities.
2) Use some PowerShell to inspect the ADF activity for the missing file error. Then simply set the dataset slice to either skipped or ready using the cmdlet to override the status.
For example:
Set-AzureRmDataFactorySliceStatus `
    -ResourceGroupName $ResourceGroup `
    -DataFactoryName $ADFName.DataFactoryName `
    -DatasetName $Dataset.OutputDatasets `
    -StartDateTime $Dataset.WindowStart `
    -EndDateTime $Dataset.WindowEnd `
    -Status "Ready" `
    -UpdateType "Individual"
This of course isn't ideal, but would be quicker to develop than a custom activity using Azure Automation.
Hope this helps.
I know I'm late to the party, but if you're like me and running into this issue, it looks like they made an update a while back to allow for no files being found.

Jenkins Powershell Output

I would like to capture the output of some variables to be used elsewhere in the job using Jenkins Powershell plugin.
Is this possible?
My goal is to build the latest tag somehow, and the PowerShell script was meant to achieve that. Outputting to a text file would not help, and environment variables can't be used because the process is seemingly forked, unfortunately.
Besides EnvInject, another common approach for sharing data between build steps is to store results in files located in the job workspace.
The idea is to skip using environment variables altogether and just write/read files.
It seems that the only solution is to combine with the EnvInject plugin. You can create a text file with key-value pairs from PowerShell, then export them into the build using the EnvInject plugin, as sketched below.
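A minimal sketch (the properties file name and the tag lookup are illustrative):
# Write key=value pairs for EnvInject to pick up; point the EnvInject
# build step at this properties file.
$latestTag = git describe --tags --abbrev=0
"LATEST_TAG=$latestTag" | Out-File "$env:WORKSPACE\build.props" -Encoding ASCII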
You should make the workspace persistent for this job; then you can save the data you need to a file. Other jobs can then access this persistent workspace, or use it as their own, as long as they are on the same node.
Another option would be to use Jenkins' built-in artifact retention: at the end of the job's configure page there will be an option to retain files specified by a match (e.g. *.xml or last_build_number). These are then given a specific address that can be used by other jobs regardless of which node they are on; the address can be on the master or the node, IIRC.
For the simple case of wanting to read a single object from PowerShell, you can convert it to a JSON string in PowerShell and then convert it back in Groovy. Here's an example:
def pathsJSON = powershell(returnStdout: true, script: "ConvertTo-Json ((Get-ChildItem -Path *.txt) | select -Property Name)")
def paths = []
if (pathsJSON != '') {
    paths = readJSON text: pathsJSON
}

How to pass a parameter to Chef recipe from external source

I'm new to Chef and seeking help here. I'm looking into using Chef to deploy our builds to Chef node servers (Windows Server 2012 machines). I have a cookbook called copy_builds that goes out to a central repository and selects the build we want to deploy and copies it out to the node server. The recipe I have contains basic steps that perform the copy steps, and this recipe could be used for all builds we want to deploy except for one thing: the build name.
Here is an example of the recipe:
powershell_script 'Copy build files' do
  code '
    $Project = "Dev3_SomeCoolBuild"
    net use "\\\\server\\build_share\\drop\\$Project"
    $BuildNum = GC "\\\\server\\build_share\\drop\\$Project\\buildlabel.txt"
    robocopy \\\\server\\build_share\\drop\\$Project\\bin W:\\binroot\\$BuildNum'
end
As you can see, the variable $Project contains the name of the build in this recipe. If we have 100 different builds, all with different names, then what is the best way to handle this without creating 100 different recipes for my copy_builds cookbook?
BTW: this is how I'm currently calling Chef to deploy, which is in a PowerShell script that's external to Chef:
knife node run_list set $Node "recipe[copy_builds::$ProjectName],recipe[install_build]"
This command (from the external PowerShell script) contains the project/build name within its own $ProjectName variable. In this case $ProjectName contains the value 'Dev3_SomeCoolBuild', to reference the recipe Dev3_SomeCoolBuild.rb.
What I'd like is to have just one default recipe under the copy_builds cookbook and pass in the build/project name. Is this possible? And what is the best way to do it? I've read about data bags, attributes, and providers, but I'm not sure if they would work for what I want.
Please advise.
Thanks,
Keith
The best approach for you is likely to use a single recipe that gets a list of projects to deploy from a databag or node attributes (or both). So basically take what you have now and put it in a loop, and then use either roles to set node attributes or put the project mapping into a databag item.
I ended up using attributes here to solve my problem. I updated my script to write the build name to the attributes/default.rb file for the copy_builds recipe and upload the cookbook to Chef each time a deployment is run.
My recipe now includes a call to the attributes file to get the build name, like so:
powershell_script 'Copy build files' do
  code <<-EOH
    $BuildNum = GC \\\\hqfas302002c\\build_share\\drop\\"#{node['copy_builds']['build']}"\\buildlabel.txt
    robocopy \\\\hqfas302002c\\build_share\\drop\\"#{node['copy_builds']['build']}"\\webbin W:\\binroot\\$BuildNum /E
  EOH
end
And now my call to Chef looks like this:
knife node run_list set $Node "recipe[copy_builds],recipe[install_build]"
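Tying it together, the external PowerShell deploy script might look something like this sketch (paths and names are illustrative):
# Write the build name into the cookbook's default attributes,
# upload the cookbook, then set the run list on the node.
$ProjectName = "Dev3_SomeCoolBuild"
"default['copy_builds']['build'] = '$ProjectName'" | Out-File "cookbooks\copy_builds\attributes\default.rb" -Encoding ASCII
knife cookbook upload copy_builds
knife node run_list set $Node "recipe[copy_builds],recipe[install_build]"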

Accessing Teamcity Artifact with Dynamic name

I want to download a TeamCity artifact via PowerShell. It needs to be the last successful build of a specific branch.
I've noticed two common URL paths for accessing the artifacts. One seems to be:
/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/ARTIFACT_PATH
The problem is that the file name at the end depends on the release version. Within TeamCity there is syntax to specify all files, e.g. *.msi. Is there any way to specify an artifact starting with FileName-{version.number}.msi when accessing this URL?
EDIT:
The other URL I noticed is for the REST API.
http://teamcity/guestAuth/app/rest/builds/branch:[BRANCH],buildType:[BUILD TYPE],status:SUCCESS,state:finished/artifacts/[BUILD PATH]
The problem is that I can't download artifacts from here. If I want to download an artifact I have to use the current build ID. The REST URL above yields the following download URL: /guestAuth/app/rest/builds/id:[Build ID]/artifacts/content/[Artifact Path].
I can use the first REST url to eventually get the second through the returned xml, but would prefer a more straightforward approach.
Unfortunately, as TeamCity artifacts are not browsable, the usual workarounds like wget recursive download or wildcards are not applicable (see "Using wildcards in wget or curl query" and "How do I use Wget to download all Images into a single Folder").
One workaround you could try is formatting the link in the job, saving the link to a text file and storing that as an artifact as well, with a static name. Then you just need to download that text file to get the link.
I found you can format the artifact URL in a TeamCity job (in a command line step) by using:
%teamcity.serverUrl%/repository/download/%system.teamcity.buildType.id%/%teamcity.build.id%:id/<path_to_artifact>
You can write this to a file with:
echo %teamcity.serverUrl%/repository/download/%system.teamcity.buildType.id%/%teamcity.build.id%:id/myMsi-1.2.3.4.msi > msiLink.txt
Now you have an artifact with a constant name, that points to the installer (or other artifact) with the changing name.
If you use the artifact msiLink.txt you don't need to use the REST interface (it's still two calls, both through the same interface).
You can easily download the latest version from batch/cmd by using:
wget <url_server>/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/msiLink.txt --user #### --password ####
set /P msi_url=<msiLink.txt
wget %msi_url% --user #### --password ####
Hope it helps.
Update:
Sorry I just realized the question asked for PowerShell:
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.NetworkCredential("yourUser", "yourPassword")
$WebClient.DownloadFile("<url_server>/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/msiLink.txt", "msiLink.txt")
$msi_link = [IO.File]::ReadAllText(".\msiLink.txt")
$WebClient.DownloadFile($msi_link, "yourPath.msi")