TFS Build vNext - start process after successful build - powershell

On our build server, we have a running process that, among other things, is necessary for running integration tests, as it provides access to a database.
This piece of software can be changed in our repository, and I have therefore created a CI build that builds the changes. What I would then like to do is restart the process with the newly compiled version upon a successful build, but that part I cannot seem to get to work.
I have no issue killing the running process and deploying the result to a specific location on the build server, but seemingly no matter what I try, the process I spawn is killed as soon as the running build ends.
I have tried the following:
With PowerShell
Start-Process <path to file>
With CMD
with 'cmd' as tool and the following arguments:
<path to file>
start <path to file>
/c start <path to file>
cmd <path to file>
cmd start <path to file>
cmd /c start <path to file>
I have also tried simply supplying the .exe path as the tool name with no arguments - no luck either.
How well did it work?
For most of the above approaches, an extra step running the PowerShell command Get-Process <exe name>* confirmed that the process was running. The same step yielded no results after a subsequent step stopped the process so a new, updated one could be started. So I am positive it works; the vNext build simply kills it all after the build ends.
Other solutions
I have two solutions left that I think should work, but both are more complicated than I would like, in that they introduce quite a bit of complexity to the build process that might then go wrong. Here goes:
Set up the build server as a valid target for deployments. I am guessing that I can then use the "PowerShell on Target Machines" step, even though it is targeting itself; I would assume it would then act as a separate process. That requires all sorts of configuration to get in place though, plus writing the remote PowerShell script for the task.
Write a small Windows service that is callable via e.g. REST, which could then start the process, so the new Windows service becomes the owning process. But that just introduces a new layer that might also need to be updated; it could then perhaps be updated manually instead of automatically.
Again, I would rather not use either of the two solutions if a better one exists. :)

At the moment I ended up fixing it by installing AutoIt and adding this simple script to the "deploy" folder:
While 1 ; Opens up a WHILE loop, with 1 as a constant, so it is infinite
If Not ProcessExists("DelphiDebug.exe") Then Run("DelphiDebug.exe") ; if the process of DelphiDebug.exe doesn't exist, it starts it
Sleep (10) ; Puts the script to sleep for 10 milliseconds so it doesn't chew CPU power
WEnd ; Closes the loop, tells it to go back to the beginning
Credit goes to this forum entry where a notepad example was made.
I have then made a PowerShell script that renames the current version, copies the newly built version to the "deploy" folder, stops the running instance, and deletes the renamed version:
# CONSTANTS AND CALCULATED VARIABLES
[string]$deploymentFolder = 'C:\Deployment-folder-on-local-machine'
[string]$deploymentPath = "$deploymentFolder\my-program.exe"
[string]$searchPathExistingVersion = $deploymentPath + '*' # The star is important so Get-Item does not throw an error
[string]$toDeleteExeName = 'please-delete.exe'
[string]$newCompiledVersionFullPath = "$env:BUILD_SOURCESDIRECTORY\sub-path-to-bin-folder\my-program.exe"
[string]$processName = 'my-program'
[string]$searchPathToDeleteVersion = "$deploymentFolder\$toDeleteExeName*" # The star is important so Get-Item does not throw an error

# EXECUTION
Write-Verbose "Search path: $searchPathExistingVersion"
$existingVersion = Get-Item $searchPathExistingVersion
if ($existingVersion) {
    Write-Debug "Match found: $existingVersion"
    Rename-Item $existingVersion.FullName $toDeleteExeName
} else {
    Write-Debug 'No existing file found'
}
Write-Verbose "Copy new version from path: $newCompiledVersionFullPath"
Copy-Item $newCompiledVersionFullPath $deploymentPath
Write-Verbose 'Stopping running processes'
Get-Process ($processName + '*') | Stop-Process # The new version is auto-started by the AutoIt script running in the deployment folder.
Write-Verbose "Deleting old versions with search path: $searchPathToDeleteVersion"
$toDeleteVersion = Get-Item $searchPathToDeleteVersion
if ($toDeleteVersion) {
    Write-Debug "Match found: $toDeleteVersion"
    Remove-Item $toDeleteVersion -ErrorAction SilentlyContinue # Deletion is not critical. Next time the build runs, it will attempt another cleanup.
} else {
    Write-Debug 'No file found to delete'
}
Again, this solution added more complexity to the server, so I am keeping the question open for a few days to see if there are simpler solutions. :)

Instead of starting a process during the build, I think you need to install the process as a Windows service and start it. It makes sense that any process you start in the context of the build would stop at the completion of the build. You can use the command line to install/update/start a Windows service:
sc create [service name] binPath= [path to exe]
https://ozansafi.wordpress.com/2009/01/14/create-delete-start-stop-a-service-from-command-line/
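For example, wired into the deployment folder used above, that could look roughly like this (a sketch only: MyProgramSvc is a made-up service name, and note that a plain console exe must implement the Windows service interface, or be wrapped, before it can actually run as a service):
# Sketch: register and start the deployed exe as a service (run from an elevated PowerShell).
# 'MyProgramSvc' is a made-up name; the path matches the deployment folder from the question.
sc.exe create MyProgramSvc binPath= "C:\Deployment-folder-on-local-machine\my-program.exe"
sc.exe start MyProgramSvc
# On redeploy: sc.exe stop MyProgramSvc, replace the binary, then sc.exe start MyProgramSvc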

Related

Trying to get a Powershell Script that will run in a 2nd window and monitor in real time other running scripts / report all Errors / ExitCodes

I am fairly new to writing code in PowerShell. For my job I have to write multiple PowerShell scripts to make changes in the hardware and software settings, as well as the Registry and Group Policy Editor, to get certain applications to run. These applications are a little older, and upgrading them or the hardware they run on is NOT an option. As an example, when Microsoft releases new patches on Patch Tuesday, there is a high probability that applying them will change something, which is where I come in to write a script to fix the issue. I have multiple scripts that I run. When those scripts are run, they may end up terminating because of an error code or an exit code, and a large part of the time I do not know immediately that the script has failed.
I am trying to figure out a script that I can run in a second PowerShell console window. Its only purpose would be to sit there on the screen, wait, and monitor. Then, when I execute a script or application (the only file extensions I am worried about are EXE, BAT, CMD, and PS1), if the script/application that I just ran ends with an exit code or an error code, it should output that to the screen in REAL TIME.
Below is a small piece of code that kind of works, but it is not what I want.
I have researched online and read tons of material, but I just can't seem to find what I am looking for.
Could someone please help me with getting a script that will do what I want?
Thank you for your help!!!!
# Path to the application to run (this could be an EXE, CMD, BAT, or PS1)
$ExitErrorCode = "C:\ThisFolder\ThatFolder\AnotherFolder\SomeApplication.EXE"
$proc = Start-Process $ExitErrorCode -PassThru
$handle = $proc.Handle # cache proc.Handle so the exit code is still available after the process ends
$proc.WaitForExit()
if ($proc.ExitCode -ne 0) {
    Write-Warning "$ExitErrorCode exited with status code $($proc.ExitCode)"
}
Possible duplicate of the approaches shown here:
Monitoring jobs in a PowerShell session from another PowerShell session
PowerShell script to monitor a log and output progress to another
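For what it's worth, one simple pattern along the lines of those links (a sketch, not taken from them) is to have a small wrapper append each exit status to a shared log, and have the second console tail that log in real time:
# watcher.ps1 - run in the second console; tails the shared status log in real time
$log = "$Env:TEMP\script-status.log"
if (-not (Test-Path $log)) { New-Item $log -ItemType File | Out-Null }
Get-Content $log -Wait -Tail 0

# wrapper snippet - run in the first console around each script/application
# (for .PS1 files, launch via powershell.exe -File instead of the file directly)
$proc = Start-Process "C:\ThisFolder\ThatFolder\AnotherFolder\SomeApplication.EXE" -PassThru -Wait
"$(Get-Date -Format s) SomeApplication.EXE exited with code $($proc.ExitCode)" |
    Add-Content "$Env:TEMP\script-status.log"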

VSTS build definition - prevent PowerShell exit behavior causing processes termination

I have a PowerShell task in my definition that calls another script file, which in turn takes care of running several things on my build agent (it starts several different processes): emulators, Node.js applications, etc.
Everything is fine up until the moment this step is done and the run continues. All of the above-mentioned stuff gets closed, with most of the underlying processes killed; thus, any further execution (e.g. the test run) is doomed to fail.
My assumption is that these processes are somehow dependent on the outermost (temporary) script that VSTS generates to process the step.
I tried with the -NoExit switch specified in the arguments list of my script, but to no avail. I've also read somewhere a suggestion to set this by default with a registry key for powershell.exe - still nothing.
The very same workflow was okay in Jenkins. How can I fix this?
These are the tasks I have:
The last PowerShell task calls a specified PowerShell file, which calls several others on its own. These set up the local dependencies and processes needed to start executing the tests, e.g. a running Node.js application (started in a separate console, for example, and running fine).
When the task is done and successful, the last task with the tests fails because the Node.js application has been shut down, along with anything else that was started within the previous step. It just stops everything. That's why I'm currently running the tests within the same task until I find out how to overcome this behavior.
I am not sure how you call the dependencies and applications in your PowerShell script, but I tried the following command in a PowerShell script task to run a Node.js application:
Invoke-Expression 'cmd /c start powershell -Command {node main.js}'
The application keeps running after the PowerShell script task has finished, which should meet your requirement. Refer to this question for details: PowerShell launch script in new instance.
But you need to remember to close the process after the test is finished.
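For example, a later cleanup task could stop it by process name (a sketch; it assumes the Node.js process is still identifiable as node):
# Cleanup sketch: stop the Node.js process started by the earlier task
Get-Process node -ErrorAction SilentlyContinue | Stop-Process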
There is also the Continue on error option (Control Options section). The build process will continue if it is checked, but the build result will be partially succeeded.
You can also output errors or warnings by using PowerShell or VSTS task commands (uncheck the Fail on Standard Error option in the Advanced section) and terminate the current PowerShell process by using the exit keyword, for example:
Write-Warning "warning"
Write-Error "error"
Write-Host "##vso[task.logissue type=warning;]this is the warning"
Write-Host "##vso[task.logissue type=error;sourcepath=consoleapp/main.cs;linenumber=1;columnnumber=1;code=100;]this is an error"
For more information about the VSTS task commands, see: Logging Commands

Azure Cloud Service Startup task that needs to run a PowerShell script

All,
Note: I have updated the question after some feedback.
Thanks to @jisaak for his help so far.
I have the need to run a PowerShell script that adds TCP bindings and some other stuff when I deploy my Cloud Service.
Here is my Cloud Service Project and Webrole project:
Here is my task in ServiceDefinition.csdef:
And here is the PowerShell script I want to run:
here is my attempt at the Startup.cmd:
When I deploy I get this in the Azure log:
And this in the powershell log:
Any help would be very much appreciated.
I think I am nearly there, but following other people's syntax on the web doesn't seem to get me there.
thanks
Russ
I think the issue is that the working directory of the batch command interpreter when it runs Startup.cmd is not what you expect.
The Startup.cmd is located in the \approot\bin\Startup directory but the working directory is \approot\bin.
Therefore the command .\RoleStartup.ps1 is not able to find the RoleStartup.ps1 as it is looking in the bin directory not in the bin\Startup directory.
Solutions I know to this are:
Solution 1:
Use ..\Startup\RoleStartup.ps1 to call the RoleStartup.ps1 from Startup.cmd.
Solution 2:
Change the current working directory in Startup.cmd so that the relative path .\RoleStartup.ps1 is found. I do this by CHDIR %~dp0 (see here) to change into the directory that contains Startup.cmd.
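A minimal sketch of what that Startup.cmd could look like (RoleStartup.ps1 is the script from the question; the PowerShell invocation mirrors the one suggested in the answer further down):
REM Startup.cmd - switch into the directory containing this script so the
REM relative path to RoleStartup.ps1 resolves
CHDIR /d %~dp0
PowerShell -ExecutionPolicy Unrestricted -File .\RoleStartup.ps1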
Solution 3:
As Don Lockhart's answer suggested, do not copy the Startup directory to the output; instead, leave it set as "Content" in the Visual Studio project. This means the files within it will exist in the \approot\Startup directory on the Azure instance. (You would then want to make sure that the Startup folder is not publicly accessible via IIS!) Then update the reference to Startup.cmd in ServiceDefinition.csdef to ..\Startup\Startup.cmd, and update the reference to RoleStartup.ps1 in Startup.cmd to ..\Startup\RoleStartup.ps1. This relies on the fact that the working directory is bin and uses ..\Startup to locate the Startup directory relative to it.
You don't need to set the execution policy within your cmd - just call the script. Also, you should use a relative path, because you can't rely on the C: drive being there.
Change your batch to:
powershell -executionpolicy unrestricted -file .\RoleStartup.ps1
Right-click on RoleStartup.ps1 and Startup.cmd in Visual Studio and ensure that Copy to Output Directory is set to Copy always.
If this still doesn't work, remove the startup call in your csdef, deploy the service, RDP into it and try to invoke the script yourself to retrieve any errors.
Edit:
Try to adapt your script as below:
Import-Module WebAdministration
$site = $null
do # gets the first website until the result is not $null
{
    $site = Get-WebSite | select -First 1
    Sleep 1
}
until ($site)

# get the appcmd path
$appcmd = Join-Path ([System.Environment]::GetFolderPath('System')) 'inetsrv\appcmd.exe'

# ensure the appcmd.exe is present
if (-not (Test-Path $appcmd))
{
    throw "appcmd.exe not found in '$appcmd'"
}

# The rest of your script ....
I've found it easier in the past to not copy the content to the output directory. I have approot\bin as the working directory. My startUp task element's commandLine attribute uses a relative reference to the .cmd file like so:
The .cmd file references the PowerShell script relatively from the working directory as well:
PowerShell -ExecutionPolicy Unrestricted -f ..\StartUp\RoleStartup.ps1
OK, so I am coming back to this after many different attempts to make it work.
I have tried using:
Startup config in the ServiceDefinition.csdef
I have tried registering a scheduled task on the server that scans the Windows Azure log looking for [System[Provider[#Name='Windows Azure Runtime 2.6.0.0'] and EventID=10004]]
Nothing worked, either due to security or due to the timing of events, with IIS not being fully set up yet.
So I finally bit the bullet and used my Webrole.cs => public override bool OnStart() method:
Combined with this in the ServiceDefinition.csdef:
Now it all works. This was not the most satisfying result, as some of the other approaches felt more elegant. Also, many others posted that they got the other approaches to work. Maybe I would have got there eventually, but my time was restricted.
thanks
Russ

How do I detect if the test run was successful in a Team Build 2013 Post-Test script?

I have a build configuration in TFS 2013 that produces versioned build artifacts. This build uses the out-of-the-box process template workflow. I want to destroy the build artifacts in the event that unit tests fail, leaving only the log files. I have a Post-Test PowerShell script. How do I detect the test failure in this script?
Here is the relevant cleanup method in my post-test script:
function Clean-Files($dir){
    if (Test-Path -path $dir) { rmdir $dir\* -recurse -force -exclude logs,"$NewVersion" }
    if (0 -eq 1) { rmdir $dir\* -recurse -force -exclude logs } # placeholder condition: this is where the test-success check should go
}
Clean-Files "$Env:TF_BUILD_BINARIESDIRECTORY\"
How do I test for test success in the function?
(Updated based on more information)
The way to do this is to use environment variables and read them in your PowerShell script. Unfortunately, the PowerShell scripts are run in a new process each time, so you can't rely on the environment variables being populated.
That said, there is a workaround so you can still get those values. It involves calling a small utility at the start of your powershell script as described in this blog post: http://blogs.msmvps.com/vstsblog/2014/05/20/getting-the-compile-and-test-status-as-environment-variables-when-extending-tf-build-using-scripts/
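Once that utility has run, the post-test script can branch on the populated variables. In sketch form (the variable name TF_BUILD_TESTSTATUS here is hypothetical - substitute whichever names the utility actually sets):
# Hypothetical variable name - use whatever the utility from the blog post sets
if ($Env:TF_BUILD_TESTSTATUS -ne 'Succeeded') {
    # Tests failed: destroy the build artifacts, keeping only the logs
    rmdir "$Env:TF_BUILD_BINARIESDIRECTORY\*" -Recurse -Force -Exclude logs
}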
This isn't a direct answer, but... We just set the retention policy to only keep x number of builds. If tests fail, the artifacts aren't pushed out to the next step.
With our Jenkins setup, it wipes the artifacts every new build anyway, so that isn't a problem. Only the passing builds fire the step to push the artifacts to the Octopus NuGet server.
The simplest possible way (without customizing the build template, etc.) is to do something like this in your post-test script:
$testRunSucceeded = (sqlcmd -S .\sqlexpress -U sqlloginname -P passw0rd -d Tfs_DefaultCollection -Q "select State from tbl_TestRun where BuildNumber='$Env:TF_BUILD_BUILDURI'" -h-1)[0].Trim() -eq "3"
Let's pull this apart:
sqlcmd.exe is required; it's installed with SQL Server and is in the path by default. If you're doing builds on a machine without SQL Server, install the Command Line Utilities for SQL Server.
-S parameter is server + instance name of your TFS server, e.g. "sqlexpress" instance on the local machine
Either use a SQL login name/password combo like my example, or give the TFS build account an account on SQL Server (preferred). Grant the account read-only access to the TFS instance database.
The TFS instance database is named something like "Tfs_DefaultCollection".
The "-h-1" part at the end of the sqlcmd statement tells sqlcmd to output the results of the query without headers; the [0] selects the first result; Trim() is required to remove leading spaces; State of "3" indicates all tests passed.
Maybe someday Microsoft will publish a nice REST API that will offer access to test run/result info. Don't hold your breath though -- I've been waiting six years so far. In the meantime, hitting up the TFS DB directly is a safe and reliable way to do it.
Hope this is of some use.

MSBuild fails when run via Powershell

I have an MSBuild script that I want to call from a PowerShell script as part of a deployment process. If I call the build script via a bat file, all works well. If I do the exact same thing in PowerShell, I get a CS1668 error looking for weird and wonderful paths that don't exist on my machine.
I know I am getting into the MSBuild script, as these errors are occurring from within the MSBuild script targets (the logging output shows this).
The bat file and the PowerShell script reside in the same place, right next to the build script.
Errors:
error CS1668: Warning as error: Invalid search path 'C:\Program Files\Microsoft Visual Studio 8\VC\AtlMfc\Lib' specified in 'LIB environment variable' -- 'The system cannot find the path specified.'

error CS1668: Warning as error: Invalid search path 'C:\Program Files\Microsoft Visual Studio 8\VC\PlatformSDK\Lib' specified in 'LIB environment variable' -- 'The system cannot find the path specified.'
I have checked, and neither of these paths exists on my machine.
Why would running from PowerShell change what paths MSBuild looks for?
Thanks in advance
RhysC
EDIT - adding in code. NB: the name of the MSBuild script is AutomatedDebug.build.
POWERSHELL SCRIPT:
# Beginning of script
$CurrentPath = Split-Path (Split-Path $myinvocation.mycommand.path)
# Assign the build script paths. One is for building and testing, one is for deployment (to keep things clean)
$MSBuildScriptBuildAndTestPath = $CurrentPath + "\Tools\AutomatedDebug.build"
$MSBuildScriptDeployPath = $CurrentPath + "\Tools\Deploy.build"
# Run the automated build with tests
Write-Host "*** Run the automated build with tests ***"
C:\Windows\Microsoft.NET\Framework\v3.5\MSbuild.exe $MSBuildScriptBuildAndTestPath "/t:AllTests" "/l:FileLogger,Microsoft.Build.Engine;logfile=AllTests.log"
if ($LastExitCode -ne 0)
{
    throw "AllTests failed"
}
Write-Host "*** FINISHED: Run the automated build with tests ***"
Bat File
C:\Windows\Microsoft.NET\Framework\v3.5\MSbuild.exe AutomatedDebug.build /t:AllTests /l:FileLogger,Microsoft.Build.Engine;logfile="AllTests.log"
pause
OK, what actually happened is that I downloaded the PowerShell Community Extensions (http://pscx.codeplex.com) many moons ago, and there is an Environment.VisualStudio2005.ps1 file in there that adds EnvVars that don't exist. This is (I assume) because I don't use VS2005. I have commented this out and all is well. I tried to look for the 2008 equivalent but didn't have much luck. To be honest, I don't use PowerShell to its full potential, so I doubt I even need PSCX on my machine anyway.
Thanks guys for your help.
It is most likely due to how you are passing parameters into the MSBuild script. There are some key parameter-parsing differences between CMD and PowerShell, and by using a batch file you are implicitly using CMD's parameter-parsing engine to hand the parameters off to MSBuild. If you post the command line you're using, one of the PowerShell gurus here can probably help debug where the parameters are getting confused.
I am wondering - basically you get "Warning as error" here. You can either try turning off treating warnings as errors (I mean, having a lib path which doesn't exist doesn't sound that bad to me), or have your scripts print out the respective environment variables (using echo %LIB% and $Env:LIB respectively) and look for differences.
However, since the envvars for VS aren't set globally by default (that's why there are the three Visual Studio command prompts), you have to set all variables somewhere. If you're doing this within the scripts, you may have a typo somewhere.
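A quick way to compare along those lines is to dump LIB from both shells and list the entries that do not exist (a small helper, not part of the original answer):
# From cmd.exe:  echo %LIB%
# From PowerShell: list LIB entries that point to nonexistent directories
$Env:LIB -split ';' | Where-Object { $_ -and -not (Test-Path $_) }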