Psake - Running same task multiple times - deployment

I'm creating a build/deploy script for our website. Our process is currently a bit convoluted: it requires starting the website with a web.config set up to update our schema. Then we stop the site, change the web.config so it no longer updates the schema, and start it again.
Because of the repetitiveness of these steps, our -taskList contains a couple of tasks more than once, e.g.:
Invoke-Psake (Join-Path $env:currentDir "\tasks\iis_app_deploy.ps1") `
-taskList ValidateProperties,StopApplicationPool,<random stuff>,StartApplicationPool,StopApplicationPool,StartApplicationPool,PutBackInLoadBalancer
In this task list, each task gets executed once and only once. Is there any way to tell psake to run tasks without checking whether they have already run?

I have found an answer to my own question: nesting.
These were the steps:
Split the deployment process into smaller steps (in my case, two)
Create a wrapper task
Use Invoke-Psake inside the wrapper; each nested Invoke-Psake call runs in a fresh psake context, so the "already executed" tracking is reset and the same tasks can run again
Here is some example code:
task BackendDeployment -depends ValidateProperties {
    # Pass 1: deploy and update the schema
    $self = Join-Path $env:scriptPath "tasks\iis_app_deploy.ps1"
    Write-Output "Running schema changes"
    Invoke-Psake $self -TaskList StopApplicationPool, `
        MSDeploy, `
        CopyLicenses, `
        CopyConfigs, `
        UpdateConfigForSchemaChanges, `
        StartApplicationPool, `
        WarmUpApplications, `
        WaitForAction `
        -properties $properties

    # Pass 2: switch the config back and run the final deployment
    Write-Output "Running final deployment"
    Invoke-Psake $self -TaskList StopApplicationPool, `
        CopyConfigs, `
        StartApplicationPool, `
        WarmUpApplications `
        -properties $properties
}
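With that in place, the top-level invocation only needs to list each wrapper once. A minimal sketch, reusing the paths and task names from this question (PutBackInLoadBalancer comes from the original task list):
Invoke-Psake (Join-Path $env:currentDir "\tasks\iis_app_deploy.ps1") `
    -taskList ValidateProperties,BackendDeployment,PutBackInLoadBalancer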

Related

Powershell - get string information to a console for user from one script to another

I have 2 scripts: one serves as a Worker, which uses variables to complete commands, and the second serves as a function Library. The Worker loads the Library via a variable and uses functions from it.
As a user, when I run the script, I would like to see in the console the output I defined in the Library script.
Example Worker script:
Param(
    [string]$server_01,
    [string]$releaseDefinitionName,
    [string]$pathRelease,
    [string]$buildNumber,
    [string]$command_01,
    [string]$scheduledTask_01
)
$pathScriptLibrary = $pathRelease + "\" + $buildNumber + "\" + "_scripts"
. $pathScriptLibrary\_library.ps1
$user = xxx
$password = xxx
$cred = xxx
Invoke-Command -ComputerName $server_01 -Credential $cred -ErrorAction Stop -ScriptBlock {powershell $command_01}
Example Library script:
function Stop-ScheduledTasks {
    Write-Output [INFO]: Stopping scheduled tasks... -ForegroundColor White
    Get-ScheduledTask -TaskName "$scheduledTask_01" | ForEach {
        if ($_.State -eq "Ready") {
            Write-Output [WARNING]: Scheduled task $scheduledTask_01 was already stopped. -ForegroundColor Yellow
        }
        else {
            Stop-ScheduledTask -TaskName "$scheduledTask_01"
            Write-Output [OK]: Running task $scheduledTask_01 stopped. -ForegroundColor Green
        }
    }
}
function Start-ScheduledTasks {
    Write-Output [INFO]: Starting scheduled tasks... -ForegroundColor White
    Get-ScheduledTask -TaskName "$scheduledTask_01" | ForEach {
        if ($_.State -eq "Running") {
            Write-Output [WARNING]: Scheduled task $scheduledTask_01 already started. -ForegroundColor Yellow
        }
        else {
            Start-ScheduledTask -TaskName "$scheduledTask_01"
            Write-Output [OK]: Stopped scheduled task $scheduledTask_01 started. -ForegroundColor Green
        }
    }
}
Use case:
User starts the deployment by clicking the deploy button in the Azure DevOps UI
The task using the Worker script takes a function from the Library script (in this case, stopping a scheduled task) and performs it
User checks the log on the Azure DevOps side and sees the custom output lines from the Library script (two of them here: one starting with [INFO], then one starting with either [WARNING] or [OK])
Could you please advise a solution for how to achieve this? Thank you.
NOTE: These examples run in Azure DevOps (on-premises) release pipelines, and the desired output is meant for users running those pipelines.
If you're trying to write to the Azure DevOps pipeline log, then you should avoid using Write-Output. That does something subtly different: it adds to the function's return value.
So, for example, the Write-Output call in the function Stop-ScheduledTasks is roughly equivalent to putting this at the end of the function:
return "[WARNING]: Scheduled task $scheduledTask_01 was already stopped."
That might end up being printed to the pipeline log, or it might not; and importantly, it might completely mess up a function which is genuinely trying to return a simple value.
Instead of using Write-Output, I recommend using Write-Host. What that does is immediately write a line to the pipeline log, without affecting what a library function will return.
Write-Output "[WARNING]: Scheduled task $scheduledTask_01 was already stopped."
You can also use Write-Warning and Write-Error.
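For example, here is a sketch of the question's Stop-ScheduledTasks rewritten with Write-Host (note the tags also need quoting: the bare [INFO]: tokens in the original are not parsed as part of one string):
function Stop-ScheduledTasks {
    # Write-Host prints straight to the host/pipeline log, supports
    # -ForegroundColor, and does not pollute the function's return value
    Write-Host "[INFO]: Stopping scheduled tasks..." -ForegroundColor White
    Get-ScheduledTask -TaskName $scheduledTask_01 | ForEach-Object {
        if ($_.State -eq "Ready") {
            Write-Host "[WARNING]: Scheduled task $scheduledTask_01 was already stopped." -ForegroundColor Yellow
        }
        else {
            Stop-ScheduledTask -TaskName $scheduledTask_01
            Write-Host "[OK]: Running task $scheduledTask_01 stopped." -ForegroundColor Green
        }
    }
}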

How to prevent multiple instances of the same PowerShell 7 script?

Context
On a build server, a PowerShell 7 script, script.ps1, will be started and will run in the background on the remote computer.
What I want
A safety net to ensure that at most 1 instance of the script.ps1 script is running at once on the build server or remote computer, at all times.
What I tried:
I tried meddling with PowerShell 7 background jobs (by executing script.ps1 as a job inside a wrapper script, wrapper.ps1); however, that didn't solve the problem, as jobs do not carry over to (and can't be accessed from) other PowerShell sessions.
What I tried looks like this:
# inside wrapper.ps1
$running_jobs = Get-Job -State Running | Where-Object { $_.Name -eq "ImportantJob" }
if ($running_jobs.Count -eq 0) {
    Start-Job -FilePath .\script.ps1 -Name "ImportantJob" -ArgumentList @($some_variables)
} else {
    Write-Warning "Could not start new job; the existing job must be terminated first."
}
To reiterate, the problem with that is that $running_jobs only returns the jobs running in the current session, so this code only limits one job per session, allowing multiple instances to run if multiple sessions are mistakenly opened.
What I also tried:
I tried to look into Get-CimInstance:
$processes = Get-CimInstance -ClassName Win32_Process | Where-Object {$_.Name -eq "pwsh.exe"}
While this does return the currently running PowerShell instances, these elements carry no information about the script being executed, as shown after I run:
foreach ($p in $processes) {
$p | Format-List *
}
I'm therefore lost and I feel like I'm missing something.
I appreciate any help or suggestions.
I like to define a config path in the $env:ProgramData location using a CompanyName\ProjectName scheme so I can put "per system" configuration there.
You could use a similar scheme with a defined location to store a lock file, created when the script runs and deleted at the end of it (as already suggested in the comments).
Then it is up to you to add additional checks if needed (what happens if the script exits prematurely while the lock is still present? See the sketch after the example below.)
Example
# Define default path (Not user specific)
$ConfigLocation = "$Env:ProgramData\CompanyName\ProjectName"
# Create path if it does not exist
New-Item -ItemType Directory -Path $ConfigLocation -EA 0 | Out-Null
$LockFilePath = "$ConfigLocation\Instance.Lock"
$Locked = $null -ne (Get-Item -Path $LockFilePath -EA 0)
if ($Locked) { Exit }
# Lock
New-Item -Path $LockFilePath | Out-Null
# Do stuff
# Remove lock
Remove-Item -Path $LockFilePath
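To cover the premature-exit case raised above, here is a sketch (not a complete solution: a finally block still won't run if the pwsh process is killed outright) that wraps the work in try/finally so the lock is released on errors too:
$ConfigLocation = "$Env:ProgramData\CompanyName\ProjectName"
New-Item -ItemType Directory -Path $ConfigLocation -EA 0 | Out-Null
$LockFilePath = "$ConfigLocation\Instance.Lock"

if (Test-Path $LockFilePath) { Exit }
New-Item -Path $LockFilePath | Out-Null
try {
    # Do stuff
}
finally {
    # Runs on success, on a terminating error, or on Ctrl+C in the session
    Remove-Item -Path $LockFilePath -EA 0
}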
Alternatively, on Windows, you could also use a scheduled task without a schedule and with the setting "If the task is already running, then the following rule applies: Do not start a new instance". From there, instead of calling the original script, you call a proxy script that just launches the scheduled task, as sketched below.
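The proxy can then be a one-liner; the task name and folder below are hypothetical placeholders for whatever name the scheduled task is registered under:
# Hypothetical proxy: the Task Scheduler setting enforces single-instancing
Start-ScheduledTask -TaskName "script" -TaskPath "\CompanyName\ProjectName\"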

Set-AzDataFactoryV2Trigger fails in Azure Powershell Task in Release pipeline but works fine on Powershell in frontend machine

I want to create all the triggers in ADF after the Release pipeline has run successfully. This is because there is a hard limit of 256 parameters for an ARM template.
The idea is that we delete all the triggers in DEV, TEST, QA and PROD. Our published artifact contains all the trigger JSON files, from which we can re-create the triggers. The Release pipeline runs a PowerShell script that creates the triggers using Set-AzDataFactoryV2Trigger.
I am able to run the below script correctly on my frontend machine:
$AllTriggers = Get-ChildItem -Path .
Write-Host $AllTriggers
$AllTriggers | ForEach-Object {
    Set-AzDataFactoryV2Trigger -ResourceGroupName "<MyResourceGroupName>" -DataFactoryName "<MyTargetDataFactoryName>" -Name "$_" -DefinitionFile ".\$_.json"
}
In the Azure PowerShell script, the first line has to be changed a little to read all the JSONs from the published artifact:
$AllTriggers = Get-ChildItem -Name -Path "./_TriggerCreations/drop/" -Include *.json
I receive the below error when trying to run this script via the Azure PowerShell task in the release pipeline (note that the error text, shown in a screenshot, is gibberish; the yellow blurred line is the name of the trigger).
Stuck on this for some time now. Any help would be highly appreciated.

Forcing a powershell script to the next line

I have a PowerShell script that at one point calls 2 other PowerShell scripts. It runs one script to completion, then the other, which makes the whole run take longer. Can I force the parent script to kick off the other scripts and keep cycling through its loop? When I ran these scripts manually I would have 20-30 sessions running and walk away while they worked; what I wrote removed the monotony of clicking through them manually.
Here's the parent script:
$List = Get-Content C:\archive\${env:id}.txt
$Batch = New-Object System.Collections.ArrayList
foreach ($Data in $List) {
    if ($Data -eq "" -or $Data -eq $List[-1]) {
        $ProjectName = $Batch[0]
        Out-File C:\archive\"$ProjectName".txt
        foreach ($Data in $Batch -ne $Batch[0]) {
            Add-Content -Path C:\archive\"$ProjectName".txt -Exclude $Batch[0] -Value $Data
        }
        C:\archive\GetPrograms.ps1 $ProjectName   # <-- these two calls
        C:\archive\GetNetwork.ps1 $ProjectName    # <-- run one after the other
        $Batch = New-Object System.Collections.ArrayList
    }
    else {
        [void]$Batch.Add($Data)
    }
}
The parent script is not contingent on the data produced by the other 2 scripts. It simply executes them by passing in data
Honestly, based on your use case description, you really want to be looking at Parallel job/task/script processing.
Here is a post along the lines of what should satisfy your goals.
How do I run my PowerShell scripts in parallel without using Jobs?
Update - While this answer explains the process and mechanics of PowerShell runspaces and how they can help you multi-thread non-sequential workloads, fellow PowerShell aficionado Warren 'Cookie Monster' F has gone the extra mile and incorporated these same concepts into a single tool called Invoke-Parallel - it does what I describe below, and he has since expanded it with optional switches for logging and prepared session state including imported modules, really cool stuff - I strongly recommend you check it out before building your own shiny solution!
https://serverfault.com/questions/626711/how-do-i-run-my-powershell-scripts-in-parallel-without-using-jobs
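For the two marked calls in the parent script specifically, a lighter-weight sketch using only the built-in job cmdlets (Start-Job, Wait-Job, Receive-Job) could look like this; it assumes GetPrograms.ps1 and GetNetwork.ps1 accept the project name via a param block or $args:
# Sketch: collect job handles so the parent loop never blocks on a child script
$jobs = @()

# Inside the batching loop, replace the two direct calls with:
$jobs += Start-Job -FilePath C:\archive\GetPrograms.ps1 -ArgumentList $ProjectName
$jobs += Start-Job -FilePath C:\archive\GetNetwork.ps1 -ArgumentList $ProjectName

# After the loop has processed every batch, wait once and gather all output:
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job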

new-webapplication command fails when a TFS Release definition runs at the same time in multiple environments, getting null exception

I am trying to run a release definition for multiple environments at the same time. As part of this definition, I run a PowerShell script executing the New-WebApplication command with certain parameters. I have 9 different environments. The script does not fail when I run the release environments in sequence, i.e., each environment runs only if the previous one was successful. But if I run the same release definition for all my environments at the same time, it fails.
These are just a small example of my environments for this release
This is the error I am getting
##[error]Object reference not set to an instance of an object.
+ CategoryInfo          : NotSpecified: (:) [New-Item], NullReferenceException
+ FullyQualifiedErrorId : System.NullReferenceException,Microsoft.PowerShell.Commands.NewItemCommand
But this only happens when I run the script with release agents in parallel, not in sequence, as you can see in this configuration, where all environments have the option Automated: After release creation.
And this is the PowerShell script that runs in every environment as the PowerShell task:
Param
(
    [string]$remoteserver,
    [string]$directory_path,
    [string]$website_name,
    [string]$app_n
)
$ScriptBlockContent =
{
    $eventlog = $args[0]
    $num = $args[1]
    $app_name = $args[2]
    $targetdir = $eventlog + '\' + $num + '\' + $app_name
    Write-Host $targetdir
    Write-Host 'THE PATH ' $targetdir ' NOT EXIST'
    Write-Host 'CREATING ' $app_name ' DIRECTORY'
    New-Item -ItemType Directory -Path $targetdir -Force
    Write-Host 'CREATING ' $app_name ' APPLICATION'
    New-WebApplication -Name $app_name -Force -Site $num -PhysicalPath $targetdir
}
Invoke-Command -ComputerName $remoteserver -ScriptBlock $ScriptBlockContent -ArgumentList $directory_path, $website_name, $app_n
After printing each command with the -Verbose parameter, I realized that the New-WebApplication command is the one failing with this null reference exception.
Is IIS not capable of handling this command being run at the same time from multiple threads?
Is there any way to rewrite my script? Something like:
while (iis-is-failing) {
    # re-run New-WebApplication ...
}
First, please double-check your queuing policies. Queuing policies are defined in the Options section of the Deployment conditions tab.
Make sure you didn't select "after successful deployment on another environment" in the trigger, and select "allow multiple releases to be deployed at the same time". For details on each setting, please refer to Deployment conditions.
When you create a new environment, the trigger is checked by default.
Also pay attention to licensing, as Daniel Mann mentioned in this question: Release Management TFS 2015 - No Parallel Tasks