I am using the PowerShell script below to delete SharePoint alerts.
foreach ($site in Get-SPSite -Limit All)
{
    "Site Collection $site"
    foreach ($web in $site.AllWebs)
    {
        "  Web $web"
        $c = $web.Alerts.Count
        "  Deleting $c alerts"
        # Delete from the highest index down so the remaining indexes
        # are not shifted by each deletion.
        for ($i = $c - 1; $i -ge 0; $i--) { $web.Alerts.Delete($i) }
    }
}
There are around a million alerts in each of the Dev, Test and UAT environments. Deleting them all in one go takes many hours, and because the servers are restarted automatically at regular intervals, the script never runs to completion.
I am aware that PowerShell scripts can be resumed after a reboot by using a PowerShell Workflow with Checkpoint-Workflow, but I am not sure where to place the checkpoints and PSPersist.
I need help resuming the deletion of alerts in the script above after a system reboot.
Update: After trying to implement this, I realized that the SharePoint PowerShell cmdlets cannot be combined with PowerShell Workflow. It doesn't allow
Add-PSSnapin "Microsoft.SharePoint.PowerShell"
to be added to a workflow:
Workflow SPAlerts
{
    # None of the tweaks below worked:
    InlineScript { Add-PSSnapin "Microsoft.SharePoint.PowerShell" }
    Invoke-Command -ScriptBlock { Add-PSSnapin "Microsoft.SharePoint.PowerShell" }
    Invoke-Expression "......"
}
The MSDN documentation states:
You can place checkpoints anywhere in a workflow, including before and after each command or expression, or after each activity in a workflow. Windows PowerShell Workflow allows you to do so.
... Add a checkpoint after each significant part of the workflow completes; a part that you do not want to repeat if the workflow is interrupted.
... When the workflow definition is complete, use the testing process to refine your checkpoints. Add and remove (comment out) checkpoints to achieve both a reasonable running time and a reasonable recovery time.
Looking at your code, I would place the checkpoint right after "Site Collection $site", provided that deleting the alerts for a given website takes a reasonable amount of time on average. If there are only a few sites, each containing tons of alerts, I would place it at the start of the inner foreach instead.
I would definitely not place it inside the worker loop that deletes the individual alerts.
I would also suggest you look at the foreach -Parallel capability of workflows to make the deletion parallel. That way even the last site/website should get its turn, even if the server is restarted often.
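To illustrate the placement only, here is a minimal sketch (the workflow name and parameter are made up, and it deliberately leaves out how the SharePoint work is done inside InlineScript, since your update notes that the SharePoint snap-in cannot be loaded in a workflow):
Workflow Remove-SPAlerts
{
    param([string[]]$SiteUrls)   # site collection URLs gathered outside the workflow

    foreach ($url in $SiteUrls)
    {
        InlineScript {
            # Delete the alerts of every web in the site collection $using:url here.
        }

        # Persist progress once a whole site collection has been processed,
        # so a reboot only repeats the site collection that was in progress.
        Checkpoint-Workflow
    }
}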
Related
I have a small CI/CD script written in PowerShell, but I don't know how to stop it when running Liquibase produces an unexpected error. All of the SQL scripts work (preconditions are in place where needed), but I want more control over the CI/CD script. Right now, if the script hits an exception it just keeps executing. The script updates several schemas, some of which affect each other, so the order is important.
#update the first schema - ETL (tables, global temp tables, packages)
.\liquibase update --defaults-file=import.properties
#how do I stop this script if running Liquibase produces an unexpected error?
#update the second schema - data (only tables and roles for data)
.\liquibase update --defaults-file=data.properties
#update the third schema - views, tables and other objects for exporting data
.\liquibase update --defaults-file=export.properties
Have you tried this?
# -PassThru is required so Start-Process returns the process object;
# without it $result stays empty and ExitCode cannot be checked.
$result = Start-Process -FilePath 'path to liquibase' -ArgumentList "your liquibase arguments go here" -Wait -PassThru
if ($result.ExitCode -ne 0) {
    Write-Host 'something went wrong'
}
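Alternatively, since your script calls the liquibase launcher directly, a minimal sketch (assuming Liquibase returns a non-zero exit code on failure) is to check the automatic $LASTEXITCODE variable after each call and abort:
#update the first schema - ETL (tables, global temp tables, packages)
.\liquibase update --defaults-file=import.properties
if ($LASTEXITCODE -ne 0) {
    Write-Error "Liquibase update with import.properties failed with exit code $LASTEXITCODE"
    exit $LASTEXITCODE
}

# Repeat the same check after the data.properties and export.properties updates.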
I'm trying to learn Workflows to process some long-running steps. I have to use PowerShell 5.1 for various reasons. I really like what I've learnt so far; Workflows can definitely reduce the amount of PowerShell code you have to write. One thing that I'm really stuck on is the retry parameters:
InlineScript {
    Write-Output "test"
    0/0
} -PSComputerName $computer -PSConfigurationName "testuser" -PSActionRetryCount 3 -PSActionRetryIntervalSec 5
Without writing additional code or using jobs, can anybody give me some pointers as to why this is failing? I'm trying to force an error, but the workflow just seems to hang and then does not retry a second time.
Thanks!
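For reference, a self-contained reproduction of the snippet above might look like this (the workflow name and target computer are placeholders; -PSActionRetryCount and -PSActionRetryIntervalSec are activity common parameters, so they are only valid on an activity inside a workflow body):
workflow Test-ActivityRetry
{
    param([string]$computer)

    # The InlineScript deliberately fails (division by zero) to see whether
    # the activity retry parameters trigger another attempt.
    InlineScript {
        Write-Output "test"
        0/0
    } -PSComputerName $computer -PSConfigurationName "testuser" -PSActionRetryCount 3 -PSActionRetryIntervalSec 5
}

# Placeholder computer name; replace with a reachable endpoint.
Test-ActivityRetry -computer 'SERVER01'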
I have a PowerShell script that connects to a database and pulls a list of user data. I take this data and use a foreach loop to run a script for each entry.
This works, but it is slow: there can be 1000+ results, and Script.bat has to complete for user A before it can start for user B. The Script.bat run for a single user is independent of the others and takes roughly 30 seconds per user.
Is there a way to speed this up at all? I've been playing with -Parallel, ForEach-Object and workflows, but I can't get it to work, likely because I'm a PowerShell novice.
foreach ($row in $Dataset.tables[0].rows)
{
    $UserID = $row.value
    $DeviceID = $row.value1
    $EmailAddress = $row.email_address
    cmd.exe /c "`"$PSScriptRoot`"\bin\Script.bat -c `" -Switch $UserID`" >> `"$PSScriptRoot`"\${FileName3}_REST_${DateTime}.txt 2> nul";
}
You said it yourself: your bottleneck is the batch file your script calls, not the loop itself. foreach (as opposed to ForEach-Object) is already the faster loop mechanism in PowerShell. Investigate your batch file to find out why it takes 30 seconds to complete, and optimize it where you can.
Using Jobs
Note: Start-Job runs the job in another process. If you have PowerShell Core, you can use the Start-ThreadJob cmdlet in lieu of Start-Job; it runs the job on another thread of the same process instead of starting a new process.
If you can't optimize your batch script, or can't optimize it enough to meet your needs, then consider using Start-Job to kick off each execution asynchronously, and later check the result and collect any output with Receive-Job. For example:
# Master list of jobs you need to check the result of later
$jobs = New-Object System.Collections.Generic.List[System.Management.Automation.Job]
# Run your script for each row
foreach ($row in $Dataset.tables[0].rows)
{
    $UserID = $row.value
    $DeviceID = $row.value1
    $EmailAddress = $row.email_address

    # Use Start-Job here to kick off the script and store the job information
    # for later retrieval.
    # The $using: scope modifier allows you to make use of variables that were
    # defined in the session calling Start-Job.
    $job = Start-Job -ScriptBlock { cmd.exe /c "`"${using:PSScriptRoot}`"\bin\Script.bat -c `" -Switch ${using:UserID}`" >> `"${using:PSScriptRoot}`"\${using:FileName3}_REST_${using:DateTime}.txt 2> nul"; }

    # Add the execution to the $jobs list to check the result of later.
    # Casting to void here prevents the Add method from returning the object
    # we've added.
    [void]$jobs.Add($job)
}
# Wait for the jobs to be done
Write-Host 'Waiting for all jobs to complete...'
while ( $jobs | Where-Object { $_.State -eq 'Running' } ) {
    Start-Sleep -Seconds 10
}

# Retrieve the output of the jobs
foreach ( $j in $jobs ) {
    Receive-Job $j
}
Note: Since you need to run this script roughly 1000 times, you may want to add logic that only runs a certain number of jobs at a time. The example above starts every job immediately, without limiting how many execute at once.
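A minimal sketch of such throttling (the limit of 10 concurrent jobs is an arbitrary illustrative value) would wrap the Start-Job call inside the loop like this:
$maxConcurrentJobs = 10   # illustrative limit; tune it to your environment

# Before starting another job, wait until fewer than the limit are still running.
while ( @($jobs | Where-Object { $_.State -eq 'Running' }).Count -ge $maxConcurrentJobs ) {
    Start-Sleep -Seconds 5
}

$job = Start-Job -ScriptBlock { <# same script block as above #> }
[void]$jobs.Add($job)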
For more information about jobs and the properties you can inspect on a running/completed job, check the links below:
About Jobs
Job Class
Using Scope*
* The documentation states that the using scope can only be declared when working with remote sessions, but this seems to work fine with Start-Job even if the job is local.
So this may be an odd request, and maybe I'm going about this all wrong, but I have a unique situation. I have servers that are sometimes cloned, and I need to run a script that I created on the cloned servers. Due to the nature of the clones, they cannot be connected to a network.
Currently I manually put the generic script on each server before cloning and then run the script on the cloned server.
What I would like to do is have a script that runs, gathers all the information (installed programs, for example) and generates a custom version of my current script on each server before it is cloned.
I have both the PowerShell script that gets the server information and the generic one that makes the changes to the clone, but I have not found a way to merge the two, or any documentation, so I don't know if I am hitting a limitation here.
Edit for more explanation and examples. I'm doing this from my phone at the moment, so I don't have an example I can post.
Currently I have a script with a set list of applications to uninstall, registry keys to remove, services to stop, etc. In another application I have a list of all the software we have for each server, and I can pull that data per server. What I need to do is pull the data for each server and place a script on each server that will uninstall just the programs for that server.
Currently the script has to run through every potential piece of software, try to uninstall it, and then check the other application to see whether there are any additional programs that need to be uninstalled.
Hope this extra info helps.
Thanks.
Stop thinking of it as code.
Use script 1 to export blocks of text into a new file. For example, you might have a configuration that says all Dell servers must have this line of code run:
Set-DELL -attribute1 unmanaged
whereas on HP the line would be
Set-HP -attribute1 unmanaged
on web servers, you want:
set-web -active yes
and if it is not a web server, you want nothing. So your parent script would look like this:
$Dell = "Set-DELL -attribute1 unmanaged"
$HP = "Set-HP -attribute1 unmanaged"
$web = "set-web -active yes"

# Note the parentheses: the command must run first so that its output
# can be compared with -eq.
if ((Get-servermake) -eq "Dell")
{
    $Dell | Out-File Child.ps1 -Append
}
if ((Get-servermake) -eq "HP")
{
    $HP | Out-File Child.ps1 -Append
}
if ((Get-webserver) -eq $true)
{
    $web | Out-File Child.ps1 -Append
}
The result is a customized script for the specific server, Child.ps1.
Now you can take this and run with it. You could, say, add functionality to the child script like "Is it an AD controller?", etc.
However, you might be better off having all of this in a single script and simply fencing off the sections that don't apply with if statements, as sketched below.
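A minimal sketch of that single-script approach (using the same placeholder Get- commands as above) might be:
# One script that every server runs; each block only applies where relevant.
if ((Get-servermake) -eq "Dell") {
    Set-DELL -attribute1 unmanaged
}
if ((Get-servermake) -eq "HP") {
    Set-HP -attribute1 unmanaged
}
if ((Get-webserver) -eq $true) {
    set-web -active yes
}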
I'm still not totally sure I understand what you're asking. If I've missed the mark, tell me how and I'll tell you how to tweak this. (And hopefully it's obvious that the Get- commands are sample code; I don't expect that to be what you're actually using to determine a computer's make/model/etc.)
I need to first create and then copy some hundreds of folders and files via PowerShell (first create them on a local store and then copy them to a remote store).
However, when my foreach loop runs, roughly every 40th write attempt fails because "another process" has locked the file/folder.
I currently work around the issue with a simple sleep between file creations (100 ms). However, I wonder if there is a better way? Especially when copying multiple files, the required sleep would depend on the network latency, so it doesn't seem like a good solution to me.
Is there a way to wait until the write operation on a file has completed before starting another operation? Or to check whether a file is still locked by another process and wait until it is free again?
Have you tried running your code as a job? Example:
foreach ($file in $files) {
    $job = Start-Job -ScriptBlock {
        # operation here..
    } | Wait-Job

    # Log the result of the job, e.g. with '$job | Receive-Job' to get its output
}
You could also extend it to create multiple jobs and then use Get-Job | Wait-Job to wait for them all to finish before you proceed.
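If you would rather test the lock directly than rely on jobs, here is a small sketch (the Test-FileUnlocked helper and the $destinationFile variable are made-up names): it tries to open the file exclusively and waits until that succeeds.
# Hypothetical helper: returns $true when the file can be opened exclusively,
# i.e. no other process is currently holding a handle to it.
function Test-FileUnlocked {
    param([string]$Path)
    try {
        $stream = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $stream.Close()
        return $true
    }
    catch [System.IO.IOException] {
        # Note: a missing file also lands here, so only use this on files
        # that are known to exist already.
        return $false
    }
}

# Wait (with a short pause) until the previous operation has released the file.
while (-not (Test-FileUnlocked -Path $destinationFile)) {
    Start-Sleep -Milliseconds 100
}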