Recycle all app pools in IIS by command line - powershell

I have a PowerShell script:
& $psexec $serveraddr -u $remoteuser -p $remotepass -accepteula C:\Windows\System32\inetsrv\appcmd.exe list apppool /xml | C:\Windows\System32\inetsrv\appcmd.exe recycle apppool /in
that I am using to recycle all IIS application pools. The problem is that only the default pools that ship with IIS get recycled; none of the custom (private) pools do. They are not found by the second appcmd, even though the first appcmd lists all pools, both built-in and custom.
Error is:
ERROR ( message:Nie można odnaleźć obiektu APPPOOL o identyfikatorze "Core1". )
Translated from Polish, that is:
ERROR ( message: Can't find object APPPOOL with id "Core1". )
I can't recycle private pools. Is there a way to bypass this?

This is a one-liner to recycle all application pools:
& $env:windir\system32\inetsrv\appcmd list apppools /state:Started /xml | & $env:windir\system32\inetsrv\appcmd recycle apppools /in

It turns out the second part of the piped command is executed locally, not on the remote server. I've changed the script to recycle each pool with an individual command:
& $psexec $server -u $remoteuser -p $remotepass -accepteula C:\Windows\System32\inetsrv\appcmd.exe recycle apppool /apppool.name:Core1
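If you don't want to hard-code every pool name, one way to generalize this is to fetch the names first and loop. A sketch, not tested: the /text:name switch for printing bare pool names is my assumption, so verify it with appcmd list apppool /text:* on your version first:
# List the pool names remotely (one per line), then recycle each pool
# with its own psexec call so the lookup also happens on the remote box
$pools = & $psexec $server -u $remoteuser -p $remotepass -accepteula C:\Windows\System32\inetsrv\appcmd.exe list apppool /text:name
foreach ($pool in $pools | Where-Object { $_ }) {
    & $psexec $server -u $remoteuser -p $remotepass -accepteula C:\Windows\System32\inetsrv\appcmd.exe recycle apppool "/apppool.name:$pool"
}
Alternatively, wrapping the whole pipeline in cmd /c should make the pipe itself run remotely, e.g. psexec ... cmd /c "appcmd list apppool /xml | appcmd recycle apppool /in".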

This is overkill for your question, but you might be interested in a general alternative approach for doing something in parallel on several servers:
$servers = @('server1', 'server2', 'server3')
$recycleAppPools = {
    echo $(Get-WmiObject -Class Win32_ComputerSystem).Name
    appcmd list apppools /state:Started /xml | appcmd recycle apppools /in
    echo "`n"
}
workflow Perform-Deployment {
    Param ($servers, $actionBlock)
    # Run on all servers in parallel
    foreach -parallel ($server in $servers) {
        "Doing on $server..."
        # Execute the script block on the server
        InlineScript {
            $scriptBlock = [scriptblock]::Create($Using:actionBlock)
            Invoke-Command -ComputerName $Using:server -ScriptBlock $scriptBlock
        }
    }
}
cls
# Execute workflow
Perform-Deployment $servers $recycleAppPools
Moreover, you can pass parameters to your script block, for example:
$DeployPythonPackage = {
    param($venv, $pythonPackagePath)
    & "$venv\scripts\pip" install --upgrade $pythonPackagePath
}
workflow Perform-Deployment {
    Param ($servers, $actionBlock, $venv, $pythonPackagePath)
    # Run on all servers in parallel
    foreach -parallel ($server in $servers) {
        "Deploying Python package '$pythonPackagePath' on $server..."
        # Execute the script block on the server
        InlineScript {
            $scriptBlock = [scriptblock]::Create($Using:actionBlock)
            Invoke-Command -ComputerName $Using:server -ScriptBlock $scriptBlock `
                -ArgumentList $Using:venv, $Using:pythonPackagePath
        }
    }
}
cls
# Execute workflow
Perform-Deployment $servers $DeployPythonPackage $venv $pythonPackagePath
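where the variables might be populated like this (the venv and package paths below are made-up placeholders):
$servers = @('server1', 'server2', 'server3')
$venv = 'C:\venvs\myapp'                      # hypothetical virtualenv location on the servers
$pythonPackagePath = '\\share\pkg\myapp.whl'  # hypothetical package path reachable from the servers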

Related

Powershell nested for loop to restart services

My goal is to loop through all devices, stop a specific service on each of them (in this case, IntelAudioService), then kill specific tasks related to that service (let's say tasks IntelX and IntelY, if they exist).
Then just loop through again and restart those services. Can this be done all in one loop? Is the syntax correct?
$devices= <<user can populate devices in this object. DeviceName or deviceID??>>
Foreach ($device in $devices){
    Invoke-Command -ComputerName $device {
        net stop IntelAudioService
        taskkill /IM IntelX.exe /F
        net start IntelAudioService
    }
}
What if I wanted to also set a service for each device? Something like this?
foreach ($device in $devices){
    Invoke-Command -ComputerName $device {
        Set-Service -Name BITS -StartupType Automatic
    }
}
Try this; note that you can Invoke-Command against multiple hostnames at the same time. You can also create a New-PSSession with multiple computers at once.
$ErrorActionPreference = 'Stop'
$devices = 'Hostname1','Hostname2'
$serviceName = 'IntelAudioService' # This can be an array
$processName = 'IntelX' # This can be an array
# Note: Looping through the devices and attempting to establish a
# PSSession like below is good if you're not sure whether the remote host
# is up or the device name is right, etc. Using a Try {} Catch {}
# statement in this case will let you know if you couldn't connect to a
# specific remote host, and which one.
# You can also simply do: $session = New-PSSession $devices without
# any loop, which will be a lot faster of course; however,
# if you fail to connect to one of the remote hosts
# you will get an error and the New-PSSession cmdlet will stop.
$session = foreach($device in $devices)
{
    try
    {
        New-PSSession $device
    }
    catch
    {
        Write-Warning $_
    }
}
Invoke-Command -Session $session -ScriptBlock {
    Get-Service $using:serviceName | Stop-Service -Force -Verbose
    Get-Process $using:processName | Stop-Process -Force -Verbose
    Start-Service $using:serviceName -Verbose
    # Set-Service -Name $using:serviceName -StartupType Automatic
}
Remove-PSSession $session
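For comparison, if you're confident every host is reachable, the whole thing collapses to a single fan-out call with no explicit sessions; a minimal sketch:
Invoke-Command -ComputerName $devices -ScriptBlock {
    Get-Service $using:serviceName | Stop-Service -Force -Verbose
    Get-Process $using:processName -ErrorAction SilentlyContinue | Stop-Process -Force -Verbose
    Start-Service $using:serviceName -Verbose
}
The trade-off, as the comments above note, is less granular error handling when a host is unreachable.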

Calling other PowerShell scripts within a PowerShell script

I'm trying to get one master PowerShell script to run all of the others, waiting 30-60 seconds to ensure that each task completes. Everything else I tried wouldn't stop and wait for the first script and its processes to complete; it would launch all the others at the same time and then trigger the restart automatically.
Main script, run as admin:
$LogStart = 'Log '
$LogDate = Get-Date -Format "dd-MM-yyy-hh-mm-ss"
$FileName = $LogStart + $LogDate + '.txt.'
$scriptList = @(
'C:\Scripts\1-OneDriveUninstall.ps1'
'C:\Scripts\2-ComputerRename.ps1'
);
Start-Transcript -Path "C:\Scripts\$FileName"
foreach ($script in $scriptList) {
    Start-Process -FilePath "$PSHOME\powershell.exe" -ArgumentList "-Command '& $script'"
    Write-Output "The $script is running."
    Start-Sleep -Seconds 30
}
Write-Output "Scripts have completed. Computer will restart in 10 seconds."
Start-Sleep -Seconds 10
Stop-Transcript
C:\Scripts\3-Restart.ps1
1-OneDriveUninstall.ps1:
Set-ItemProperty -Path REGISTRY::HKEY_LOCAL_MACHINE\Software\Microsoft\windows\CurrentVersion\Policies\System -Name ConsentPromptBehaviorAdmin -Value 0
taskkill /f /im OneDrive.exe
C:\Windows\SysWOW64\OneDriveSetup.exe /uninstall
2-ComputerRename.ps1:
$computername = Get-Content env:computername
$servicetag = Get-WmiObject Win32_Bios |
Select-Object -ExpandProperty SerialNumber
if ($computername -ne $servicetag) {
    Write-Host "Renaming computer to $servicetag..."
    Rename-Computer -NewName $servicetag
} else {
    Write-Host "Computer name is already set to service tag."
}
The log file shows:
Transcript started, output file is C:\Scripts\Log 13-09-2019-04-28-47.txt.
The C:\Scripts\1-OneDriveUninstall.ps1 is running.
The C:\Scripts\2-ComputerRename.ps1 is running.
Scripts have completed. Computer will restart in 10 seconds.
Windows PowerShell transcript end
End time: 20190913162957
They aren't running correctly at all though. They run fine individually but not when put into one master script.
PowerShell can run PowerShell scripts from other PowerShell scripts directly. The only time you need Start-Process for that is when you want to run the called script with elevated privileges (which isn't necessary here, since your parent script is already running elevated).
This should suffice:
foreach ($script in $scriptList) {
    & $script
}
The above code will run the scripts sequentially (i.e. start the next script only after the previous one terminated). If you want to run the scripts in parallel, the canonical way is to use background jobs:
$jobs = foreach ($script in $scriptList) {
    Start-Job -ScriptBlock { & $using:script }
}
$jobs | Wait-Job | Receive-Job
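Applied to your master script, the sequential variant would reduce to something like this (a sketch that keeps your transcript and restart steps as they are):
Start-Transcript -Path "C:\Scripts\$FileName"
foreach ($script in $scriptList) {
    Write-Output "Running $script..."
    & $script    # blocks until the called script finishes
}
Stop-Transcript
C:\Scripts\3-Restart.ps1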

Powershell remote script error: Error during CryptAcquireContext

I just run a simple script with invoke-command -computer [servername] -scriptblock {powershell.exe D:\test\script.ps1}
If I run the script manually on the box and then run the remote script again, the error no longer appears. But I don't like having to log in to the box manually and run the script just to fix this, especially with so many servers. Can anyone help me with this? Thanks
Error during CryptAcquireContext. [servername] :
Error msg : The requested operation cannot be completed. The computer must be trusted for delegation and the current user account must be configured to allow delegation.
Error code : 80090345
The part of the script running on the server that produces the error:
$fciv = "D:\test\fciv.exe"
$fcivLog = "D:\test\fcivLog.txt"
$xmlPath = "D:\test\server.xml"
& $fciv -v -bp "\\servername\folder1" -XML $xmlPath | Out-File $fcivLog
Here is a PowerShell function to calculate MD5 hashes that should work on PowerShell 2.0:
function Get-MD5FileHash {
    [CmdletBinding()]
    param (
        [string] $Path
    )
    $MD5 = [System.Security.Cryptography.MD5]::Create();
    $Stream = [System.IO.File]::OpenRead($Path);
    $ByteArray = $MD5.ComputeHash($Stream);
    [System.BitConverter]::ToString($ByteArray).Replace('-','').ToLower();
    $Stream.Dispose();
}
Get-MD5FileHash -Path C:\test\test.xlsx;
I tested it out on PowerShell 4.0 on Windows 8.1, and it works great!
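If you want it as a drop-in replacement for the fciv call over a whole folder, something along these lines might work (a sketch: it assumes PowerShell 3.0+ for Get-ChildItem -File, and the "hash  path" output format is my own, not fciv's XML):
# Hash every file under the share and log "hash  path" pairs
Get-ChildItem "\\servername\folder1" -File -Recurse |
    ForEach-Object { '{0}  {1}' -f (Get-MD5FileHash -Path $_.FullName), $_.FullName } |
    Out-File $fcivLog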
This question is quite old, and a workaround has been found, but that still does not resolve the primary issue of delegation for programs using CryptAcquireContext.
I had the very same problem with another program (BiosConfigUtility, from HP).
I solved it by allowing delegation between my computer and the remote computers.
To enable delegation on your client:
Enable-WSManCredSSP -Role Client -DelegateComputer host.domain.com -Force
To enable delegation on the remote computer:
Enable-WSManCredSSP -Role Server -Force
See this post for more info: https://devblogs.microsoft.com/scripting/enable-powershell-second-hop-functionality-with-credssp/
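Once both sides are configured, you pass an explicit credential and request CredSSP authentication; a minimal sketch:
$cred = Get-Credential
Invoke-Command -ComputerName host.domain.com -Authentication Credssp -Credential $cred -ScriptBlock {
    powershell.exe D:\test\script.ps1
}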
You can always use scheduled tasks instead. This script changes the BIOS from legacy to UEFI boot using BiosConfigUtility64 (or erases the setup password when surplusing machines). Running it directly over remoting gives that CryptAcquireContext error.
# usage: .\hpuefi.ps1 comp1,comp2,comp3
$s = New-PSSession $args[0]
$src = 'Y:\hp-bios-new'
$dst = 'c:\users\admin\documents\hp-bios-new'
icm $s { if (! (Test-Path $using:dst)) { mkdir $using:dst > $null } }
$s | % { copy $src\biosconfigutility64.exe,$src\pass.bin,$src\uefi.bat,$src\uefi.txt $dst -ToSession $_ }
icm $s {
    # ERROR: Error during CryptAcquireContext. LastError = 0x80090345
    # & $using:dst\uefi.bat
    # 2>&1 must go last
    $action = New-ScheduledTaskAction -Execute 'cmd' -Argument '/c c:\users\admin\documents\hp-bios-new\uefi.bat > c:\users\admin\documents\hp-bios-new\uefi.log 2>&1'
    Register-ScheduledTask -Action $action -TaskName uefi -User system > $null
    Start-ScheduledTask -TaskName uefi
    # wait for the task to finish (the Start-Sleep keeps this loop from spinning the CPU)
    while ((Get-ScheduledTask -TaskName uefi).State -ne 'Ready') {
        Start-Sleep -Seconds 1
        Write-Verbose -Message 'Waiting on scheduled task...'
    }
    Get-ScheduledTask uefi | Get-ScheduledTaskInfo | ft
    # Unregister-ScheduledTask -TaskName uefi -Confirm:$false
    # shutdown /r /t 0
}
uefi.bat:
%~dp0BiosConfigUtility64.exe /set:"%~dp0uefi.txt" /cspwdfile:"%~dp0pass.bin"
exit /b %errorlevel%
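If you also want the batch file's exit code back on the admin machine, you could query the task afterwards (a sketch; 0 means success, and the task must have finished):
icm $s { (Get-ScheduledTaskInfo -TaskName uefi).LastTaskResult }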

PowerShell script run from TaskScheduler yielding empty array of running VM's

I can run my PowerShell script as administrator in PowerShell, and it yields a good list of running VMs. But when I run it in Task Scheduler with highest privileges, it shows an empty list of running VMs. We have Server 2008 R2 and PowerShell v3, and I downloaded the Hyper-V module for PowerShell recently. I created an account on the server with Administrators privileges, and Administrators have full control of all directories that the script copies files from/to.
Also, when I run the script through PowerShell, I need to run it as administrator. From the PowerShell prompt it looks like this:
C:\windows\system32> powershell -NoProfile -noninteractive -ExecutionPolicy bypass -Command "& c:\Scripts\BackupVhdShell_2_param.ps1 -single_backup_file_to_loc 'E:\' -single_backup_file_from_loc 'S:\SQL-bak.vhd'"
That works from PowerShell to start/stop VMs and copy files.
In Task Scheduler, this is how I have it set up and it yields the empty list of running VM's:
Run with highest privileges is checked. I have my login credentials saved so it can wake up the server when I'm not here or if it's not up.
In The Program/script field: %SystemRoot%\SysWow64\WindowsPowerShell\v1.0\powershell.exe
In the Add Arguments field: -NoProfile -noninteractive -ExecutionPolicy bypass -Command "& c:\Scripts\BackupVhdShell_2_param.ps1 -single_backup_file_to_loc 'E:\' -single_backup_file_from_loc 'S:\SQL-bak.vhd'"
Any thoughts? I'm not sure whether Task Scheduler isn't finding the Hyper-V module, or whether I need runas to make it run as administrator. I'm having trouble finding info on that. This link was similar but different: http://ss64.com/nt/runas.html Same thing as this: http://peter.hahndorf.eu/blog/
This is what the majority of the script looks like. Note that I have since added logging to a file and know that this line is coming up empty when the script is run through Task Scheduler: [array]$vmNames = @(Get-VM -Running | %{$_.elementname})
Again, it works fine through powershell.
The script:
param($single_backup_file_to_loc, $single_backup_file_from_loc)
function StopVMsInOrder ([array][String]$vmNames){
    # This function will stop the listed VMs sequentially
    Write-Host "Processing virtual machines in order"
    foreach ($name in $vmNames) {
        Write-Host "Analyzing $name"
        Try {
            #Write-Host "...Saving $name"
            #Save-VM -VM $name -wait -Force
            Write-Host "..shutdown $name"
            Invoke-VMShutdown -VM $name -Force
        } #try
        Catch {
            Write-Host "Failed to get virtual machine $name"
        } #catch
    } #foreach
} #function StopVMsInOrder
function StartVMsInOrder ([array][String]$vmNames){
    # This function will start the listed VMs sequentially, as opposed to all at once
    Write-Host "Processing virtual machines in order"
    foreach ($name in $vmNames) {
        Write-Host "Analyzing $name"
        Try {
            Write-Host "..Starting $name"
            Start-VM -VM $name -wait
        } #try
        Catch {
            Write-Host "Failed to get virtual machine $name"
        } #catch
    } #foreach
} #function StartVMsInOrder
function CopyFileToFolder ([string]$Source,[string]$destination){
    # get filename
    ...
}
#################start of script##############
import-module Hyperv
#get list of running vm's
[array]$vmNames = @(Get-VM -Running | %{$_.elementname})
Write-Host "To: $single_backup_file_to_loc"
Write-Host "From: $single_backup_file_from_loc"
#call function to stop vm's
StopVMsInOrder $vmNames
if($single_backup_file_to_loc -ne " ")
{
    # someone passed in a parameter for one-off use of the script
    [array]$destFileArray = @($single_backup_file_to_loc)
    [array]$sourceFileArray = @($single_backup_file_from_loc)
}else
{
    Write-Host "To Loc not Defined as param"
    # get set up for which vhd's need to be backed up
    # and where to back them up to
}
$i=0
for ($i = $sourceFileArray.GetLowerBound(0); $i -le $sourceFileArray.GetUpperBound(0); $i++) {
    $tempSource = $sourceFileArray[$i]
    $tempDest = $destFileArray[$i]
    CopyFileToFolder $tempSource $tempDest
    Write-Host "i: $i"
}
Write-Host "Done with vhd backup"
#call function to start vm's
StartVMsInOrder $vmNames
Write-Host "Done with vm start"
I finally figured it out! I changed the Task Scheduler entry to use the other version of PowerShell: %SystemRoot%\system32.... Now it finds the VMs.
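For anyone hitting the same thing: %SystemRoot%\SysWow64\WindowsPowerShell\v1.0\powershell.exe is the 32-bit PowerShell host on a 64-bit OS, and the Hyper-V module only loads in the 64-bit host. Presumably the corrected Program/script field is (assuming the default install path):
%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe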

PSExec never completes when run inside start-job

I'm trying to execute a cmd file on a list of 48 computers. I don't want to execute and wait for completion sequentially, because each cmd takes about 10 minutes to complete. WinRM isn't an option. Neither is WMI. PsExec is an option... but I can't seem to make it work inside of Start-Job.
I'm doing something like:
$sb = {
    param
    (
        $computer = "serverw01",
        $userid = "domain2\serviceid",
        $password = 'servicepw',
        $command = "cd /d d:\ && updateAll.cmd"
    )
    d:\eps\pstools\PsExec.exe -u $userid -p $password "\\$($computer)" cmd /c $command
}
foreach ($computer in Get-Content "D:\Data\serverlist.txt") {
    Start-Job $sb -ArgumentList $computer
}
This creates a bunch of jobs... but they never complete, and if I Receive-Job on any of them I get back:
PS> get-job | receive-job -Keep
+ CategoryInfo : NotSpecified: (:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
PsExec v1.98 - Execute processes remotely
Copyright (C) 2001-2010 Mark Russinovich
Sysinternals - www.sysinternals.com
It executes just fine if I run the script block like:
& $sb -computer "serverw01"
The initiating script is run in PowerShell v2.0 on a Server 2008 R2 box.
I've tried it on a box in domain2 while logged in with a domain admin userid (same result).
Try this for the psexec command, ensuring you include -d so it doesn't wait for a response, and put the computer variable right after psexec:
d:\eps\pstools\psexec "\\$($computer)" /accepteula -u $userid -p $password -d cmd /c $command
This hanging issue occurs on Win2003 and Win2008 servers.
Most people solve this issue with a workaround like echoing and piping so that powershell gets some input from STDIN.
But there is a solution within PowerShell itself: just start powershell with the option -inputformat none, like:
powershell -inputformat none -command ...
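For example (the script path here is just a placeholder):
# the common STDIN workaround: give powershell something to read
echo . | powershell -command "& C:\scripts\doWork.ps1"
# the cleaner fix: tell powershell not to bind STDIN at all
powershell -inputformat none -command "& C:\scripts\doWork.ps1"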
Please try adding the -accepteula parameter to psexec, like:
d:\eps\pstools\PsExec.exe -accepteula -u $userid -p $password
so the script becomes:
$computerList = Get-Content "D:\Data\serverlist.txt"
$sb =
{
    param($name)
    $computer = $name
    $userid = "domain2\serviceid"
    $password = 'servicepw'
    $command = "cd /d d:\ && updateAll.cmd"
    d:\eps\pstools\PsExec.exe -accepteula -u $userid -p $password \\$computer cmd /c $command
}
foreach ($computer in $computerList) {
    Start-Job $sb -ArgumentList $computer
}