PowerShell ProcessorAffinity CPU Designation - powershell

I am trying to set up a script that, when a service is restarted, resets the processor affinity to the settings I want.
The code I have used for other projects has worked in the past, but is now failing.
$Process = Get-Process -Name 'SpaceEngineersDedicated'
$Process.ProcessorAffinity = 254
$Process = Get-Process -Name 'SpaceEngineersDedicated'
$Process.ProcessorAffinity = 255
If I had to guess, this is because this is the first time I have tried to set up such a script on a server with two CPUs (254 and 255 were for a computer with only one CPU). The server has 16 cores/threads total.
The goal of this script is to force the service to use all cores, as it only uses one core/thread (Core 0, Node 0) originally. I can do this manually from Task Manager, so I am not sure why it fails.
The error the code spits out says that the property ProcessorAffinity cannot be found on this object.

Your Get-Process call is returning multiple processes. In the below syntax, we force these to come back as an array of processes and loop over them to set the property:
@(Get-Process -Name 'SpaceEngineersDedicated') |
    ForEach-Object { $_.ProcessorAffinity = 255 }
You cannot use member enumeration to set a property when more than one process is returned:
## This doesn't work unless .Count = 1
@(Get-Process -Name 'SpaceEngineersDedicated').ProcessorAffinity = 255
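As an aside, 255 (0xFF) only covers the first eight logical processors; on a 16-core/thread server a mask covering every core would be 0xFFFF (65535). A minimal sketch along the lines of the loop above, assuming the process name from the question:
# Sketch: enable all 16 logical processors (bits 0-15 set).
# Adjust the mask for your hardware; each bit enables one logical processor.
$allCoresMask = 0xFFFF   # 65535
@(Get-Process -Name 'SpaceEngineersDedicated') |
    ForEach-Object { $_.ProcessorAffinity = $allCoresMask }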


How to prevent multiple instances of the same PowerShell 7 script?

Context
On a build server, a PowerShell 7 script script.ps1 will be started and will run in the background on the remote computer.
What I want
A safety net to ensure that at most one instance of script.ps1 is running at any time on the build server or the remote computer.
What I tried:
I tried meddling with PowerShell 7 background jobs (by executing script.ps1 as a job inside a wrapper script, wrapper.ps1), but that didn't solve the problem, as jobs do not carry over to (and can't be accessed from) other PowerShell sessions.
What I tried looks like this:
# inside wrapper.ps1
$running_jobs = $(Get-Job -State Running) | Where-Object { $_.Name -eq "ImportantJob" }
if ($running_jobs.Count -eq 0) {
    Start-Job -FilePath .\script.ps1 -Name "ImportantJob" -ArgumentList @($some_variables)
} else {
    Write-Warning "Could not start new job; the existing job must be terminated beforehand."
}
To reiterate, the problem with that is that $running_jobs only returns the jobs running in the current session, so this code only limits one job per session, allowing multiple instances to run if multiple sessions are mistakenly opened.
What I also tried:
I tried to look into Get-CimInstance:
$processes = Get-CimInstance -ClassName Win32_Process | Where-Object {$_.Name -eq "pwsh.exe"}
While this does return the currently running PowerShell instances, these elements carry no information about the script being executed, as shown after I run:
foreach ($p in $processes) {
    $p | Format-List *
}
I'm therefore lost and I feel like I'm missing something.
I appreciate any help or suggestions.
I like to define a config path in the $env:ProgramData location using a CompanyName\ProjectName scheme so I can put "per system" configuration.
You could use a similar scheme with a defined location to store a lock file created when the script run and deleted at the end of it (as suggested already within the comments).
Then it is up to you to add additional checks if needed (what happens if the script exits prematurely while the lock is still present?). One way to handle that is sketched after the example.
Example
# Define default path (Not user specific)
$ConfigLocation = "$Env:ProgramData\CompanyName\ProjectName"
# Create path if it does not exist
New-Item -ItemType Directory -Path $ConfigLocation -EA 0 | Out-Null
$LockFilePath = "$ConfigLocation\Instance.Lock"
$Locked = $null -ne (Get-Item -Path $LockFilePath -EA 0)
if ($Locked) {Exit}
# Lock
New-Item -Path $LockFilePath
# Do stuff
# Remove lock
Remove-Item -Path $LockFilePath
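To cover the premature-exit question above, one option (a sketch, not part of the original answer; the two-hour staleness threshold is arbitrary) is to release the lock in a finally block and treat an old lock file as stale:
$LockFilePath = "$Env:ProgramData\CompanyName\ProjectName\Instance.Lock"
# Treat an existing lock as valid only if it is recent; otherwise assume a crashed run left it behind
$existing = Get-Item -Path $LockFilePath -EA 0
if ($existing -and $existing.CreationTime -gt (Get-Date).AddHours(-2)) { Exit }
New-Item -Path $LockFilePath -ItemType File -Force | Out-Null
try {
    # Do stuff
}
finally {
    # Runs even if the work above throws, so the lock is not left behind
    Remove-Item -Path $LockFilePath -EA 0
}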
Alternatively, on Windows, you could also use a scheduled task without a schedule and with the setting "If the task is already running, then the following rule applies: Do not start a new instance". From there, instead of calling the original script, you call a proxy script that just launches the scheduled task.
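A minimal proxy along those lines, assuming a task named ImportantScript under a \ProjectName\ folder has already been registered with that setting (both names are hypothetical):
# proxy.ps1 - sketch: Task Scheduler enforces the single instance,
# because the task itself is configured with "Do not start a new instance"
Start-ScheduledTask -TaskPath "\ProjectName\" -TaskName "ImportantScript"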

Getting specific app pool's worker process in PowerShell returns value but process is already stopped

I have multiple websites - each on a separate app pool.
The app pool I'm referring to has 1 worker process.
After stopping the app pool, I'm trying to wait and verify that the worker process has stopped.
$appPoolName = $appPool.name;
Write-Host "appPoolName: $appPoolName";
$w3wp = Get-ChildItem "IIS:\AppPools\$appPoolName\WorkerProcesses\";
while($w3wp -and $retrys -gt 0)
{
    Write-Host "w3wp value is: $w3wp";
    Start-Sleep -s 10;
    $retrys--;
    $w3wp = Get-ChildItem "IIS:\AppPools\$appPoolName\WorkerProcesses\";
    Write-Host "w3wp value(2) is: $w3wp";
    if(-not $w3wp)
    {
        break;
    }
}
The print of both values is always "Microsoft.IIs.PowerShell.Framework.ConfigurationElement", even when I see the process is stopped and no longer in Task Manager.
Also strange: When I open another PowerShell session while the code runs and call
$w3wp = Get-ChildItem "IIS:\AppPools\$appPoolName\WorkerProcesses\";
$w3wp has no value (because the worker process no longer exists).
Any ideas why the value isn't changing?
Or maybe how to do that differently?
Thanks in advance :)
I think the IIS: provider is caching data. I don't know of a fix, but here are a couple of alternatives:
Use WMI from PowerShell:
gwmi -NS 'root\WebAdministration' -class 'WorkerProcess' | select AppPoolName,ProcessId
Run appcmd
appcmd list wp
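A sketch of the original wait loop rebuilt on the WMI class above, so each iteration re-queries instead of trusting the IIS: provider ($appPoolName and the retry count are taken from the question):
$retries = 12
do {
    $w3wp = Get-WmiObject -Namespace 'root\WebAdministration' -Class 'WorkerProcess' |
        Where-Object { $_.AppPoolName -eq $appPoolName }
    if (-not $w3wp) { break }      # no worker process left for this app pool
    Write-Host "Worker process still running (PID $($w3wp.ProcessId)), waiting..."
    Start-Sleep -Seconds 10
    $retries--
} while ($retries -gt 0)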

Stop a process running longer than an hour

I posted a question a while ago; I needed a PowerShell script that would start a service if it was stopped, stop the process if it had been running longer than an hour and then start it again, and do nothing if it had been running less than an hour. I was given a great script that really helped, but I'm trying to convert it to work with a "process". I have the following code (below) but am getting the following error:
Error
"cmdlet Start-Process at command pipeline position 3
Supply values for the following parameters:
FilePath: "
Powershell
# for debugging
$PSDefaultParameterValues['*Process:Verbose'] = $true
$str = Get-Process -Name "Chrome"
if ($str.Status -eq 'stopped') {
    $str | Start-Process
} elseif ($str.StartTime -lt (Get-Date).AddHours(-1)) {
    $str | Stop-Process -PassThru | Start-Process
} else {
    'Chrome is running and StartTime is within the past hour!'
}
# other logic goes here
Your $str is storing a list of all processes with the name "Chrome", so I imagine you want a single process. You'll need to specify an ID in Get-Process or use $str[0] to single out a specific process in the list.
When you store a single process in $str, if you try to print $str.Status, you'll see that it outputs nothing, because Status isn't a property of a process object. A process is either running or it doesn't exist. That said, you may want to have your logic instead check whether it can find the process and then start the process if it can't, in which case it needs the path to the executable to start the process. More info with examples can be found here: https://technet.microsoft.com/en-us/library/41a7e43c-9bb3-4dc2-8b0c-f6c32962e72c?f=255&MSPPError=-2147217396
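A sketch of that check-and-restart logic, assuming Chrome as in the question (the executable path is a placeholder; point it at whatever you actually want to launch):
$chromePath = "C:\Program Files\Google\Chrome\Application\chrome.exe"   # placeholder path
$proc = Get-Process -Name "chrome" -ErrorAction SilentlyContinue | Select-Object -First 1
if (-not $proc) {
    # Not running at all - Start-Process needs an executable path, not a process object
    Start-Process -FilePath $chromePath
} elseif ($proc.StartTime -lt (Get-Date).AddHours(-1)) {
    # Running longer than an hour - stop it, then start a fresh instance
    $proc | Stop-Process -Force
    Start-Process -FilePath $chromePath
} else {
    'Chrome is running and StartTime is within the past hour!'
}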
If you're using Powershell ISE, try storing the process in a variable in the terminal, type the variable with a dot afterwards, and Intellisense (if it's on) should give a list of all its available properties.

How to run two powershell functions at the same time in the same script?

I've done a lot of research on async calls within powershell and I haven't gotten anything to work correctly.
I need to find a way to have two separate database queries run at the same time. I have one on-premises connection and one online connection through Dynamics CRM.
function Get-OnPremiseDBQuery {
    $onPremResultsLocation = "..\results\onPremRecordsCountByEntity.txt"
    $onPremRecordSummary = @()
    foreach ($entityGroup in $entitiesArray) {
        $onPremRecordSummaryByGroup = Get-OnPremiseRecordsCountSummary -onPremConnection $onPremCRMConnection -entityNames $entityGroup
        $onPremRecordSummary += $onPremRecordSummaryByGroup
        Write-Host "ON PREM"
        $onPremRecordSummaryByGroup | Format-Table -Property * -AutoSize | Out-String -Width 4096 | Write-Host
    }
    $onPremRecordSummary | Format-Table -Property * -AutoSize | Out-String -Width 4096 | Out-File $onPremResultsLocation -Append
}
function Get-OnlineDBQuery {
    $onlineResultsLocation = "..\results\onlineRecordsCountByEntity.txt"
    $onlineRecordSummary = @()
    foreach ($entityGroup in $entitiesArray) {
        $onlineRecordSummaryByGroup = Get-OnLineRecordsCountSummary -onlineConnection $onlineCRMConnection -entityNames $entityGroup
        $onlineRecordSummary += $onlineRecordSummaryByGroup
        Write-Host "ONLINE"
        $onlineRecordSummaryByGroup | Format-Table -Property * -AutoSize | Out-String -Width 4096 | Write-Host
    }
    $onlineRecordSummary | Format-Table -Property * -AutoSize | Out-String -Width 4096 | Out-File $onlineResultsLocation -Append
}
# Async query calls to on-premises/online DBs
##Looking to find a way to run
Get-OnPremiseDBQuery
and
Get-OnlineDBQuery
AT THE SAME TIME
Please help if anyone can!
This sort of effort is what PowerShell Jobs and RunSpaces are designed for.
As for...
I've done a lot of research on async calls within powershell and I
haven't gotten anything to work correctly.
Are you saying you have leveraged the below, and none of the options worked for you?
You are also not defining what you mean by:
I haven't gotten anything to work correctly.
What's not working?
What response / errors are you getting?
The discussions:
How do I run my PowerShell scripts in parallel without using Jobs?
If I have a script that I need to run against multiple computers, or
with multiple different arguments, how can I execute it in parallel,
without having to incur the overhead of spawning a new PSJob with
Start-Job?
https://serverfault.com/questions/626711/how-do-i-run-my-powershell-scripts-in-parallel-without-using-jobs
Parallel processing with PowerShell
Working in parallel
Whichever approach you end up taking you will be getting PowerShell to
run tasks in parallel. That will often require you to have additional
instances of PowerShell running. The resources on your admin machine –
CPU, memory and network bandwidth – are finite. Keep those in mind so
you don’t overload the machine and end up getting nothing back.
https://blogs.technet.microsoft.com/uktechnet/2016/06/20/parallel-processing-with-powershell
See also:
Invoke-Async - Allows you to run any cmdlet/function/scriptblock
asynchronously
This has been tested in V2 and V3. You just provide the data set
(-set) such as a list of servers of configuration settings etc. The
param that the set belongs to (-setparam) such as ComputerName. You
have the ability to provide any other params with a hash via -params
(see examples.) The number of threads or jobs (concurrent instances)
can be controlled via ThreadCount
https://gallery.technet.microsoft.com/scriptcenter/Invoke-Async-Allows-you-to-83b0c9f0
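As a concrete starting point, a minimal sketch of running the two functions from the question at the same time with background jobs (the .\queries.ps1 file that defines the functions and connections is hypothetical; each job runs in its own process, so it has to load them itself):
$jobs = @(
    Start-Job -ScriptBlock { Set-Location $using:PWD; . .\queries.ps1; Get-OnPremiseDBQuery }
    Start-Job -ScriptBlock { Set-Location $using:PWD; . .\queries.ps1; Get-OnlineDBQuery }
)
# Wait for both, collect any output, then clean up
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job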

How to get CPU usage & Memory consumed by particular process in powershell script

I want to find the performance of a single process, for example "SqlServer".
Which commands should I write to find out 2 things:
RAM utilized by SqlServer
CPU utilized by SqlServer
I found a lot of solutions listing all processes, but I want to get only one, i.e. SqlServer.
The command to get SQL server process information:
Get-Process SQLSERVR
The command to get information for any process that starts with S:
Get-Process S*
To get the amount of virtual memory that the SQLServer process is using:
Get-Process SQLSERVR | Select-Object VM
To get the size of the working set of the process (in bytes; the default Get-Process table displays it in kilobytes):
Get-Process SQLSERVR | Select-Object WS
To get the amount of pageable memory that the process is using, in bytes:
Get-Process SQLSERVR | Select-Object PM
To get the amount of non-pageable memory that the process is using, in bytes:
Get-Process SQLSERVR | Select-Object NPM
To get CPU (The amount of processor time that the process has used on all processors, in seconds):
Get-process SQLSERVR | Select-Object CPU
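To pull the memory and CPU figures in a single call (a sketch; the property list is just one reasonable selection):
# Name, total CPU seconds, working set, paged and non-paged memory (bytes)
Get-Process SQLSERVR | Select-Object Name, CPU, WS, PM, NPM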
To better understand the Get-Process cmdlet, check out the documentation on TechNet.
For the CPU percentage, I got this working the following way:
# To get the PID of the process (this will give you the first occurrence if there are multiple matches)
$proc_pid = (Get-Process "slack").Id[0]
# To match the CPU usage to, for example, Process Explorer, you need to divide by the number of cores
$cpu_cores = (Get-WMIObject Win32_ComputerSystem).NumberOfLogicalProcessors
# This is to find the exact counter path, as you might have multiple processes with the same name
$proc_path = ((Get-Counter "\Process(*)\ID Process").CounterSamples | ? {$_.RawValue -eq $proc_pid}).Path
# We now get the CPU percentage
$prod_percentage_cpu = [Math]::Round(((Get-Counter ($proc_path -replace "\\id process$","\% Processor Time")).CounterSamples.CookedValue) / $cpu_cores)
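A usage follow-up (sketch) that reports both figures for the matched process, reusing $proc_pid and $prod_percentage_cpu from above:
# Working set converted from bytes to MB, plus the computed CPU percentage
"{0} MB working set, {1}% CPU" -f [Math]::Round((Get-Process -Id $proc_pid).WS / 1MB), $prod_percentage_cpu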