How do PowerShell scheduled job InstanceIds work?

I am trying to manage instances of scheduled jobs using the Job cmdlets such as Get-Job and Wait-Job. I am having problems and have the feeling that I am missing something fundamental.
I have found that I should be importing the PSScheduledJob module if I want to work with scheduled job instances using the Job cmdlets.
NOTE: To use the Job cmdlets on instances of scheduled jobs, the PSScheduledJob module must be imported into the session. To import the PSScheduledJob module, type "Import-Module PSScheduledJob" (without quotation marks) or use any Scheduled Job cmdlet, such as Get-ScheduledJob.
https://technet.microsoft.com/en-us/library/hh850536.aspx
I also found that, contrary to what I expected, the Id property on a job instance does not identify it globally, but only within a single PowerShell session.
The ID is an integer that uniquely identifies the job within the current session. It is easier to remember and to type than the instance ID, but it is unique only within the current session.
https://technet.microsoft.com/en-us/library/hh849693.aspx
The InstanceId is supposed to be globally unique which sounds good...
An instance ID is a GUID that uniquely identifies the job on the
computer.
https://technet.microsoft.com/en-us/library/hh849693.aspx
...but I seem to be getting the same job instance returning different InstanceIds. The following script gets the InstanceIds for jobs both directly and via remoting.
Import-Module PSScheduledJob;
$localInstanceIds = (Get-Job -Name $jobName).InstanceId.Guid
$remoteInstanceIds = (Invoke-Command -ComputerName . -ArgumentList $jobName -ScriptBlock {
    param([string]$jobName)
    Import-Module PSScheduledJob;
    (Get-Job -Name $jobName).InstanceId.Guid
})
Compare-Object $localInstanceIds $remoteInstanceIds
Note in the output that there are two jobs returned from each method but that none of the InstanceIds match up.
InputObject SideIndicator
----------- -------------
780df0f3-bfc1-4fef-80f5-730e8cbcc548 =>
44109202-cb9b-43cb-8447-7e48905f10a7 =>
4fe8efaa-3107-43cd-a55e-b457b3be5993 <=
ad3bac3f-9f80-4ccd-b8d9-60c1b49fcf6a <=
I also found that the scheduled job instances are saved on the file system within the profile of the user that created them.
When you create a scheduled job, Windows Powershell creates a
directory for the scheduled job in the
$home\AppData\Local\Microsoft\Windows\PowerShell\ScheduledJobs
directory on the local computer. The directory name is the same as the
job name.
https://technet.microsoft.com/en-us/library/hh849673.aspx
The following script retrieves all of the InstanceIds from the Status.xml files for a given job from within that path and compares them with the InstanceIds returned using the Get-Job cmdlet.
Import-Module PSScheduledJob;
$paths = "C:\Users\$userName\AppData\Local\Microsoft\Windows\PowerShell\ScheduledJobs\$jobName\Output\*\Status.xml";
$namespace = @{ns = 'http://schemas.datacontract.org/2004/07/Microsoft.PowerShell.ScheduledJob'};
$instanceIdsFromFileSystem = (Select-Xml -Path $paths -XPath '/ns:StatusInfo/Status_InstanceId/text()' -Namespace $namespace).Node.Value
$instanceIdsFromCmdlet = (Get-Job -Name $jobName).InstanceId
compare-object $instanceIdsFromFileSystem $instanceIdsFromCmdlet -IncludeEqual
The output again shows no matches.
InputObject SideIndicator
----------- -------------
4fe8efaa-3107-43cd-a55e-b457b3be5993 =>
ad3bac3f-9f80-4ccd-b8d9-60c1b49fcf6a =>
e81731dd-4159-476d-937e-7a066d082eb6 <=
5ea34723-ab39-4949-b302-c4be1b1588bb <=
Can somebody please shed some light on this for me?

InstanceId is a GUID that is generated at runtime. It does identify the Job ("globally" on the host, but not universally), but it does not identify the Job's contents.
Any Job you execute is assigned a random GUID; InstanceIds are not designed to identify a Job persistently across sessions, hosts, or even individual executions. If you execute the same Job twice, the two runs will have different InstanceId properties. This is by design.
If you want to make a permanent identifier for a Job, you can use the Name property.
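A quick way to see this behavior for yourself (a minimal sketch; MyScheduledJob is a placeholder for an existing scheduled job name on the machine):
Import-Module PSScheduledJob
# "MyScheduledJob" is a placeholder for an existing scheduled job definition.
$definition = Get-ScheduledJob -Name MyScheduledJob
# Run the same definition twice, waiting for each run to finish.
$firstRun  = $definition.StartJob()
$firstRun  | Wait-Job | Out-Null
$secondRun = $definition.StartJob()
$secondRun | Wait-Job | Out-Null
# Both runs share the same Name, but each gets a freshly generated InstanceId.
$firstRun, $secondRun | Select-Object Name, Id, InstanceId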

Related

How to shutdown the computer after closing the powershell window?

I am new to PowerShell. I have a PowerShell script I've been using to back up my files. After it runs, I would like to shut down the computer and close the PowerShell window. It seems I can do one or the other, but not both. So when I restart the computer, PowerShell complains that it was not closed properly.
How to shutdown the computer after closing the powershell window?
TIA
p.s. Contrary to popular belief, I have read the manual. However, as mentioned below, if I put EXIT before Stop-Computer, the script exits before executing Stop-Computer. If I put EXIT after Stop-Computer, PowerShell complains that the file was not closed properly on reboot. Either way, I lose. :(
PowerShell does provide an 'Exit', as noted in my comment. As for stopping, just put the 'Stop-Computer' cmdlet at the end of your script to shut down the computer.
Get-Help -Name Stop-Computer -examples
# Results
<#
NAME
Stop-Computer
SYNOPSIS
Stops (shuts down) local and remote computers.
----------- Example 1: Shut down the local computer -----------
Stop-Computer -ComputerName localhost
Example 2: Shut down two remote computers and the local computer
Stop-Computer -ComputerName "Server01", "Server02", "localhost"
`Stop-Computer` uses the ComputerName parameter to specify two remote computers and the local computer. Each computer is shut down.
-- Example 3: Shut down remote computers as a background job --
$j = Stop-Computer -ComputerName "Server01", "Server02" -AsJob
$results = $j | Receive-Job
$results
`Stop-Computer` uses the ComputerName parameter to specify two remote computers. The AsJob parameter runs the command as a background job. The job objects are stored in the `$j` variable.
The job objects in the `$j` variable are sent down the pipeline to `Receive-Job`, which gets the job results. The objects are stored in the `$results` variable. The `$results` variable displays the job information
in the PowerShell console.
Because AsJob creates the job on the local computer and automatically returns the results to the local computer, you can run `Receive-Job` as a local command.
------------ Example 4: Shut down a remote computer ------------
Stop-Computer -ComputerName "Server01" -Impersonation Anonymous -DcomAuthentication PacketIntegrity
`Stop-Computer` uses the ComputerName parameter to specify the remote computer. The Impersonation parameter specifies a customized impersonation and the DcomAuthentication parameter specifies authentication-level
settings.
---------- Example 5: Shut down computers in a domain ----------
$s = Get-Content -Path ./Domain01.txt
$c = Get-Credential -Credential Domain01\Admin01
Stop-Computer -ComputerName $s -Force -ThrottleLimit 10 -Credential $c
`Get-Content` uses the Path parameter to get a file in the current directory with the list of domain computers. The objects are stored in the `$s` variable.
`Get-Credential` uses the Credential parameter to specify the credentials of a domain administrator. The credentials are stored in the `$c` variable.
`Stop-Computer` shuts down the computers specified with the ComputerName parameter's list of computers in the `$s` variable. The Force parameter forces an immediate shutdown. The ThrottleLimit parameter limits the
command to 10 concurrent connections. The Credential parameter submits the credentials saved in the `$c` variable.
#>
Or use the Restart-Computer cmdlet, if that is your goal instead.
Update
Use two scripts, main and child.
# Start-Main.ps1
0..4 | ForEach-Object {
    "Inside function... $PSItem"
    Start-Sleep -Seconds 1
}
.\Start-Child
Exit
# Start-Child.ps1
'Preparing to shutdown in 10 seconds'
Start-Sleep -Seconds 10
Stop-Computer
Or using PS jobs is another option, as noted in my comment:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/start-job?view=powershell-7.2

How to prevent multiple instances of the same PowerShell 7 script?

Context
On a build server, a PowerShell 7 script script.ps1 will be started and will be running in the background on the remote computer.
What I want
A safety net to ensure that at most one instance of the script.ps1 script is running at any time on the build server or remote computer.
What I tried:
I tried meddling with PowerShell 7 background jobs (by executing script.ps1 as a job inside a wrapper script wrapper.ps1); however, that didn't solve the problem, as jobs do not carry over (and can't be accessed from) other PowerShell sessions.
What I tried looks like this:
# inside wrapper.ps1
$running_jobs = $(Get-Job -State Running) | Where-Object {$_.Name -eq "ImportantJob"}
if ($running_jobs.count -eq 0) {
    Start-Job -FilePath .\script.ps1 -Name "ImportantJob" -ArgumentList @($some_variables)
} else {
    Write-Warning "Could not start new job; existing job detected and must be terminated beforehand."
}
To reiterate, the problem with that is that $running_jobs only returns the jobs running in the current session, so this code only limits jobs to one per session, allowing multiple instances to be run if multiple sessions are mistakenly opened.
What I also tried:
I tried to look into Get-CimInstance:
$processes = Get-CimInstance -ClassName Win32_Process | Where-Object {$_.Name -eq "pwsh.exe"}
While this does return the currently running PowerShell instances, these objects carry no information on the script being executed, as shown when I run:
foreach ($p in $processes) {
$p | Format-List *
}
I'm therefore lost and I feel like I'm missing something.
I appreciate any help or suggestions.
I like to define a config path in the $env:ProgramData location using a CompanyName\ProjectName scheme so I can store "per system" configuration there.
You could use a similar scheme with a defined location to store a lock file that is created when the script runs and deleted at the end (as already suggested in the comments).
Then it is up to you to add additional checks if needed (what happens if the script exits prematurely while the lock file is still present? One way to handle that is sketched after the example below).
Example
# Define default path (Not user specific)
$ConfigLocation = "$Env:ProgramData\CompanyName\ProjectName"
# Create path if it does not exist
New-Item -ItemType Directory -Path $ConfigLocation -EA 0 | Out-Null
$LockFilePath = "$ConfigLocation\Instance.Lock"
$Locked = $null -ne (Get-Item -Path $LockFilePath -EA 0)
if ($Locked) {Exit}
# Lock
New-Item -Path $LockFilePath
# Do stuff
# Remove lock
Remove-Item -Path $LockFilePath
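To address the premature-exit question raised above, one option (still just a sketch on the same lock-file idea) is to wrap the work in try/finally so the lock is removed even when the script throws:
$ConfigLocation = "$Env:ProgramData\CompanyName\ProjectName"
New-Item -ItemType Directory -Path $ConfigLocation -EA 0 | Out-Null
$LockFilePath = "$ConfigLocation\Instance.Lock"
# Bail out if another instance already holds the lock
if (Test-Path -Path $LockFilePath) { Exit }
# Lock
New-Item -Path $LockFilePath | Out-Null
try {
    # Do stuff
}
finally {
    # Runs even if the work above throws or the script is stopped with Ctrl+C
    Remove-Item -Path $LockFilePath -EA 0
}
Note that this still leaves a stale lock behind if the process is killed outright, which is one reason the scheduled-task option below can be more robust.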
Alternatively, on Windows, you could also use a scheduled task without a schedule and with the setting "If the task is already running, then the following rule applies: Do not start a new instance". From there, instead of calling the original script, you call a proxy script that just launches the scheduled task.
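A rough sketch of that scheduled-task approach (the task name and script path are placeholders; run the registration once, elevated):
# One-time registration: a task with no trigger whose settings refuse to start
# a second instance while one is already running.
$action   = New-ScheduledTaskAction -Execute 'pwsh.exe' -Argument '-NoProfile -File "C:\Scripts\script.ps1"'
$settings = New-ScheduledTaskSettingsSet -MultipleInstances IgnoreNew
Register-ScheduledTask -TaskName 'CompanyName-ProjectName-Script' -Action $action -Settings $settings
# proxy.ps1 - what callers invoke instead of running script.ps1 directly.
Start-ScheduledTask -TaskName 'CompanyName-ProjectName-Script'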

In PowerShell, pass a complex object (SmoServer) to a background job in the ArgumentList - stuck in "NotStarted"

I am trying to pass a pre-built SmoServer object to a background job, to parallelize some operations against multiple SQL Servers. However, when I try to do this, the Child job of the invoked job gets stuck in a "NotStarted" state. A very basic test:
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$SmoServer = New-Object Microsoft.SqlServer.Management.Smo.Server MySqlServer
Start-Job -Name Test -ScriptBlock {
param($SmoServer)
$SmoServer.Databases.Name
} -InitializationScript {
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO"); Import-Module SQLPS -DisableNameChecking
} -ArgumentList $SmoServer
The job starts, but the ChildJob gets stuck "NotStarted"
PS C:\Users\omrsafetyo> Get-Job Test
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
5 Test BackgroundJob Running True localhost param($SmoServer) $Smo...
PS C:\Users\omrsafetyo> Get-Job Test | select -expand childjobs
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
6 Job6 NotStarted True localhost param($SmoServer) $Smo...
I had encountered this a while ago and never found a solution. Then I came across -InitializationScript and thought that might be the silver bullet. It doesn't seem that it is.
This same behavior is true with Invoke-Command. If I just run Invoke-Command, the command works fine. However, if I run Invoke-Command -AsJob, and pass an SmoServer object, it still fails.
How do I pass these complex objects that need an assembly/module loaded up front in the ArgumentList to a background job?
PowerShell jobs are run in a separate process, and objects passed as arguments are serialized (via something like Export-CliXml or the moral equivalent). When your object is "rehydrated" in the background process, it will not be an instance of Microsoft.SqlServer.Management.Smo.Server anymore, but rather just an object that looks like one (it will have the same properties, as they were serialized in the originating process).
You can pass .NET objects to different runspaces within the same process. An easy way to do this would be to use the PSThreadJob module.
An experiment to demonstrate what happens to objects passed to background jobs:
$srcFi = [System.IO.FileInfo]::new( 'C:\windows\system32\ntdll.dll' )
$srcFi.GetType().FullName
start-job -ScriptBlock { param( $fi ) $fi.GetType().FullName } -Arg @( $srcFi ) | Receive-Job -Wait
Output:
System.IO.FileInfo
System.Management.Automation.PSObject
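For comparison, a hedged sketch of the same experiment with a thread job (this assumes the ThreadJob module is available; it ships with PowerShell 7 and can be installed from the Gallery for Windows PowerShell). Because the runspace lives in the same process, the argument is passed by reference and no serialization occurs:
$srcFi = [System.IO.FileInfo]::new( 'C:\windows\system32\ntdll.dll' )
$srcFi.GetType().FullName
# Both lines should now report System.IO.FileInfo, since the live object is handed
# to the thread job's runspace rather than being serialized and rehydrated.
Start-ThreadJob -ScriptBlock { param( $fi ) $fi.GetType().FullName } -ArgumentList @( $srcFi ) | Receive-Job -Wait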

DSC, having a simple script executed on a target node

I've created a simple MOF file:
Configuration ConfigName
{
    Node NB05
    {
        File ResName
        {
        }
    }
}
Edit 1
This is not a MOF file, but a file that has to be compiled into a MOF file. That will be the focus of another question, since this question applies nevertheless.
And I tried to apply it with the command:
PS C:\var\DSC> Start-DscConfiguration
Cmdlet Start-DscConfiguration at position 1 in the command pipeline
Please specify values for the following parameters:
Path: .\Configurations
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
1 Job1 Configuratio... Running True c1 Start-DscConfiguration
Questions
It says "runnning" but how can I determine that it has finished?
Even if I make a mistake in the config file, say I write NNNode, it doesn't give an error at all, but says "Running" as well. How is it supposed to work?
The other cmdlets associated with this cmdlet are located here:
https://msdn.microsoft.com/powershell/reference/5.1/PSDesiredStateConfiguration/Start-DscConfiguration?f=255&MSPPError=-2147217396
Get-DscConfiguration
Get-DscConfigurationStatus
Restore-DscConfiguration
Stop-DscConfiguration
Test-DscConfiguration
Update-DscConfiguration
By default, this cmdlet runs as a job in the background. If you want it to run interactively, use the -Wait parameter.
Start-DscConfiguration -Path "C:\example\configurations" -Wait
To view further information about the background job, use:
Get-Job -Name Job1
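To answer the "how can I determine that it has finished" part when it does run as a job, a couple of options (Job1 is whatever name Get-Job reports on your system):
# Block until the configuration job completes, then collect its output and errors.
Get-Job -Name Job1 | Wait-Job | Receive-Job
# Or ask the Local Configuration Manager about the most recent run.
Get-DscConfigurationStatus | Select-Object Status, Type, StartDate, DurationInSeconds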
DSC will then periodically re-check the configuration to keep the system in its desired state.
Hope this helps.
Thanks, Tim.

Powershell - Manually trigger a scheduled job

I have created a PowerShell scheduled job that I can see when I use the following command:
PS C:\WINDOWS\system32> Get-ScheduledJob -Name KillTicker | Get-JobTrigger
Id Frequency Time DaysOfWeek Enabled
-- --------- ---- ---------- -------
1 AtStartup True
As this is a startup job, I really don't fancy restarting to test it out - how do I manually start this job?
If you run Get-ScheduledJob -id 1 | Get-Member to retrieve all members of a Microsoft.PowerShell.ScheduledJob.ScheduledJobDefinition, you will see that the object exposes a method called StartJob:
(Get-ScheduledJob -id 1).StartJob()
To retrieve the result, use the Receive-Job cmdlet:
Receive-Job -id 1
I think it is worth mentioning that Start-Job and StartJob() both run the defined job in the current security context. If the job runs as a different user or accesses network resources, you might get unexpected results.
To get the same behavior use either (Get-ScheduledJob -Name xxx).RunAsTask() or Start-ScheduledTask -TaskName xxx -TaskPath xxx. The latter provides a -CimSession option and could be better for remote operations.
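Putting that together for the KillTicker job above (the task path shown is where PowerShell registers scheduled jobs by default, so treat it as an assumption worth verifying in Task Scheduler):
# Runs the job with the credentials/options it was registered with,
# rather than the current session's security context.
(Get-ScheduledJob -Name KillTicker).RunAsTask()
# Equivalent via the ScheduledTasks module; useful remotely with -CimSession.
Start-ScheduledTask -TaskName KillTicker -TaskPath '\Microsoft\Windows\PowerShell\ScheduledJobs\'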
You can use the Start-Job cmdlet and supply the name of the Scheduled Job as the
-DefinitionName parameter.
Start-Job -DefinitionName MyJobName