I have created a powershell job that I can see when I use the following command:
PS C:\WINDOWS\system32> Get-ScheduledJob -Name KillTicker | Get-JobTrigger
Id Frequency Time DaysOfWeek Enabled
-- --------- ---- ---------- -------
1 AtStartup True
As this is a startup job I really don't fancy restarting to test it out - how do I manually start this job?
If you run Get-ScheduledJob -id 1 | Get-Member to retrieve all members of a Microsoft.PowerShell.ScheduledJob.ScheduledJobDefinition, you will see that the object exposes a method called StartJob:
(Get-ScheduledJob -id 1).StartJob()
To retrieve the result, use the Receive-Job cmdlet:
Receive-Job -id 1
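Note that the new job instance is not necessarily Id 1 in your session; if it differs, here is a sketch that retrieves the output by job name instead (using the KillTicker name from the question; the PSScheduledJob module must be imported so the Job cmdlets can see scheduled-job instances):
Import-Module PSScheduledJob
Get-Job -Name KillTicker | Receive-Job -Keep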
It is worth mentioning that Start-Job and StartJob() both run the defined job in the current security context. If the job is configured to run as a different user or accesses network resources, you might get unexpected results.
To run the job the way Task Scheduler would, with the credentials and options defined on the job, use either (Get-ScheduledJob -Name xxx).RunAsTask() or Start-ScheduledTask -TaskName xxx -TaskPath xxx. The latter provides a -CimSession parameter and can be better for remote operations.
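For example, a sketch that starts the job through Task Scheduler itself (the task name is assumed from the question; scheduled jobs are registered under the path shown by default):
Start-ScheduledTask -TaskPath '\Microsoft\Windows\PowerShell\ScheduledJobs\' -TaskName 'KillTicker'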
You can use the Start-Job cmdlet and supply the name of the Scheduled Job as the
-DefinitionName parameter.
Start-Job -DefinitionName MyJobName
I am trying to pass a pre-built SmoServer object to a background job, to parallelize some operations against multiple SQL Servers. However, when I try to do this, the Child job of the invoked job gets stuck in a "NotStarted" state. A very basic test:
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$SmoServer = New-Object Microsoft.SqlServer.Management.Smo.Server MySqlServer
Start-Job -Name Test -ScriptBlock {
param($SmoServer)
$SmoServer.Databases.Name
} -InitializationScript {
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO"); Import-Module SQLPS -DisableNameChecking
} -ArgumentList $SmoServer
The job starts, but the ChildJob gets stuck "NotStarted"
PS C:\Users\omrsafetyo> Get-Job Test
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
5 Test BackgroundJob Running True localhost param($SmoServer) $Smo...
PS C:\Users\omrsafetyo> Get-Job Test | select -expand childjobs
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
6 Job6 NotStarted True localhost param($SmoServer) $Smo...
I had encountered this a while ago and never found a solution. Then I came across -InitializationScript and thought that might be the silver bullet. It doesn't seem to be.
This same behavior is true with Invoke-Command. If I just run Invoke-Command, the command works fine. However, if I run Invoke-Command -AsJob, and pass an SmoServer object, it still fails.
How do I pass these complex objects, which need an assembly/module loaded up front, to a background job via -ArgumentList?
PowerShell jobs are run in a separate process, and objects passed as arguments are serialized (via something like Export-CliXml or the moral equivalent). When your object is "rehydrated" in the background process, it will not be an instance of Microsoft.SqlServer.Management.Smo.Server anymore, but rather just an object that looks like one (it will have the same properties, as they were serialized in the originating process).
You can pass .NET objects to different runspaces within the same process. An easy way to do this would be to use the PSThreadJob module.
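A minimal sketch of that approach, assuming the ThreadJob module is installed (Install-Module ThreadJob). Because thread jobs share the calling process, the live SMO object is passed by reference rather than serialized and rehydrated:
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$SmoServer = New-Object Microsoft.SqlServer.Management.Smo.Server MySqlServer
Start-ThreadJob -ScriptBlock {
    param($SmoServer)
    $SmoServer.Databases.Name    # operates on the real SMO object, not a deserialized copy
} -ArgumentList $SmoServer | Receive-Job -Wait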
An experiment to demonstrate what happens to objects passed to background jobs:
$srcFi = [System.IO.FileInfo]::new( 'C:\windows\system32\ntdll.dll' )
$srcFi.GetType().FullName
start-job -ScriptBlock { param( $fi ) $fi.GetType().FullName } -Arg @( $srcFi ) | Receive-Job -Wait
Output:
System.IO.FileInfo
System.Management.Automation.PSObject
I'm trying to understand what's going on here in Powershell (v2.0, if important). I'm capturing the results of a command to a variable and when I write it to the console, I'm not getting the results I expect. Everything but the output is functioning as expected.
This is an MCVE that acts in the same way as a script that I wrote. I've just broken it down so that I can provide commentary on what's happening, where it's not working the way I think it should work, and what I think may be happening.
In this first snippet, I'm validating the status of the service MyService
on computer svr0123. This gives the output that I'm expecting.
PS C:\Temp> Get-Service -Name MyService -CN svr0123
Status Name DisplayName
------ ---- -----------
Stopped MyService My_Service_Display_Name
PS C:\Temp>
In this second snippet, I'm doing the same, only assigning the output to
$results with the intention of restarting any stopped services. Again, this
gives the output I'm expecting.
PS C:\Temp> $results = Get-Service -Name MyService -CN svr0123
PS C:\Temp> Write-Output $results
Status Name DisplayName
------ ---- -----------
Stopped MyService My_Service_Display_Name
PS C:\Temp>
Finally, I'm restarting the service, then writing the contents of $results
to the console. This does not function as expected. I would anticipate
that the contents of $results would be the same as the previous two outputs,
but instead I get:
PS C:\Temp> $results | Where { $_.Status -eq "Stopped" } | Set-Service -Status Running
PS C:\Temp> Write-Output $results
Status Name DisplayName
------ ---- -----------
Running MyService My_Service_Display_Name
PS C:\Temp>
This is incorrect unless each time I reference the contents of
$results it is calling the Get-Service command again, which is counterintuitive. If that's the case, it
appears that I'm not storing the output of the command, but rather I'm
storing an alias to the command. If I write the contents of $results to the console before doing the restart, everything outputs as expected.
This is a trivial fix, but I'm trying to understand the "Why" behind what I'm observing. My questions are:
Is this, in fact, what is occurring? Where in the Powershell documentation
can I learn more about this?
If this is what is occurring, is there a way that I can just store the
output so that I'm not incurring multiple calls? It's trivial in this
case, but my script will be used on a busy network and may at times have
to query hundreds of servers in a given run.
When you call Get-Service -Name MyService -CN svr0123 it is returning a ServiceController object, not just the text of the output.
So, when you call this line:
$results = Get-Service -Name MyService -CN svr0123
$results is not just a string variable containing the text output of the command. It is actually a variable of type ServiceController.
Run this to see:
$results.GetType()
You will get this:
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True False ServiceController System.ComponentModel.Component
So, the second time you execute Write-Output $results, the service has been started and the $results variable is displaying the current status of the service.
I don't think it re-calls Get-Service when you display the service object. I think the $service object has an internal state, which is a text value, and when you pipe it to Set-Service that state value gets changed as well as the service actually being started/stopped.
Trying to change it directly with $service.Status = "Running" generates an error because the property is read-only, so this change could be happening through the service object's own $service.Start() and $service.Stop() methods (visible from $service | Get-Member).
Supporting evidence from some quick tests:
I run $service = get-service testsvc; $service and see the state is Running; then I go to Control Panel and stop the service there, and the $service state does not change.
I call $service.Start() directly to restart the service, and I get an exception (possibly because my PowerShell is not running as an Admin) so the service does not actually start running, however $service.Status does change (incorrectly) to say the service is running.
This disconnect between the reported status and the actual status convinces me, as does how implausible/impractical it would be for every object to "know" how it was generated and arbitrarily re-run that code (what if it took a 30-minute query to generate it?), but I don't know for sure what the interactions are.
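A quick way to see the cached state for yourself, a sketch using the .NET ServiceController API (the service name is illustrative):
$service = Get-Service -Name Spooler    # any service you can observe works here
$service.Status                         # cached value from when the object was created
# stop or start the service outside this session (e.g. services.msc), then:
$service.Status                         # still shows the old, cached value
$service.Refresh()                      # re-reads the state from the Service Control Manager
$service.Status                         # now reflects the actual state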
Somehow I can't see scheduled jobs created by another user with Get-ScheduledJob (I can see them using Task Scheduler under powershell\scheduledjobs). What is the trick? PowerShell v3.
I've got a solution for you.
As you noticed, you can see all scheduled jobs in Task Scheduler. So I run:
Get-ScheduledTask
and then
Get-ScheduledTask -TaskPath "\Microsoft\Windows\PowerShell\ScheduledJobs\"
So you can see all your PowerShell jobs. If you really want to retrieve ScheduledJob objects, you can run:
Get-ScheduledTask -TaskPath "\Microsoft\Windows\PowerShell\ScheduledJobs\" | Select-Object -ExpandProperty Actions | Select-Object -ExpandProperty Arguments | % {$_ -match '^.*= (.*);.*$'; Invoke-Expression $Matches[1]}
I just added some glue to retrieve the job definition from the launch arguments of each task.
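The same glue, spread over several lines for readability (behaviour unchanged apart from suppressing the boolean that -match emits):
Get-ScheduledTask -TaskPath "\Microsoft\Windows\PowerShell\ScheduledJobs\" |
    Select-Object -ExpandProperty Actions |
    Select-Object -ExpandProperty Arguments |
    ForEach-Object {
        if ($_ -match '^.*= (.*);.*$') {
            # the captured expression re-creates the ScheduledJobDefinition
            Invoke-Expression $Matches[1]
        }
    }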
I am trying to manage instances of scheduled jobs using the Job cmdlets such as Get-Job and Wait-Job. I am having problems and have the feeling that I am missing something fundamental.
I have found that I should be importing the PSScheduledJob module if I want to work with scheduled job instances using the Job cmdlets.
NOTE: To use the Job cmdlets on instances of scheduled jobs, the PSScheduledJob module must be imported into the session. To import the PSScheduledJob module, type "Import-Module PSScheduledJob" (without quotation marks) or use any Scheduled Job cmdlet, such as Get-ScheduledJob.
https://technet.microsoft.com/en-us/library/hh850536.aspx
I also found that despite what I expected, the Id property on a job instance does not identify it globally - but rather within a single Powershell session.
The ID is an integer that uniquely identifies the job within the current session. It is easier to remember and to type than the instance ID, but it is unique only within the current session.
https://technet.microsoft.com/en-us/library/hh849693.aspx
The InstanceId is supposed to be globally unique which sounds good...
An instance ID is a GUID that uniquely identifies the job on the
computer.
https://technet.microsoft.com/en-us/library/hh849693.aspx
...but I seem to be getting the same job instance returning different InstanceIds. The following script gets the InstanceIds for jobs both directly and via remoting.
Import-Module PSScheduledJob;
$localInstanceIds = ( get-job -Name $jobName).InstanceId.Guid
$remoteInstanceIds = (invoke-command -ComputerName . -ArgumentList $jobName -ScriptBlock {
param([string]$jobName)
Import-Module PSScheduledJob;
(get-job -Name Midwinter.AdviceOS.Reports.SsfsTestA).InstanceId.Guid
})
Compare-Object $localInstanceIds $remoteInstanceIds
Note in the output that there are two jobs returned from each method but that none of the InstanceIds match up.
InputObject SideIndicator
----------- -------------
780df0f3-bfc1-4fef-80f5-730e8cbcc548 =>
44109202-cb9b-43cb-8447-7e48905f10a7 =>
4fe8efaa-3107-43cd-a55e-b457b3be5993 <=
ad3bac3f-9f80-4ccd-b8d9-60c1b49fcf6a <=
I also found that the scheduled job instances are saved on the file system within the profile of the user that created them.
When you create a scheduled job, Windows Powershell creates a
directory for the scheduled job in the
$home\AppData\Local\Microsoft\Windows\PowerShell\ScheduledJobs
directory on the local computer. The directory name is the same as the
job name.
https://technet.microsoft.com/en-us/library/hh849673.aspx
The following script retrieves all of the InstanceIds from the Status.xml files for a given job from within that path and compares them with the InstanceIds returned using the Get-Job cmdlet.
Import-Module PSScheduledJob;
$paths = "C:\Users\$userName\AppData\Local\Microsoft\Windows\PowerShell\ScheduledJobs\$jobName\Output\*\Status.xml";
$namespace = @{ns = 'http://schemas.datacontract.org/2004/07/Microsoft.PowerShell.ScheduledJob'};
$instanceIdsFromFileSystem = (Select-Xml -Path $paths -XPath '/ns:StatusInfo/Status_InstanceId/text()' -Namespace $namespace).Node.Value
$instanceIdsFromCmdlet = (Get-Job -Name $jobName).InstanceId
compare-object $instanceIdsFromFileSystem $instanceIdsFromCmdlet -IncludeEqual
The output again shows no matches.
InputObject SideIndicator
----------- -------------
4fe8efaa-3107-43cd-a55e-b457b3be5993 =>
ad3bac3f-9f80-4ccd-b8d9-60c1b49fcf6a =>
e81731dd-4159-476d-937e-7a066d082eb6 <=
5ea34723-ab39-4949-b302-c4be1b1588bb <=
Can somebody please shed some light on this for me?
InstanceId is a GUID that is generated at runtime - it does identify the Job ("globally" on the host, but not universally), but does not identify the Job's contents.
Any Job you execute will have a random GUID assigned; they are not designed to identify a Job persistently across sessions, hosts, or even individual executions. If you execute the same Job twice, the two runs will have different InstanceId properties. This is by design.
If you want to make a permanent identifier for a Job, you can use the Name property.
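If you need to line up scheduled job instances across sessions, here is a sketch that keys on the persistent Name (plus the instance start time, which should be rehydrated from the on-disk results) instead of InstanceId; $jobName is the same variable used in your scripts:
Import-Module PSScheduledJob
Get-Job -Name $jobName |
    Select-Object Name, PSBeginTime, State |
    Sort-Object PSBeginTime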
I have a small Powershell script that is used to shut down my virtual machines in event of an extended power outage. It takes a specific VM object and forces a shutdown.
Function DirtyShutdown
{
    param( [VMware.VimAutomation.ViCore.Impl.V1.Inventory.VirtualMachineImpl]$VM )
    $VM | Stop-VM -Confirm:$false
}
I would like to speed up this process using the start-job command to run all these tasks in parallel. I have tried using several variants including the following which I believe to be correct.
Start-Job -InputObject $VM -ScriptBlock{ $input | Shutdown-VMGuest -Confirm:$false }
Based on the Receive-Job output, it appears the problem is that the snap-in in use (added before the above function is called) is not loaded in the context of Start-Job.
What is the correct syntax to make this happen?
While I appreciate the desire to use PowerShell v2's job subsystem for this task, note that vCenter has a built-in job system which you can take advantage of here. Most PowerCLI cmdlets which perform a change to your environment have a RunAsync parameter. To know which ones, run this piece of PowerShell code:
get-help * -parameter runasync
The RunAsync parameter takes your command(s) and queues them up in vCenter. The cmdlet returns a task object and immediately hands control back to your script.
To turn this into an answer in your case, simply append -RunAsync to the end of your Stop-VM command, like so:
$VM | Stop-VM -Confirm:$false -RunAsync
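If you want your script to block until the queued shutdowns finish, a sketch that collects the task objects and waits on them with PowerCLI's Wait-Task (the VM names are illustrative):
$tasks = Get-VM -Name 'vm1','vm2' | Stop-VM -Confirm:$false -RunAsync
Wait-Task -Task $tasks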
Each time you start a job, PowerShell creates a new runspace. This means a new environment that you may need to initialize, and that includes loading snap-ins and connecting to your VI Server. Start-Job has a parameter that you can use here called InitializationScript. Try something like this:
Start-Job -InitializationScript { Add-PSSnapin VMware.VimAutomation.Core } -ScriptBlock {
    Connect-VIServer myserver
    Get-VM foo | Stop-VM
}