PowerShell: query the Windows Task Scheduler for tasks that will run between specific datetimes

I've inherited a server that runs Windows scheduled tasks for several hundred various processes (mostly it kicks off custom PowerShell scripts) - the schedules for these tasks can be as frequently as every 15 minutes, or as infrequently as once per year on a specific date.
How can I query Task Scheduler using PowerShell (or really any other means) to determine what tasks, if any, will be running within a specific date-time range in the future? This is necessary so that we can, for example, schedule and perform maintenance on this server and others that it interacts with.

I should start out by mentioning this is a deceptively complex topic. There are a few reasons for this, including but not limited to:
Scheduled Tasks as objects can be very complex with widely variable schedules, repetition intervals, other trigger types, and a plethora of other configuration points.
The non-GUI tools (CLI, cmdlets, and APIs) have changed over the years, and it's not always easy to string them together to solve the complexities cited in point 1.
WARNING: This report is going to be very tough to write. Before you embark on a challenge like this you should look for any preexisting tools that may inventory or lend other transparency to the Task Scheduler.
Also, it may be better to define the maintenance window as a matter of policy. Then you can configure jobs to avoid the window. At a glance, this strikes me as much easier than writing a difficult program to assist in the cherry-picking of a time slot.
If you choose to move forward here are a few tools that may help:
Schtasks.exe: A surprisingly capable CLI tool that's been around a long time.
The TaskScheduler PowerShell module: A PowerShell module written around the scheduling COM objects. It was originally part of the Windows 7 PowerShell Pack, but these days you can get it through the PowerShell Gallery; the easiest way is with the typical module cmdlets, Find-Module and Install-Module.
The newer ScheduledTasks module: written around CIM (WMI) and installed by default in later versions of PowerShell / Windows.
CAUTION: It's easy to confuse the latter 2 tools. Not only are the names very similar, but there is overlap and similarity between the commands they contain. For example, both modules have a Get-ScheduledTask cmdlet. Two ways to deal with the confusion:
When importing a module with the Import-Module cmdlet, use the -Prefix parameter to add a prefix to the noun part of the cmdlet names, then use that prefix when calling the cmdlets thereafter.
Call cmdlets with a module-qualified name like TaskScheduler\Get-ScheduledTask
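For example, a quick sketch of both options (assuming both modules are installed; Legacy is just a prefix I chose, and <TaskName> is a placeholder):
# Option 1: import with a prefix so the cmdlet names become unambiguous
Import-Module TaskScheduler -Prefix Legacy
Get-LegacyScheduledTask -Name <TaskName>
# Option 2: skip the prefix and module-qualify each call instead
TaskScheduler\Get-ScheduledTask -Name <TaskName>
ScheduledTasks\Get-ScheduledTask -TaskName <TaskName>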
Now to get at the scheduling data. In my experience, the needed details are only exposed through the task's XML definition. Again, you're going to bump up against the complexity that comes with a wide range of scheduling and/or trigger options. You can read about the XML schema here
Here are some examples of how to access the XML data:
schtasks.exe /query /s pyexadm1 /tn <TaskName> /XML
In this case, you'll have to do additional string manipulation to isolate the XML then cast it to [XML] so you can work with it in a typical PowerShell manner. Obviously, there will be challenges to leveraging this tool more broadly. However, it's very handy to know for quick checks and work, especially where the next tool is not immediately available.
Note: if you don't supply the /TN argument, all tasks will be returned. While the next method is easier, this approach is still good to know; it will be handy while you are developing.
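A rough sketch of that string manipulation (same placeholders as above; depending on the schtasks version you may need to strip additional header or comment lines before the cast):
$raw = schtasks.exe /query /s <ComputerName> /tn <TaskName> /XML
$TaskXML = [XML](($raw | Where-Object { $_ }) -join "`n")   # drop empty lines, then cast
$TaskXML.Task.Triggers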
The next example uses the older TaskScheduler module (#2 above):
$TaskXML = [XML](TaskScheduler\Get-ScheduledTask -ComputerName <ComputerName> -Name <TaskName>).XML
Note: the above assumes no prefix was used, so you must cite the source module to prevent confusion with the ScheduledTasks module.
This example loads the XML text and converts it to an XmlDocument object in a single line. Then you can access data about the task like below:
$TaskXML.Task.Triggers.CalendarTrigger
This may yield output like:
StartBoundary         Enabled  ScheduleByWeek
-------------         -------  --------------
2020-09-14T08:00:00   true     ScheduleByWeek
You can run this en masse by leveraging the pipeline, which might look something like below:
$XMLTaskData =
    TaskScheduler\Get-ScheduledTask -ComputerName <ComputerName> -Recurse |
    ForEach-Object { [XML]$_.XML }
In the above pipeline example, the resulting $XMLTaskData is an array, each element of which is a task's XML definition.
Note the use of the -Recurse switch parameter; given the high number of tasks, I wouldn't be surprised if they were organized into subfolders.
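As a rough illustration of where this is headed (a sketch only: it reads CalendarTrigger StartBoundary values and ignores repetition intervals, other trigger types, and so on; the window dates are just examples), you could sift the collected definitions for triggers that start inside a window:
$windowStart = Get-Date '2020-09-14T00:00:00'
$windowEnd   = Get-Date '2020-09-21T00:00:00'

$XMLTaskData | ForEach-Object {
    $task = $_
    $task.Task.Triggers.CalendarTrigger |
        Where-Object { $_.StartBoundary -and
                       [datetime]($_.StartBoundary) -ge $windowStart -and
                       [datetime]($_.StartBoundary) -lt $windowEnd } |
        ForEach-Object {
            [pscustomobject]@{
                Task          = $task.Task.RegistrationInfo.URI   # usually present in exported definitions
                StartBoundary = $_.StartBoundary
            }
        }
}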
Similarly, you can also use the Export-ScheduledTask cmdlet from the ScheduledTasks module:
$TaskXML = [XML](Export-ScheduledTask -CimSession <ComputerName> -TaskName <TaskName>)
And you can leverage the pipeline like this:
$XMLTaskData =
    Get-ScheduledTask -CimSession <ComputerName> |
    Export-ScheduledTask |
    ForEach-Object { [XML]$_ }
Like the other piped example, this results in an array of XML task definitions.
Note: in this case there is no -Recurse parameter, but you can cite specific task paths.
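For example (\MyJobs is a hypothetical task folder):
Get-ScheduledTask -CimSession <ComputerName> -TaskPath '\MyJobs\*' |
    Export-ScheduledTask |
    ForEach-Object { [XML]$_ }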
With any of these approaches you obviously need some familiarity with working with XML objects in PowerShell, but there are tons of tutorials or other resources for that.
Again, the complexity here is in dealing with the many trigger types and scheduling paradigms. On your road to a minimum viable program (MVP), you may want to use these techniques to inventory the existing tasks. That can help you prioritize your development process.
A final point: knowing when a task is going to start may be quite different from knowing when it's running. For example, a task may start at 1:00 PM, but the duration of the job is variable and unaccounted for. That strikes me as very difficult to contend with. You may need another procedure to look for task completion events in the event logs. You may also need to consider execution time limits, which can be found in the XML data.
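For instance, the limit sits under the Settings node of the definition (per the schema referenced above) and comes back as an ISO 8601 duration:
$TaskXML.Task.Settings.ExecutionTimeLimit   # e.g. PT1H for a one-hour limit, PT72H for the default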

Here is a script to list each task's next run time:
Get-ScheduledTask |
    ForEach-Object { Get-ScheduledTaskInfo $_ } |
    Where-Object { $null -ne $_.NextRunTime } |
    Select-Object TaskName, NextRunTime |
    Sort-Object -Property NextRunTime
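A variation that narrows this to a window (note it only reflects each task's single next occurrence, not every occurrence inside the range; the dates are just an example):
$windowStart = Get-Date '2021-01-16 00:00'
$windowEnd   = Get-Date '2021-01-17 06:00'

Get-ScheduledTask |
    Get-ScheduledTaskInfo |
    Where-Object { $_.NextRunTime -ge $windowStart -and $_.NextRunTime -le $windowEnd } |
    Select-Object TaskName, NextRunTime |
    Sort-Object NextRunTime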

Related

How to ensure that a logging Cmdlet is called for every exception in a PowerShell script module?

In a PowerShell (5.1 and later) Script Module, I want to ensure that every Script and System exception which is thrown calls a logging Cmdlet (e.g. Write-Log). I know that I can wrap all code within the module Cmdlets into try/catch blocks, but my preferred solution would be to use trap for all exceptions thrown while executing Cmdlets of the Module
trap { Write-Log -Level Critical -ErrorRecord $_ }
Using the above statement works as intended if I add it to each Cmdlet inside the module, but I would like to only have one trap statement which catches all exceptions thrown by Cmdlets of the Module to not replicate code and also ensure that I do not miss the statement in any Cmdlet. Is this possible?
What I would do is this.
Set up multiple try/catch blocks as needed.
Group multiple cmdlet calls under the same block when you can. As you mentioned, we don't want to group everything under one giant try/catch block, but related calls can still go together.
Design your in-module functions as advanced functions, so you can make use of the common parameters, such as -ErrorAction.
Set $PSDefaultParameterValues = @{'*:ErrorAction'='Stop'} so that errors from all cmdlets that support -ErrorAction become terminating and don't fall through the try/catch block.
(You could also manually set -ErrorAction Stop everywhere, but since you want this as the default, it makes sense to do it that way. In any case, you don't want to touch $ErrorActionPreference, as it has global scope and your users won't like you if you change defaults outside of the module scope.)
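A minimal sketch of how those pieces might fit together inside the module (Write-Log is your own logging cmdlet from the question; Invoke-Something and the paths are hypothetical):
# In the .psm1 -- module scope only, so callers' defaults are untouched
$PSDefaultParameterValues = @{ '*:ErrorAction' = 'Stop' }

function Invoke-Something {
    [CmdletBinding()]   # advanced function: gets -ErrorAction and the other common parameters
    param()

    try {
        # related calls grouped under one block; both now raise terminating errors
        Get-Item -Path 'C:\DoesNotExist'
        Get-Content -Path 'C:\AlsoMissing'
    }
    catch {
        Write-Log -Level Critical -ErrorRecord $_
        throw   # optional: rethrow after logging
    }
}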
You can also redirect the error stream to a file so instead of showing up in the output, it is written to a file.
Here's a self contained example of this:
& {
    Write-Warning "hello"
    Write-Error "hello"
    Write-Output "hi"
} 2>> 'C:\Temp\redirection.log'
See: about_Redirection for more on this.
(Now I am wondering if you can redirect the stream to something other than a file.)
Additional note
External modules can help with logging too and might provide a more streamlined approach.
I am not familiar with any of them though.
I know PSFramework has some interesting stuff regarding logging.
You can take a look and experiment to see if it fits your needs.
Otherwise, you can search the PowerShell Gallery for logging modules
(this search is far from perfect, but some candidates might be interesting):
Find-Module *logging* | Select-Object Name, Description, PublishedDate, ProjectUri | Sort-Object PublishedDate -Descending

Get all references to a given PowerShell module

Is there a way to find a list of script files that reference a given module (.psm1)? In other words, get all files that, in the script code, use at least 1 of the cmdlets defined in the module.
Obviously because of PowerShell 3.0 and above, most of my script files don't have an explicit Import-Module MODULE_NAME in the code somewhere, so I can't use that text to search on.
I know I can use Get-ChildItem -Path '...' -Recurse | Select-String 'TextToSearchFor' to search for a particular string inside of files, but that's not the same as searching for any reference to any cmdlet of a module. I could do a search for every single cmdlet in my module, but I was wondering if there is a better way.
Clarification: I'm only looking inside of a controlled environment where I have all the scripts in one file location.
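A rough sketch of that brute-force search, should you go that route (MyModule and C:\Scripts are placeholders):
$names = (Get-Command -Module MyModule).Name
Get-ChildItem -Path 'C:\Scripts' -Filter *.ps1 -Recurse |
    Select-String -Pattern ($names -join '|') |
    Select-Object -ExpandProperty Path -Unique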
Depending on the scenario, the callstack could be interesting to play around with. In that case you need to modify the functions which you want to find out about to gather information about the callstack at runtime and log it somewhere. Over time you might have enough logs to make some good assumptions.
function yourfunction {
    $stack = Get-PSCallStack
    if ($stack.Count -gt 1) {
        $stack[1]   # log this to a file or whatever you need
    }
}
This might not work at all in your scenario, but I thought I'd throw it in there as an option.

Microsoft's Consistency in PowerShell CmdLet Parameter Naming

Let's say I wrote a PowerShell script that includes this command:
Get-ChildItem -Recurse
But instead I wrote:
Get-ChildItem -Re
To save time. After some time passes and I upgrade my PowerShell version, suppose Microsoft decides to add a parameter to Get-ChildItem called "-Return" that, for example, returns True or False depending on whether any items are found.
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected? I understand Microsoft's attempt to save my typing time, but this is my concern, and therefore I will probably always try to write the complete parameter name.
Unless of course you know something I don't. Thank you for your insight!
This sounds more like a rant than a question, but to answer:
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected?
Yes!
You should always use the full parameter names in scripts (or any other snippet of reusable code).
Automatic resolution of partial parameter names, aliases, and other shortcuts is great for convenience when using PowerShell interactively. It lets us fire up powershell.exe and do:
ls -re *.ps1|% FullName
when we want to find the path to all scripts in the profile. Great for exploration!
But if I were to incorporate that functionality into a script I would do:
Get-ChildItem -Path $Home -Filter *.ps1 -Recurse |Select-Object -ExpandProperty FullName
not just for the reasons you mentioned, but also for consistency and readability - if a colleague of mine comes along and maybe isn't familiar with the shortcuts I'm using, he'll still be able to discern the meaning and expected output from the pipeline.
Note: There are currently three open issues on GitHub to add warning rules for this in PSScriptAnalyzer - I'm sure the project maintainers would love a hand with this :-)

Is the PowerShell Pipeline a continuous feed?

Do the items being sent into the pipeline from a CmdLet get passed into the next CmdLet immediately, or do they go into an internal PS 'accumulator' that stores them up until the current CmdLet completes and then feeds them into the next?
Is there a parameter that can configure continuous mode? If a CmdLet takes single objects, it can be called as data comes into the pipeline, but if a CmdLet takes an array, perhaps PS accumulates and sends the whole array in. Sorting needs a complete set, for example.
I'm interested in writing long-running, open-ended, streaming style CmdLets.
Thanks. I can't find any in-depth discussion on the pipeline.
First part: not necessarily. Items piped into an ordinary function (one without a process block) are not passed on immediately; instead they go into a special variable called $input.
In that case $input is populated with the incoming pipeline objects, and PowerShell delays executing the function body until all of the incoming pipeline data has arrived.
Second part: if you'd like to continuously process your data instead of waiting for it to come in and be stored in $input, you can use the filter keyword in place of the function keyword when writing your functions/filters (filters behave similarly to functions except they don't wait for data, but use it as it becomes available).
e.g.
filter Get-OddEven {
    $x = $_ % 2
    # ... other logic here
}
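(For comparison: a function gets the same streaming behavior if you give it an explicit process block. A quick sketch:)
function Get-OddEven2 {
    process {
        if ($_ % 2) { "$_ is odd" } else { "$_ is even" }   # runs once per incoming object
    }
}

1..4 | Get-OddEven2   # output is emitted as each number arrives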
Hope this helps. Lest I plagiarize, though: all of this comes from the book "Microsoft Windows PowerShell Programming for the Absolute Beginner" by Jerry Lee Ford, Jr., should you want to take a further look.

Are there good references for moving from Perl to Powershell?

I hate to say it, but powershell is really annoying me. I just cannot seem to get my mind around it. I have an O'Reilly book on the topic, and I can see how to do some extremely powerful stuff -- it's the easy stuff I can't seem to get right.
Case in point: Iterate across a list of filenames.
In CMD:
for /F %x in ('dir EXPRESSION') do (
    arbitrary-action %x
)
In Perl:
@foo = glob("*");
foreach (@foo)
{
    arbitrary-command $_ ;
}
In Powershell:
I'm dumbfounded. I can't seem to get it right.
But I am not sending this post so somebody can tell me the answer. I don't want the answer to this simple question; I want to know how to figure it out, and Google/Bing searches are just not cutting it. The get-help functionality in PowerShell is nice, but it's still not enough.
I've been programming for 20 years.
I've learned BASIC, Fortran, C, Pascal, Perl, C++, C#, BASH and CMD scripting...And never have I had the trouble I'm having with Powershell.
Are there no references "out there" for migrating from Perl to Powershell? It seems like such a straightforward thing to publish, but I have yet to find one.
Any help would be appreciated. Thanks.
Update:
Okay, so maybe this wasn't the best example to post.
I think I was thrown off by the fact that when I tried gci interactively, I got a directory listing, whereas what I wanted was an array of strings.
I took a leap of faith and tried:
foreach ($foo in gci "*") {
    echo $foo;
}
And yeah, it worked. And yes, I can continue to do searches to piece my way through. I guess I was just hoping to find a guide that makes use of the similarity to languages I already know. I know that Microsoft published a VBScript-to-Powershell guide, so I was hoping for a Perl equivalent.
Thanks again
I've never seen such a guide. I've seen something to help people going from VBScript to PowerShell.
Bruce Payette's PowerShell in Action does have a few pages on PowerShell vs X scripting language, but that won't cut it for a conversion guide.
Now, there was a site out there that had all kinds of constructs in multiple languages, thus providing a task, and then going about solving it in all kinds of languages based on answers from the community... Anyone know what I'm talking about?
I don't know of any good Perl to Powershell comparisons but I can answer your secondary question.
$files = get-childitem "c:\test\" -filter *.dll
foreach ($file in $files) { $file.Name }
There are two different ways you can express a foreach loop in Powershell.
You can pipe an array of objects into foreach and $_ becomes the current object on each iteration.
0,1,2,3 | foreach { $_ }
Alternatively, you can pass a variable to iterate over.
foreach ($num in 0,1,2,3) { $num }
Output in both cases.
0
1
2
3
Like he said himself, he realized the potential to do some very powerful things using PowerShell. I recently started using it myself, and the ease with which things can be done is astounding: just a few lines is all it takes to accomplish things that in Python would have taken me some extra workarounds.
I'm curious: do you have the PowerShell Cookbook? I thought that, since I had programming experience, it would be the best way to quickly learn PowerShell. This turned out to be a poor assumption. Even though the book was really good, I was struggling because I needed a book structured more for learning than for reference.
The two best free online ebooks I found are:
https://blogs.technet.com/chitpro-de/archive/2007/05/10/english-version-of-windows-powershell-course-book-available-for-download.aspx
http://powershell.com/cs/blogs/ebook/
I'm still looking for a good print book.
I think finding a Perl to PowerShell guide is going to be difficult. It is actually more accurate to compare PowerShell to BASh and the C Shell than to Perl. I think what makes learning PowerShell difficult from most of the available books is that they are aimed at system admins, not programmers. I recommend "PowerShell in Action" for the best coverage of PowerShell as a general purpose programming language.
The other thing you need to do is immediately embrace the core principle of PowerShell -- you are dealing with objects, not text. With BASh and the other Unix shells, it's all about text manipulation, and while Perl can do objects, its roots are still very much in the Unix shell and command line utilities (grep, sed, awk, etc.).
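For instance, a quick illustration of working with typed object properties rather than parsed text:
Get-Process | Where-Object { $_.WorkingSet64 -gt 100MB } | Sort-Object WorkingSet64 -Descending | Select-Object Name, Id, WorkingSet64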
Larry Wall stole a lot of great ideas from other languages when he created Perl. PowerShell has done the same, using the POSIX shell as its starting point. You can see a lot of Perl in PowerShell, as well as other languages. I think the best way to learn PowerShell is by having a PowerShell window in front of you while reading "PowerShell in Action" which will help you get into the PowerShell way of thinking about how it does objects. It is easy to interactively enter code snippets in a PowerShell window and examine the properties and methods available within the objects returned by the commands.
BTW -- if you are used to using BASh with the default command line editing features, put the following command in your PowerShell $PROFILE:
Set-PSReadlineOption -editmode Emacs
Do this and you'll feel right at home. Now that I can run PowerShell on Linux and the Mac, I'm not sure what I will ever need BASh for again.
Funny... I had the same issues when I started with PowerShell, but after using PowerShell now for a couple of months, I have dumped Perl like ugly Sally after the Prom. There are several great ways of using foreach to loop through a list of objects. Let's look at the following example, where I have a number of services I want to make sure are running on my Windows Server (the services all have display names starting with Equal). I can do this with a one-liner as follows:
Get-Service | Where-Object { $_.DisplayName -like "Equal*" } | ForEach-Object {
    if ($_.Status -eq "Stopped") {
        Write-Host "`nRestarting..."
        Write-Host $_.DisplayName
        Start-Service $_.Name
    }
}
The first part of the command is the Get-Service command, filtering for services whose display name starts with Equal. The next part of the one-liner really shows the beauty of PowerShell. The Get-Service command returns a list of objects, which can then be acted upon by piping them into a foreach loop. Note I did not have to declare any variables or an array to hold the objects. The current object is available in the default $_ variable, from which I can pull out properties like Name and Status. For each returned object (service), I check whether its status is Stopped, and if it is, the service is restarted with the Start-Service command. I could do something similar with Perl, but I would have to screen-scrape and place the values into an array, then foreach over the array in a similar manner as I did above. The screen scraping would involve grep with a regexp, which adds lines of code. Because PowerShell commands produce objects with properties, I can quickly drill down to the properties I am looking for without the screen-scraping hassles.