PowerShell novice looking to create a script to trigger notifications on missing files - powershell

Long time lurker, first time poster. I'm looking (on my own initiative) to see if there is a method by which I can check for files that we expect to receive on a daily basis and be notified via e-mail when they are missing.
Our company has what I'd call a relatively unhinged systems infrastructure, and since I arrived I've been chipping away here and there, putting in practices and processes to make our monitoring more proactive.
Specifically, in this case we receive files via FTP from a vendor that outline our Sales and other data. These files go through some validation and the data is then imported into our ERP platform. However, I am interested in putting in a check that raises an alert when a file has not been received when expected.
The last part of that requirement can potentially change; I'm not sure how specific I can get when trying to raise an alert for an expected file.
I'll preface this by stating that I'm a relative novice in this area, but there is really no one in my department any the wiser, so I've been looking into PowerShell.
I've created the following bit of code so far, which, when executed, appears to return files that have been created/last written within the last day. This would even be enough if the output were sent via e-mail; I would be able to spot quickly if an expected file is not in the list.
Get-ChildItem -Path "Path I am checking" |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }
The above returns one .csv file. I guess if I get a returned file, then I know it's been provided, and if the return is blank/zero, then I know I didn't get a file.
I've used the above for four separate checks, checking other subfolders in the structure.
To outline the folder structure:
\"App server"\"Region"\"Vendor"
There are then the following subfolders
Purchases
Sales
Tenders
VAT
Each of the above four folders then has
Incoming
Processed
I am running my checks on the Processed folder for each of the four folders outlined above.
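What I have in mind is roughly something like the sketch below, looping the single check over the four Processed folders and sending an e-mail when one comes back empty. The root path, addresses and SMTP server are just placeholders, and I have not tested the e-mail part:
$Root    = '\\AppServer\Region\Vendor'   # placeholder root path
$Folders = 'Purchases', 'Sales', 'Tenders', 'VAT'
$CutOff  = (Get-Date).AddDays(-1)

# Collect every Processed folder that has no file newer than the cut-off
$Missing = foreach ($Folder in $Folders) {
    $Processed = Join-Path -Path $Root -ChildPath "$Folder\Processed"
    $Recent = Get-ChildItem -Path $Processed -File |
        Where-Object { $_.LastWriteTime -gt $CutOff }
    if (-not $Recent) { $Processed }
}

if ($Missing) {
    # placeholder addresses and SMTP server
    $MailParams = @{
        To         = 'me@company.com'
        From       = 'alerts@company.com'
        SmtpServer = 'smtp.company.com'
        Subject    = 'Expected vendor file missing'
        Body       = ($Missing -join "`r`n")
    }
    Send-MailMessage @MailParams
}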

Maybe something like this will help you out:
Function Test-NewerFiles {
    # We use parameters as it makes things easy when we need to change things
    # CmdletBinding makes sure that we can see our 'Write-Verbose' messages if we want to
    [CmdletBinding()]
    Param (
        [String]$Path = 'C:\Users\me\Downloads\Input_Test',
        [String]$ExportFile = 'C:\Users\me\Downloads\Log_Test\Attachment.txt'
    )
    # We first save the date, then we don't need to do this every time again
    $CompareDate = (Get-Date).AddDays(-1)
    # Then we collect only the folders and check each folder for files and count them
    Get-ChildItem -Path $Path -Directory -Recurse | ForEach-Object {
        $Files = (Get-ChildItem -Path $_.FullName -File |
            Where-Object { $_.LastWriteTime -gt $CompareDate } | Measure-Object).Count
        # If we didn't find files the count is 0 and we report this
        if ($Files -eq 0) {
            Write-Verbose "No files found in folder $($_.FullName)"
            Write-Output $_.FullName
        }
        # If we found files it's ok and we don't report it
        else {
            Write-Verbose "Files found in folder $($_.FullName)"
        }
    }
}
# If you don't want to see progress messages you can remove the '-Verbose' switch
$MyNewFiles = Test-NewerFiles -Verbose
$ExportFile = 'C:\Users\me\Downloads\Log_Test\Attachment.txt'
$MyNewFiles | Out-File -FilePath $ExportFile -Encoding utf8

$MailParams = @{
    To         = 'Chuck.Norris@world.com'
    From       = 'MyScriptServer@world.com'
    SmtpServer = 'SMTPServer'
}
if ($MyNewFiles) {
    Send-MailMessage @MailParams -Priority High -Attachments $ExportFile -Body 'We found problems: check attachment for details'
}
else {
    Send-MailMessage @MailParams -Priority Low -Body 'All is ok'
}
The -Verbose switch is only used to report progress, so we can see what the function is doing while it's running. When we use this code in production we don't need those messages and can just use Test-NewerFiles instead of Test-NewerFiles -Verbose.

Related

How to find a file from files via PowerShell?

I have an Excel script that searches for files via a command.
I found this example on the forum; it says that to search for a file by name you need to write the name and append (*), but when I run it, it does not find anything:
Get-ChildItem -Path "C:\\Folder\\test\*"
What can I do to simplify the code and make it much faster? Waiting 10 minutes to find one file out of 10,000 is very long.
I have a folder with 10,000 files, and Excel, via a VBA script, finds the file in about 2-3 seconds.
But when I script it in PowerShell via
$find = Get-ChildItem -Path "C:\Folder"
for ($f = 0; $f -lt $find.Count; $f++) {
    $path_name = $find[$f].Name
    if ($path_name -eq 'test') {
        Write-Host 'success'
    }
}
it takes far too long; the script hangs for 10 minutes, does not respond, and only occasionally comes back with an answer.
How can I find a file by filter using Get-ChildItem?
To make your search faster you can use the -Filter parameter of Get-ChildItem.
$fileName = "test.txt"
$filter = "*.txt"
$status = Get-ChildItem -Path "C:\PS\" -Recurse -Filter $filter | Where-Object {$_.Name -match $fileName}
if ($status) {
Write-Host "$($status.Name) is found"
} else {
Write-Host "No such file is available"
}
You could also compare the speed of searching by using Measure-Command.
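For example, a quick side-by-side timing of the plain listing versus the filtered one might look like this (the path and filter are just placeholders):
$plain    = Measure-Command { Get-ChildItem -Path "C:\PS\" -Recurse }
$filtered = Measure-Command { Get-ChildItem -Path "C:\PS\" -Recurse -Filter "*.txt" }
"Plain listing:    $($plain.TotalSeconds) seconds"
"Filtered listing: $($filtered.TotalSeconds) seconds"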
If the disk the data is on is slow then it'll be slow no matter what you do.
If the folder is full of files then it'll also be slow depending on the amount of RAM in the system.
Fewer files per folder equals more performance, so try to split them up into several folders if possible.
Doing that may also mean you can run several Get-ChildItem calls at once (disk permitting) using PSJobs, as in the sketch below.
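A minimal sketch of that idea, assuming the files have already been split across a couple of folders (the folder names below are hypothetical):
# One background job per folder (disk permitting); hypothetical folder names
$jobs = 'C:\Folder\Part1', 'C:\Folder\Part2' | ForEach-Object {
    Start-Job -ScriptBlock {
        param($Path)
        Get-ChildItem -Path $Path -File -Filter 'test*'
    } -ArgumentList $_
}
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job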
Using several loops to take care of a related problem usually makes the whole thing run "number of loops" times as long. That's what Where-Object is for (in addition to the -Filter, -Include and -Exclude parameters of Get-ChildItem).
Console I/O takes A LOT of time. Do NOT output ANYTHING unless you have to, especially not inside loops (or cmdlets that act like loops).
For example, including basic statistics:
$StartTime = Get-Date
$FileList = Get-ChildItem -Path "C:\Folder" -File -Filter 'test'
$EndTime = Get-Date
$FileList
$AfterOutputTime = Get-Date
'Seconds taken for listing:'
($EndTime - $StartTime).TotalSeconds
'Seconds taken including output:'
($AfterOutputTime - $StartTime).TotalSeconds

Getting files in a directory that has over 7 million items using powershell

This is currently what I am trying to execute.
$folderPath = 'M:\abc\WORKFORCE\Media\Attachments'
Write-Host "Executing Script..."
foreach ($file in Get-ChildItem $folderPath -file)
{
# execute code
}
However when I execute the powershell script it freezes on me. It's been this way for an hour now. I'm assuming it might be because the directory has over 8 million items in it. Is there a more efficient way to move these items? Is waiting my only option? Or is it not possible to do this at all with powershell because of how large the directory is?
When you do not need any information except the file name, you should use [System.IO.Directory]::EnumerateFiles($folderPath, '*').
EnumerateFiles returns IEnumerable[String].
IEnumerable is a special type that can be used in foreach statements. It does not load the whole listing into memory; instead it fetches the next item only when requested, so it starts producing results almost immediately.
So, your code will be
$filesIEnumerable = [System.IO.Directory]::EnumerateFiles($folderPath,'*')
foreach ($fullName in $filesIEnumerable) {
    # code here
    $fileName = [System.IO.Path]::GetFileName($fullName)
    # more code here
}
In case you want to keep the whole list of files in memory instead of iterating over it once (for example, you need to iterate several times), EnumerateFiles is still faster and requires less memory than Get-ChildItem because it does not retrieve any extended file attributes:
$files = @([System.IO.Directory]::EnumerateFiles($folderPath,'*'))
Read more about EnumerateFiles at learn.microsoft.com.
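If you want to verify the difference on your own data, a rough timing comparison could look like this (assuming $folderPath points at the large directory):
# Rough comparison sketch; piping to Out-Null forces full enumeration in both cases
$gciTime  = Measure-Command { Get-ChildItem -Path $folderPath -File | Out-Null }
$enumTime = Measure-Command { [System.IO.Directory]::EnumerateFiles($folderPath, '*') | Out-Null }
"Get-ChildItem:  $($gciTime.TotalSeconds) seconds"
"EnumerateFiles: $($enumTime.TotalSeconds) seconds"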
Without further explanation of what the end goal of the script is, there cannot really be a complete solution to this question.
However, a tip on performance can be given.
Original script:
$folderPath = 'M:\abc\WORKFORCE\Media\Attachments'
Write-Host "Executing Script..."
foreach ($file in Get-ChildItem $folderPath -file)
{
# execute code
}
Suggested approach:
$Files = Get-ChildItem 'M:\abc\WORKFORCE\Media\Attachments' -File
$DestinationPath = 'F:\DestinationFolder'
Write-Host "Executing Script..."
$Files | ForEach-Object {
    # execute code
    # Write-Verbose "Moving $($_.Name)"
    # Move-Item -Path $_.FullName -Destination $DestinationPath
}
That being said, it looks like filimonic's answer executes faster than my suggestion.
(To expand on that, check this thread.)

Read from randomly named text files

I'm finishing a script in PowerShell and this is what I must do:
Find and retrieve all .txt files inside a folder
Read their contents (there is a number inside that must be less than 50)
If any of these files has a number greater than 50, change a flag which will allow me to send a crit message to a monitoring server.
The piece of code below is what I already have, but it's probably wrong because I haven't given any argument to Get-Content. It's probably something very simple, but I'm still getting used to PowerShell. Any suggestions? Thanks a lot.
Get-ChildItem -Path C:\temp_erase\PID -Directory -Filter *.txt |
ForEach-Object{
$warning_counter = Get-Content
if ($warning_counter -gt '50')
{
$crit_counter = 1
Write-Host "CRITICAL: Failed to kill service more than 50 times!"
}
}
but it's probably wrong because I haven't given any argument to Get-Content
Yes. That is the first issue. Have a look at Get-Help <command> and/or docs like TechNet when you are lost. For the core cmdlets you will always see examples.
Second, Get-Content returns string arrays (by default), so if you are doing a numerical comparison you need to treat the value as such.
Thirdly, you have a line break between the ForEach-Object cmdlet and its opening brace. That will land you a parsing problem and PS will prompt for the missing process block. So, changing just those points (and using -File instead of -Directory so we actually get the text files)....
Get-ChildItem -Path C:\temp_erase\PID -File -Filter *.txt | ForEach-Object {
    [int]$warning_counter = Get-Content $_.FullName
    if ($warning_counter -gt 50)
    {
        $crit_counter = 1
        Write-Host "CRITICAL: Failed to kill service more than 50 times!"
    }
}
One obvious thing missing from this is that you do not show which file triggered the message; you should update your notification/output process. You also have no logic validating the file contents, so this could easily fail, either procedurally or programmatically, on files with non-numeric contents.
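A small sketch of both points, staying close to the code above: include the file name in the message and skip files whose contents are not numeric.
Get-ChildItem -Path C:\temp_erase\PID -File -Filter *.txt | ForEach-Object {
    # Read the first line and try to parse it as a number
    $content = [string](Get-Content $_.FullName | Select-Object -First 1)
    $warning_counter = 0
    if (-not [int]::TryParse($content.Trim(), [ref]$warning_counter)) {
        Write-Host "WARNING: $($_.FullName) does not contain a number."
        return   # move on to the next file
    }
    if ($warning_counter -gt 50) {
        Write-Host "CRITICAL: $($_.FullName) - failed to kill service more than 50 times!"
    }
}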

Pass parameter from one Powershell function to another

I'm new to PowerShell and development in general. I'm trying to write a script that will email a contact once a file exceeds a certain size. I have two individual functions, both working separately (one to check the file size and one to generate a file for sendmail to use), but I can't get them to interact.
I want to execute the function CheckSize, and if the variable $ExceedsSize gets set to 1 then call the function SendMail; otherwise the script should finish with no other action.
I’ve searched through the forums but couldn’t find anything to apply to what I’m doing.
##Check file to see if it is over a particular size and then send email once threshold is reached.
param(
    [string]$SiteName = "TestSite", #Name of Site (No Spaces)
    [string]$Path = "\\NetworkPath\Directory", #Path of directory to check
    [int]$FileSizeThreshold = 10, #Size in MB that will trigger a notification email
    [string]$Contacts = "MyEmail@email.com"
)
CLS
##Creates variable $ExceedsSize based on newest file in folder.
Function CheckSize {
    IF ((GCI $Path -Filter *.txt | Sort LastWriteTime -Descending | Select-Object -First 1 | Measure-Object -Property Length -Sum).Sum / 1000000 -gt $FileSizeThreshold) {$ExceedsSize = 1}
    ELSE {$ExceedsSize = 0}
    Write-Host $ExceedsSize
}
Function SendMail {
    Param([string]$Template, [string]$Contacts, [string]$WarnTime)
    $EmailLocation = "\\NetworkPath\Scripts\File_$SiteName.txt"
    #Will Generate email from params
    New-Item $EmailLocation -type file -force -value "From: JMSIssue@emails.com`r
To: $Contacts`r
Subject: $SiteName file has exceeded the maximum file size threshold of $FileSizeThreshold MB`r`n"
    #Send Email
    #CMD /C "$SendMail\sendmail.exe -t < $EmailLocation"
}
Add this before or after your Write-Host $ExceedsSize:
return $ExceedsSize
Add this to the bottom:
$var = CheckSize
if ($var -eq 1) {
    SendMail
}
Explanation
You have two functions, but don't actually run them. The part at the bottom does that.
Your CheckSize function does not return $ExceedsSize to the rest of the script; by default the variable stays within the scope of the function. return $ExceedsSize means the value is passed back to the main script, and $var = means it is assigned to that variable.
Per the other answer, you need to return $ExceedsSize instead of using Write-Host (see here for why Write-Host is considered harmful: http://www.jsnover.com/blog/2013/12/07/write-host-considered-harmful/).
You could alternatively call the SendMail function from within the CheckSize function, e.g:
if ($ExceedsSize -eq 1){SendMail}
You will still need to call the CheckSize function somewhere also:
CheckSize
You might also want to give consideration to naming your functions in the verb-noun style of the built-in cmdlets. This really helps make their use more explicit to you and others. When choosing a verb, it's best to stick to the approved list: https://msdn.microsoft.com/en-us/library/ms714428(v=vs.85).aspx
And also to use names that are fairly unique to avoid possible conflicts.
I'd suggest something along the lines of:
Get-NewestFileSize
(although that's what it should then return)
and
Send-CCSMail

Powershell memory exhaustion using NTFSSecurity module on a deep folder traverse

I have been tasked with reporting all of the ACLs on each folder in our Shared drive structure. Added to that, I need to do a lookup on the membership of each unique group that gets returned.
I'm using the NTFSSecurity module in conjunction with the Get-ChildItem2 cmdlet to get past the 260-character path length limit. The path(s) I am traversing are many hundreds of folders deep and long since passed the 260-character limit.
I have been banging on this for a couple of weeks. My first challenge was crafting my script to do my task all at once, but now I'm thinking that's my problem... The issue at hand is resources, specifically memory exhaustion. Once the script gets into one of the deep folders, it consumes all RAM and starts swapping to disk, and I eventually run out of disk space.
Here is the script:
$csvfile = 'C:\users\user1\Documents\acl cleanup\dept2_Dir_List.csv'
foreach ($record in Import-Csv $csvfile)
{
    $Groups = Get-ChildItem2 -Directory -Path $record.FullName -Recurse | Get-NTFSAccess | where -Property AccountType -EQ -Value group
    $groups2 = $Groups | where -Property Account -NotMatch -Value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
    $groups3 = $groups2 | select Account -Unique
    $GroupMembers = ForEach ($Group in $groups3) {
        (Get-ADGroup $Group.Account.Sid | Get-ADGroupMember | select Name, @{N="GroupName";e={$Group.Account}})
    }
    $groups2 | select FullName,Account,AccessControlType,AccessRights,IsInherited | Export-Csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name).csv"
    $GroupMembers | Export-Csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name)_GroupMembers.csv"
}
NOTE: The dir list it reads in is the set of top-level folders, created from Get-ChildItem2 -Directory | Export-Csv filename.csv.
During the run, it appears not to be flushing memory properly. This is just a guess from observation. At the end of each pass through the loop the variables should be getting overwritten, I thought, but memory doesn't go down, so it looks to me like it isn't being released properly. Like I said, a guess... I have been reading about runspaces but I am confused about how to implement that with this script. Is that the right direction for this?
Thanks in advance for any assistance...!
Funny you should post about this, as I just finished a modified version of the script that I think works much better. A friend turned me on to 'function filters', which seem to work well here. I'll test it on the big directories tomorrow to see how much better the memory management is, but so far it looks great.
#Define the function 'filter' here and call it 'GetAcl'. PROCESS is the keyword that tells the function to deal with each item in the pipeline one at a time
Function GetAcl {
    PROCESS {
        Get-NTFSAccess $_ | where -Property AccountType -EQ -Value group | where -Property Account -NotMatch -Value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
    }
}

#Import the directory top-level paths
$Paths = Import-Csv 'C:\users\rknapp2\Documents\acl cleanup\dept2_Dir_List.csv'

#Process each line from the Import-Csv one at a time and run Get-ChildItem2 against it.
#Notice the second part: the results of Get-ChildItem2 are piped ('|') to the function, which, because of the type of function it is, handles each item one at a time.
#When done, pass the results to Export-Csv and send them to a file named after the path. This puts each dir into its own file.
ForEach ($Path in $Paths) {
    (Get-ChildItem2 -Path $Path.FullName -Recurse -Directory) | GetAcl | Export-Csv "C:\Users\rknapp2\Documents\acl cleanup\TestFilter\$($Path.name).csv"
}