Summing up average info with PowerShell - powershell

I need to calculate the percentage of my VMs that completed their backups successfully, but on a weekly basis. I'm pretty new to all of this and haven't had any courses or training in PowerShell.
It already works daily, but what I want is to sum everything up and get a percentage of all the VMs that completed their backups.
I want the script to run every 24 hours, build up a weekly report, and every 7 days send a mail with the results. I've already done the mail part, but I don't know how to do the rest.
Edit
I already wrote the script that calculates the daily average:
$success_rate = 100 - ($nbckp_vms * 100 / $total_vms)
But now that I have 7 days, I want to repeat this calculation 7 times, save each day's result in a .txt file and then, on the 7th day, work out the success rate for the whole week.
So, of course, I know it's something like "all the results / number of results * 100" or something like that, but I can't actually make this work in my PowerShell script.
I get this information from this part of the script:
# Check backup
$body = "*** VMs not backed up last night ***" + "`r`n" + "`r`n"
$total_vms = 0
$nbckp_vms = 0
foreach ($i in $csv1) {
    $total_vms++
    $VM = $i.VM
    $backup = $i.backup
    $today = Get-Date -Format "M/d/yyyy"
    $yesterday = (Get-Date).AddDays(-1).ToString("M/d/yyyy")
    try {
        if ($backup -notlike "*$yesterday*" -and `
            $backup -notlike "*$today*" -and `
            $backup -notlike "No backup*" -and `
            $backup -notlike "TiNa backup*"
        ) {
            #Write-Output "$VM have not been backuped last night."
            $nbckp_vms++
            $body = $body + "$VM" + "`r`n"
        }
    } catch {
    }
}
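For the "save the result each day in a .txt file" part, here is a minimal sketch of what the end of the daily script could look like; the file path and the one-value-per-line layout match the valeurs.txt file used later in this post, but are otherwise assumptions:

# Daily success rate, as computed above
$success_rate = 100 - ($nbckp_vms * 100 / $total_vms)

# Append today's rate (one numeric value per line) to the collection file
Add-Content -Path "E:\PS\Malik\valeurs.txt" -Value ([math]::Round($success_rate, 2))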
What I want is to send myself a weekly mail with the percentage of VMs whose backups succeeded. This is what a normal daily mail looks like:
*** VMs not backed up last night ***
Machine1
Machine2
Machine3
Machine4
Machine5
Machine6
Machine7
Machine8
Machine9
Machine10
Machine11
Machine12
Machine13
Machine14
Machine15
Machine16
Machine17
Machine18
Machine19
Machine20
Machine21
Machine22
Machine23
Machine24
Machine25
Machine26
Machine27
Machine28
Machine29
Machine30
Machine31
Machine32
Machine33
Machine34
*** Backup success rate for production KPIs ***
Daily success rate = 94.28%
Total VMs = 594
Daily unbacked up VMs = 34
The mail system works perfectly; I just want a weekly version of it.
(I gave the VMs generic names)
This is what I tried so far:
$success_rate_weekly = 100 - (($text[1] += $text[2] += $text[3] += $text[4] += $text[5] += $text[6] += $text[7]) /= 7
get-content "E:\PS\Malik\valeurs.txt" | foreach { -split $_ | select -index 4 } | measure -sum
I found the last one on a French forum, but neither of these two lines worked for me.

(Posted on behalf of the question author)
Thanks to Ansgar Wiechers, who gave me the idea of running two scripts, and to something really dumb I found on the internet, this is what I have:
First, I used Windows Task Scheduler to run my first script daily to collect the daily information. I also used Task Scheduler to run my second script monthly, which calculates the percentage each month before erasing the content of the .txt file.
I used this line:
$success_rate_weekly = get-content "E:\PS\Malik\valeurs.txt" | measure -average | select -expand average
This line calculates the average of the daily rates, i.e. the proportion of VMs whose backups succeeded.
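For completeness, a minimal sketch of what the second (reporting) script could look like, building on the line above; it assumes the daily script appends one percentage value per line to valeurs.txt, and the mail addresses and SMTP server are placeholders:

# Average all daily success rates collected so far
$success_rate_weekly = Get-Content "E:\PS\Malik\valeurs.txt" |
    Measure-Object -Average |
    Select-Object -ExpandProperty Average

$body = "*** Backup success rate for production KPIs ***`r`n" +
        ("Average success rate = {0:N2}%" -f $success_rate_weekly)

# Send the report (placeholder addresses and SMTP server)
Send-MailMessage -From 'backup-report@yourcompany.com' -To 'you@yourcompany.com' `
    -Subject 'Backup success rate report' -Body $body -SmtpServer 'smtp.yourcompany.com'

# Start a fresh collection period
Clear-Content "E:\PS\Malik\valeurs.txt"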

Related

Powershell script check modified date of files & send email if changed

I need to run a scheduled task that triggers at 7 AM daily and searches a folder for any files whose modified date has changed in the last day or 24 hours. I'm stuck on whether what I've done so far is the best way to do this check, and I'm also not sure how to get this to email out a list of the file or files that changed in the last 24 hours. I don't think FileSystemChecker is worth the amount of time it would take to get running, as I've read it can be troublesome. I'm trying to do something that just looks for files with a modified date that's changed; I don't have to look for deleted files or added files, or email the whole folder. If nothing has changed, then I need to send the email to a different group of folks than I do if there are files that changed. I'm stuck on how to do the email part. The other part I'm stuck on is getting this to accept a UNC path so I can run the task from another server.
Get-Item C:\folder1\folder2\*.* | Foreach { $LastUpdateTime=$_.LastWriteTime $TimeNow=get-date if (($TimeNow - $LastUpdateTime).totalhours -le 24) { Write-Host "These files were modified in the last 24 hours "$_.Name } else { Write-Host "There were no files modified in the last 24 hours" } }
First of all, do not try to cram all the code into a single line. If you do, the code gets unreadable and mistakes are easily made, but hard to spot.
What I would do is something like this:
$uncPath = '\\Server1\SharedFolder\RestOfPath'  # enter the UNC path here
$yesterday = (Get-Date).AddDays(-1).Date        # set at midnight for yesterday
# get an array of full filenames for any file that was last updated in the last 24 hours
$files = (Get-ChildItem -Path $uncPath -Filter '*.*' -File |
          Where-Object { $_.LastWriteTime -ge $yesterday }).FullName
if ($files) {
    $message = 'These files were modified in the last 24 hours:{0}{1}' -f [Environment]::NewLine, ($files -join [Environment]::NewLine)
    $emailTo = 'folksthatwanttoknowaboutmodifications@yourcompany.com'
}
else {
    $message = 'There were no files modified in the last 24 hours'
    $emailTo = 'folksthatwanttoknowifnothingismodified@yourcompany.com'
}
# output on screen
Write-Host $message
# create a hashtable with parameters for Send-MailMessage
$mailParams = @{
    From       = 'you@yourcompany.com'
    To         = $emailTo
    Subject    = 'Something Wrong'
    Body       = $message
    SmtpServer = 'smtp.yourcompany.com'
    # any other parameters you might want to use
}
# send the email
Send-MailMessage @mailParams
Hope that helps
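Since the question also mentions triggering this at 7 AM daily from a scheduled task, here is a minimal sketch of registering such a task with the ScheduledTasks module; the script path and task name are placeholders, and this assumes a recent Windows version where that module is available:

# Run the check script every day at 7:00 AM (placeholder path and task name)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Check-ModifiedFiles.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 7am
Register-ScheduledTask -TaskName 'Check modified files' -Action $action -Trigger $trigger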

Which part of this Powershell code snippet is making it take a long time to run?

I'm tasked with making a report of the last logon time for each user in our AD environment. I obviously first asked mother Google for something that I could repurpose, but I couldn't find anything that would check multiple domain controllers, reconcile the most recent logon, and then report whether it was older than an arbitrarily set date/number of days.
Here's the code:
foreach ($user in $usernames) {
    $percentCmpUser = [math]::Truncate(($usernames.IndexOf($user)/$usernames.count)*100)
    Write-Progress -Id 3 -Activity "Finding Inactive Accounts" -Status "$($percentCmpUser)% Complete:" -PercentComplete $percentCmpUser
    $allLogons = $AllUsers | Where-Object {$_.SamAccountName -match $user}
    $finalLogon = $allLogons | Sort-Object LastLogon -Descending | Select-Object -First 1
    if ($finalLogon.LastLogon -lt $time.ToFileTime()) {
        $inactiveAccounts += $finalLogon
    }
}
$usernames is a list of about 6000 usernames.
$AllUsers is a list of 18000 users; it includes 10 different properties that I'd like to have access to in my final report. The way I got it was by hitting three of our 20 or so DCs for all users in the specific OUs that I'm concerned with. The final script will actually be 6k * 20, because I do need to hit every DC to make sure I don't miss any user's logon.
Here's how $time is calculated:
$DaysInactive = 60
$todayDate = Get-Date
$time = ($todayDate).Adddays(-($DaysInactive))
Each variable is used elsewhere in the script, which is why I break it out like that.
Before you suggest LastLogonTimestamp, I was told it's not current enough and when I asked about changing the replication time to be more current I was told "no, not gonna happen".
Search-ADAccount also doesn't seem to offer an accurate view of inactive users.
I'm open to all suggestions about how to make this specific snippet run faster or on how to use a different methodology to achieve the same result in a fast time.
As of now hitting each DC for all users in specific OUs takes about 10-20sec per DC and then the above snippet takes 30-40 min.
A couple of things stand out, but likely the biggest performance killers here are these two statements:
$percentCmpUser = [math]::Truncate(($usernames.IndexOf($user)/$usernames.count)*100)
# and
$allLogons = $AllUsers | Where-Object {$_.SamAccountName -match $user}
... both of these statements will exhibit O(N^2) (or quadratic) performance characteristics - that is, every time you double the input size, the time taken quadruples!
Array.IndexOf() is effectively a loop
Let's look at the first one:
$percentCmpUser = [math]::Truncate(($usernames.IndexOf($user)/$usernames.count)*100)
It might not be self-evident, but this method call, $usernames.IndexOf(), might require iterating through the entire list of $usernames every time it executes - by the time you reach the last $user, it needs to go through and compare $user against all 6000 items.
Two ways you can address this:
Use a regular for loop:
for ($i = 0; $i -lt $usernames.Count; $i++) {
    $user = $usernames[$i]
    $percent = ($i / $usernames.Count) * 100
    # ...
}
Stop outputting progress altogether
Write-Progress is really slow - even if the caller suppresses Progress output (eg. $ProgressPreference = 'SilentlyContinue'), using the progress stream still carries overhead, especially when called in every loop iteration.
Removing Write-Progress altogether would remove the requirement for calculating percentage :)
If you still need to output progress information you can shave off some overhead by only calling Write-Progress sometimes - for example once every 100 iterations:
for ($i = 0; $i -lt $usernames.Count; $i++) {
    $user = $usernames[$i]
    if ($i % 100 -eq 0) {
        $percent = ($i / $usernames.Count) * 100
        Write-Progress -Id 3 -Activity "Finding Inactive Accounts" -PercentComplete $percent
    }
    # ...
}
... |Where-Object is also just a loop
Now for the second one:
$allLogons = $AllUsers | Where-Object {$_.SamAccountName -match $user}
... 6000 times, PowerShell has to enumerate all 18000 objects in $AllUsers and test each of them against the Where-Object filter.
Instead of using an array and Where-Object, consider loading all users into a hashtable:
# Only need to run this once, before the loop
$AllLogonsTable = @{}
$AllUsers | ForEach-Object {
    # Check if the hashtable already contains an item associated with the user name
    if (-not $AllLogonsTable.ContainsKey($_.SamAccountName)) {
        # Before adding the first item, create an array we can add subsequent items to
        $AllLogonsTable[$_.SamAccountName] = @()
    }
    # Add the item to the array associated with the username
    $AllLogonsTable[$_.SamAccountName] += $_
}

foreach ($user in $usernames) {
    # This will be _much faster_ than $AllUsers | Where-Object ...
    $allLogons = $AllLogonsTable[$user]
}
Hashtables have crazy-fast lookups - finding an object by key is much faster than using Where-Object on an array.
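As a side note, the same lookup table can also be built in one pass with Group-Object; this is just an alternative sketch, assuming SamAccountName is a plain string property, and each key maps to the collection of matching user objects exactly as in the manual version above:

# Build the SamAccountName -> collection-of-user-objects lookup in one statement
$AllLogonsTable = $AllUsers | Group-Object -Property SamAccountName -AsHashTable -AsString

# Lookup works the same way ('jdoe' is a placeholder account name)
$allLogons = $AllLogonsTable['jdoe']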

Pull MessageTrace data from O365 RestUri

I'm trying to record Message delivery/failure information in O365. I have over 250K mailboxes and the message I'm trying to trace is a global email sent to a root DL with lots of nested DLs.
I'm trying the code below.
$Root = "https://reports.office365.com/ecp/reportingwebservice/reporting.svc/"
$Format = "`$format=JSON"
$WebService = "MessageTrace"
$Select = "`$select=RecipientAddress,Status"
$Filter = "`$filter=MessageId eq 'xxxxxxxxxxxxxxxxxxx@xxxx.xxx.xxxx.OUTLOOK.COM' and Status eq 'Failed'"

# Build report URL
$url = ($Root + $WebService + "/?" + $Select + "&" + $Filter + "&" + $Format)

$sens = $null
Do {
    $sens = Invoke-RestMethod -Credential $cred -uri $url
    $sens.d.results.Count
    $sens.d.results | select -Last 1 -ExpandProperty RecipientAddress | ft -Wrap
    if ($sens.d.__next) {
        $url = ($sens.d.__next + "&" + $Format)
    }
} While ($sens.d.__next -ne $null)
For a sample message trace in my test domain, I should have:
19 Delivered events
14 Expanded events
5001 Failed events
I hit a problem with the page size, as the default limit is 2000 entries per page. Filtering on Delivered & Expanded gives me accurate results, as it completes on the first iteration.
But the Failed events, which have to be broken into 3 pages, are not fetched correctly.
If my understanding is correct, I should see 2 iterations with 2000 entries each, with __next containing $skiptoken=1999 and 3999 respectively, and a last iteration with 1001 entries and no __next. But I keep getting __next even after the 3rd iteration, and the $skiptoken keeps increasing into the 10000s.
It seems to be looping over the same results.
2000
sens2500@contoso.local
2000
sens939@contoso.local
2000
sens1183@contoso.local
2000
sens214@contoso.local
2000
sens1183@contoso.local
2000
sens1423@contoso.local
I can also see that the result entries are not unique between the 1st, 2nd, and 3rd iterations. It seems like it just pulls a random 2000 entries on each attempt.
I tried to do an [IN] with $OrderBy=RecipientAddress to see if that made any difference, but I wasn't able to make it work successfully.
Any help on this?
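A small diagnostic sketch that may help confirm the duplicate-page behaviour described above: collect every page into one list (with a hard cap, since __next never seems to stop) and compare the total row count against the number of unique recipients. Variable names reuse those from the snippet above; this only measures the problem, it does not fix the paging:

$allResults = [System.Collections.Generic.List[object]]::new()
$page = 0
Do {
    $sens = Invoke-RestMethod -Credential $cred -uri $url
    $allResults.AddRange(@($sens.d.results))
    if ($sens.d.__next) {
        $url = ($sens.d.__next + "&" + $Format)
    }
    $page++
} While ($sens.d.__next -ne $null -and $page -lt 5)   # cap at 5 pages for the test

# If pages overlap, the unique count will be well below the total row count
$total  = $allResults.Count
$unique = ($allResults.RecipientAddress | Sort-Object -Unique).Count
"Rows fetched: $total; unique recipients: $unique"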

Efficient way to find and replace many strings in a large text file

The text file contains software output from a time-domain analysis: a 10,800-second simulation with 50 nodes being considered. We have 540,000 strings to be replaced in a 540 MB text file with 4.5 million lines.
The run is currently projected to take more than 4 days. Something is going wrong, and I don't know what. Please suggest a more efficient approach.
Below is the function which does the find and replace.
To replace the strings, the script goes through the original text file line by line and, at the same time, generates a duplicate file with the replaced strings. So another 540 MB file with 4.5 million lines will have been generated by the end of the script.
Function ReplaceStringsInTextFile
{
    $OutputfilebyLine = New-Object -TypeName System.IO.StreamReader $inputFilePathFull
    $uPreviousValue = 0
    $time = 60
    $u = 0; $LastStringWithoutFindResult = 0
    $lineNumber = 0
    while ($null -ne ($line = $OutputfilebyLine.ReadLine())) {
        $lineNumber = $lineNumber + 1
        if ($time -le $SimulationTimeSeconds) # time simulation start and end checks
        {
            # 10800 strings corresponds to one node
            # there are 50 nodes.. Thus 540,000 values
            # $StringsToFindFileContent contains strings to find 540,000 strings
            # $StringsToReplaceFileContent contains strings to replace 540,000 strings
            $StringToFindLineSplit = -split $StringsToFindFileContent[$time-60]
            $StringToReplaceLineSplit = -split $StringsToReplaceFileContent[$time-60]
            if ($u -le $NumberofNodes-1)
            {
                $theNode = $Nodes_Ar[$u]
                $StringToFindvalue = $StringToFindLineSplit[$u]
                $StringToReplacevalue = $StringToReplaceLineSplit[$u]
                if (($line -match $theNode) -And ($line -match $StringToFindvalue)) {
                    $replacedLine = $line.replace($StringToFindvalue,$StringToReplacevalue)
                    Add-Content -Path $WriteOutputfilePathFull -Value "$replacedLine"
                    $uPreviousValue = $u
                    $checkLineMatched = 1
                    if (($line -match $LastNodeInArray)) {
                        $time = $time + 1
                        $LastStringWithoutFindResult = 0
                    }
                } elseif (($line -match $LastNodeInArray) -And ($checkLineMatched -eq 0)) {
                    $LastStringWithoutFindResult = $LastStringWithoutFindResult + 1
                } else {
                    # Printing lines without match
                    Add-Content -Path $WriteOutputfilePathFull -Value "$line"
                    $checkLineMatched = 0
                }
            }
            if ($checkLineMatched -eq 1) {
                # incrementing the value of node index to next one in case the last node is found
                $u = $uPreviousValue + 1
                if ($u -eq $Nodes_Ar.count) {
                    $u = 0
                    $timeElapsed = (Get-Date -DisplayHint Time) - $startTime
                    "$($timeElapsed.Hours) Hours $($timeElapsed.Minutes) Minutes $($timeElapsed.Seconds) Seconds"
                }
            }
        }
        # Checking if the search has failed for more than three cycles
        if ($LastStringWithoutFindResult -ge 5) { # showing error dialog in case of search error
            [System.Windows.Forms.MessageBox]::Show("StringToFind Search Fail. Please correct StringToFind values. Aborting now", "Status", 0)
            $OutputfilebyLine.close()
        }
    }
    $OutputfilebyLine.close()
}
The above function is the last part of the script, and it is the part taking the most time.
A year ago I ran the script in under 10 hours.
Update: the script suddenly sped up after running for 4 hours, and the projected time to complete dropped from 4 days to under 3 hours. The script finished in 7 hours and 9 minutes. However, I am not sure what caused the sudden change in speed, other than asking the question on Stack Overflow :)
As per the suggestion by https://stackoverflow.com/users/478656/tessellatingheckler, I have avoided writing one line at a time using
add-content -path $WriteOutputfilePathFull -value "$replacedLine"
Instead, I am now collecting ten thousand lines at a time:
$tenThousandLines = $tenThousandLines + "`n" + $replacedLine
and, at the appropriate time, I use Add-Content to write the 10,000 lines in one go, like below. The if block follows my method's logic:
if ($lineNumber/10000 -gt $tenThousandCounter) {
    Clear-Host
    Add-Content -Path $WriteOffpipeOutputfilePathFull -Value "$tenThousandLines"
    $tenThousandLines = ""
    $tenThousandCounter = $tenThousandCounter + 1
}
I encountered a System.OutOfMemoryException when trying to add 15,000 or 25,000 lines at a time. After this change, the time required for the operation dropped from 7 hours to 5 hours, and on another run to 2 hours and 36 minutes.
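For what it's worth, a hedged sketch of an alternative to buffering lines in a string and flushing with Add-Content: writing through a System.IO.StreamWriter keeps one file handle open and buffers internally, so it avoids both the per-call file open/close cost and the ever-growing in-memory string. Variable names reuse those from the function above; the replace logic itself is elided:

# Open the files once; the writer is buffered, so no per-line open/close cost
$reader = New-Object System.IO.StreamReader $inputFilePathFull
$writer = New-Object System.IO.StreamWriter $WriteOutputfilePathFull
try {
    while ($null -ne ($line = $reader.ReadLine())) {
        # ... same find/replace logic as in the function above ...
        $writer.WriteLine($line)   # or $replacedLine when a match was found
    }
}
finally {
    $reader.Close()
    $writer.Close()   # flushes the writer's internal buffer
}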

How can you continue executing a PowerShell script, only if it has been called 3 times in 1 minute?

I have a script that is being called via a Windows Scheduled Task, and that task is triggered based on a certain Windows Application Event. It is only critical to execute the script, though, if the event occurs 3 or more times in 1 minute; if the event occurs once a minute, no action should be taken.
I know this can be handled in the script itself. Let's say there are at least 2 new variables I will need:
# time window, in seconds
$maxTime = 60
# max number of times this script needs to be called, within $maxTime window,
# before executing the rest of the script
$maxCount = 3
I started outlining an algorithm that uses a temp file for tracking, but thought there might be a simpler solution that someone can show me. Thanks.
You could store your execution times in an environment variable.
Before this script will work, you must create the LastExecutionTimes environment variable.
$maxTime = 60
$maxCount = 3
$now = Get-Date

# Get execution times within the time limit.
$times = @($env:LastExecutionTimes -split ';' |
    Where-Object { $_ -and $now.AddSeconds(-1 * $maxTime) -lt $_ })
$times += '{0:yyyy-MM-dd HH:mm:ss}' -f $now
$env:LastExecutionTimes = $times -join ';'

if ($times.Length -lt $maxCount) { return }

# Reset the execution times
$env:LastExecutionTimes = ''
Write-Host 'Continue Script' -ForegroundColor Yellow
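One caveat, based on the assumption that the scheduled task starts the script as a fresh process each time: assignments to $env: only live for that process, so the list would not survive between runs unless it is also written back to the User (or Machine) scope. A hedged sketch of that variant:

$maxTime = 60

# Read the persisted value from the User scope (empty on the very first run)
$lastRuns = [Environment]::GetEnvironmentVariable('LastExecutionTimes', 'User')

# Keep only the execution times that fall within the time window, then add "now"
$now = Get-Date
$times = @($lastRuns -split ';' |
    Where-Object { $_ -and $now.AddSeconds(-1 * $maxTime) -lt [datetime]$_ })
$times += '{0:yyyy-MM-dd HH:mm:ss}' -f $now

# Persist the updated list so the next scheduled-task invocation can see it
[Environment]::SetEnvironmentVariable('LastExecutionTimes', ($times -join ';'), 'User')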
I would write a text file and a secondary script or function to check it: essentially, it is called each time and writes the call-time information to the text file.
Then something like this:
if (!((Get-Date).AddMinutes(-1) -lt $oldTime))
{
    # more than a minute since the first call: start the count over
    $CurDate = Get-Date
    "$CurDate, 1" | Out-File "TheCheck.txt"
}
else
{
    $counter++
    if ($counter -ge 3) {
        WorkerFunction   # call your main script / worker function here
    }
    else {
        "$oldTime, $counter" | Out-File "TheCheck.txt"
    }
}
It's missing some variables ($oldTime and $counter would be read back from the text file), but overall it should be functional as a supplemental script. What your scheduled task actually does is call this: if the time since $oldTime is over 1 minute, it overwrites the file with the current time and 1 for the $counter variable. If it's less than a minute since the first call, it checks $counter, and if that is 3 or higher (you could also use -eq 3), it calls your main script.