I have a PowerShell script that is modifying multiple files. I would like to verify that they were modified by checking the last write time property and comparing it to the current time minus 30 minutes.
Is there any way to get the average time from multiple different files?
For example:
$Var = Get-ChildItem -Path "C:\users\User\Documents\*.txt"
$lastwt = $Var.LastWriteTime
If ($lastwt -ge (Get-Date).AddMinutes(-30)) {
    # Do stuff
}
The above won't work because multiple dates are returned all around the same time give or take a few milliseconds.
I want to just get the average of all the times and use that as time comparison instead. Any way to do this?
About
You should probably use New-TimeSpan to do time comparisons. Your updated code:
Code
$Files = Get-ChildItem -Path "C:\users\User\Documents\*.txt"
$Files | ? {
    # Filter by a timespan criterion (last write on this file was within the past 30 minutes)
    $Mins = New-TimeSpan $_.LastWriteTime (Get-Date) | % TotalMinutes
    return $Mins -le 30
} | % {
    # Work only on files that matched the criterion above
}
Does that help? If you still want the averaging solution, lmk, I'll add it in :)
To get a LastWriteTime [DateTime] object midway between the oldest and newest of a series of files (the midpoint of the two extremes, rather than a true mean), this may be what you want:
$files = Get-ChildItem -Path 'C:\users\User\Documents' -Filter '*.txt' -File
# get an array of the LastWriteTime properties and sort that
$dates = $files | Select-Object -ExpandProperty LastWriteTime | Sort-Object
$oldest = $dates[0]
$newest = $dates[-1]
# create a new DateTime object that holds the middle of the oldest and newest file time
$average = [datetime]::new([long](($oldest.Ticks + $newest.Ticks) / 2), 'Local')
# show what you've got:
Write-Host "Oldest LastWriteTime: $oldest"
Write-Host "Average LastWriteTime: $average" -ForegroundColor Green
Write-Host "Newest LastWriteTime: $newest"
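If you want a true mean of all the timestamps rather than the midpoint of the two extremes, you can average the Ticks of every LastWriteTime instead; a sketch along the same lines (same example folder as above):

```powershell
$files = Get-ChildItem -Path 'C:\users\User\Documents' -Filter '*.txt' -File
# Average the Ticks (100-nanosecond units) of every LastWriteTime
$avgTicks = ($files.LastWriteTime | Measure-Object -Property Ticks -Average).Average
# Rebuild a local DateTime from the averaged tick count
$average = [datetime]::new([long]$avgTicks, 'Local')
# Use it for the comparison from the question
if ($average -ge (Get-Date).AddMinutes(-30)) {
    # Do stuff
}
```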
I am creating a backup and restore tool with a PowerShell script, and I am trying to make it so that when restoring, the script picks the last folder created and restores from that directory structure. Basically I am having the script start by making a backup directory with a date/time stamp, like so:
$CurrentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$CurrentDomainName = $CurrentUser.split("\")[0]
$CurrentUserName = $CurrentUser.split("\")[1]
$folderdate = Get-Date -f MMddyyyy_Hm
$homedir = Get-ADUser $CurrentUserName -prop HomeDirectory | select -ExpandProperty HomeDirectory
New-Item $homedir -Name "TXBackup\$folderdate" -ItemType Directory
$cbookmarks = "$env:userprofile\Appdata\Local\Google\Chrome\User Data\Default\Bookmarks"
md $homedir\TXBackup\$folderdate\Chrome
Copy-Item $cbookmarks "$homedir\TXBackup\$folderdate\Chrome" -Recurse
Backup Folder Structure
Basically, every time someone runs the backup tool it will create a subfolder under the Backup directory with the date/time name to track the latest one. The problem comes when I want to restore from the last one created: I can no longer use a $folderdate variable, since it will pull whatever the time is while the tool is being run. Here is the code without taking into account what the last folder is. I tried using sort, but that doesn't appear to give me a clear path to select the last one created, or I'm just such a noob I didn't use it right :(
##Restoring Files from Backup
$CurrentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$CurrentDomainName = $CurrentUser.split("\")[0]
$CurrentUserName = $CurrentUser.split("\")[1]
$homedir = get-aduser $CurrentUserName -prop HomeDirectory | select -ExpandProperty HomeDirectory
##Restoring Chrome Bookmarks
Sort-Object -Property LastWriteTime |
Select-Object -Last 1
$rbookmarks = "$homedir\TXBackup\$folderdate\Chrome\Bookmarks"
Copy-Item $rbookmarks "C:\Test\"
I know I didn't use that correctly but any direction would be awesome for this newbie :)
You can use Sort-Object with a script block and use [DateTime] methods to parse the date from the folder name, using the same format string you used to create them.
# Sort directory names descending
Get-ChildItem -Directory | Sort-Object -Desc {
# Try to parse the long format first
$dt = [DateTime]::new( 0 )
if( [DateTime]::TryParseExact( $_.Name, 'MMddyyyy_HHmm', [CultureInfo]::InvariantCulture, [Globalization.DateTimeStyles]::none, [ref] $dt ) ) {
return $dt
}
# Fallback to short format
[DateTime]::ParseExact( $_.Name, 'MMddyyyy', [CultureInfo]::InvariantCulture )
} | Select-Object -First 1 | ForEach-Object Name
Note: I've changed the time format from Hm to HHmm, because Hm would cause a parsing ambiguity, e.g. 01:46 would be formatted as 146, but parsed as 14:06.
Also I would move the year to the beginning, e.g. 20220821_1406, so you could simply sort by name without having to use a script block. But that is not a problem, just an (in)convenience, and you might have a reason to put the year after the day.
Given these folders:
08212022
08222022
08222022_1406
08222022_1322
08222022_1324
08222022_1325
08222022_1343
The code above produces the following output:
08222022_1406
To confirm the ordering is correct, I've removed the Select-Object call:
08222022_1406
08222022_1343
08222022_1325
08222022_1324
08222022_1322
08222022
08212022
Note that the ordering is descending (-Desc), so Select-Object -First 1 selects the latest folder.
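As an aside, here is what the year-first naming suggested above could look like; with it, lexical order equals chronological order, so no script block is needed (a sketch; folder names are hypothetical):

```powershell
# Create the backup folder with a sortable year-first stamp, e.g. 20220821_1406
$folderdate = Get-Date -Format 'yyyyMMdd_HHmm'

# Because names now sort chronologically, picking the latest is a plain name sort
Get-ChildItem -Directory |
    Sort-Object -Descending Name |
    Select-Object -First 1 |
    ForEach-Object Name
```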
I want to save to "C:\logFiles" on my computer the entries for a specific date range from a log file generated by a program on another PC.
The path I will get the log file from is "C:\Sut\Stat\03-2021.log".
Example: the file "C:\Sut\Stat\03-2021.Sutwin.log" contains all the logs for the month of March, but I just want to get the logs of the last 7 days, from 19-03-2021 to 26-03-2021.
I found this script on the internet, but it doesn't work for me; I need some help:
An example of the .log file is in the attached photo. The relevant details:
my PC name: c01234
name of the PC containing the log file: c06789
file I will get the info from: 03-2021.Sutwin.log (exists on PC c06789)
I want to transfer the contents of just the last 7 days into a folder on my PC c01234 named Week11_LogFile
$log = "2015-05-09T06:39:34 Some information here
2015-05-09T06:40:34 Some information here
" -split "`n" | Where {$_.trim()}
#using max and min value for the example so all correct dates will comply
$upperLimit = [datetime]::MaxValue #replace with your own date
$lowerLimit = [datetime]::MinValue #replace with your own date
$log | foreach {
$dateAsText = ($_ -split '\s',2)[0]
try
{
$date = [datetime]::Parse($dateAsText)
if (($lowerLimit -lt $date) -and ($date -lt $upperLimit))
{
$_ #output the current item because it belongs to the requested time frame
}
}
catch [FormatException]
{
#date is malformed (maybe the line is empty or there is a typo), skip it
}
}
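For the 19-03-2021 to 26-03-2021 window from the question, the placeholder limits might be filled in like this (a sketch; note that the bounds in the filter above are exclusive):

```powershell
# Lower bound: start of 19 March 2021 (PowerShell [datetime] casts use the invariant culture)
$lowerLimit = [datetime]'2021-03-19'
# Upper bound: start of 27 March 2021, so the whole of the 26th is included
$upperLimit = $lowerLimit.AddDays(8)
```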
Based on your images, your log files look like simple tab-delimited files.
Assuming that's the case, this should work:
# Import the data as a tab-delimited file and add a DateTime column with a parsed value
$LogData = Import-Csv $Log -Delimiter "`t" |
    Select-Object -Property *, @{n='DateTime';e={[datetime]::ParseExact($_.Date + $_.Time, 'dd. MMM yyHH:mm:ss', $null)}}
# Filter the data, drop the DateTime column, and write the output to a new tab-delimited file
$LogData | Where-Object { ($lowerLimit -lt $_.DateTime) -and ($_.DateTime -lt $upperLimit) } |
Select-Object -Property * -ExcludeProperty DateTime |
Export-Csv $OutputFile -Delimiter "`t"
The primary drawback here is that on Windows PowerShell (v5.1 and below) you can only export the data quoted. On PowerShell 7 and higher you can use -UseQuotes Never to prevent the fields from being double-quoted, if that's important.
The only other drawback is that if these log files are huge then it will take a long time to import and process them. You may be able to improve performance by making the above a one-liner like so:
Import-Csv $Log -Delimiter "`t" |
    Select-Object -Property *, @{n='DateTime';e={[datetime]::ParseExact($_.Date + $_.Time, 'dd. MMM yyHH:mm:ss', $null)}} |
    Where-Object { ($lowerLimit -lt $_.DateTime) -and ($_.DateTime -lt $upperLimit) } |
    Select-Object -Property * -ExcludeProperty DateTime |
Export-Csv $OutputFile -Delimiter "`t"
But if the log files are extremely large then you may run into unavoidable performance problems.
It's a shame your example of a line in the log file does not reveal the exact date format.
2015-05-09 could be yyyy-MM-dd or yyyy-dd-MM, so I'm guessing it's yyyy-MM-dd in the code below.
# this is the UNC path where the log file is to be found
# you need permissions of course to read that file from the remote computer
$remotePath = '\\c06789\C$\Sut\Stat\03-2021.log' # or use the computers IP address instead of its name
$localPath = 'C:\logFiles\Week11_LogFile.log' # the output file
# set the start date for the week you are interested in
$startDate = Get-Date -Year 2021 -Month 3 -Day 19
# build an array of formatted dates for an entire week
$dates = for ($i = 0; $i -lt 7; $i++) { '{0:yyyy-MM-dd}' -f $startDate.AddDays($i) }
# create a regex string from that using an anchor '^' and the dates joined with regex OR '|'
$regex = '^({0})' -f ($dates -join '|')
# read the log file and select all lines starting with any of the dates in the regex
((Get-Content -Path $remotePath) | Select-String -Pattern $regex).Line | Set-Content -Path $localPath
My objective is to write a script that examines log files for the duration of an event, calculates the duration based on log entries (start/finish), and then calculates the average of those durations over the last 24 hours and determines whether it is greater than a certain value (let's use 2 hours for an example). So far, I have the first two portions completed, it is examining the logs properly and calculating the duration for each applicable log. I just don't know where to begin with the last step, the averaging of the durations from all of the logs. Below is my code thus far.
$imagesuccess = Get-ChildItem '\\server\osd_logs\success' -Directory |
Where-Object {
($_.Name -like "P0*") -or (($_.Name -like "MININT*") -and
(Test-Path "$($_.FullName)\SCCM_C\Logs\SMSTSLog\Get-PSPName.log")) -and
($_.LastWriteTime -gt (Get-Date).AddHours(-24))
}
$sccmlogpaths = "\\s0319p60\osd_logs\success\$($imagesuccess)\SCCM_C\Logs\SMSTSLog\smsts.log"
foreach ($sccmlogpath in $sccmlogpaths) {
$imagestartline = Select-String -Pattern "<![LOG[New time:" -Path $sccmlogpath -SimpleMatch
$imagestarttime = $imagestartline.ToString().Substring(75, 8)
$imagefinishline = Select-String -Pattern "<![LOG[ Directory: M:\$($imagesuccess)" -Path $sccmlogpath -SimpleMatch
$imagefinishtime = $imagefinishline.ToString().Substring(71, 8)
$imageduration = New-TimeSpan $imagestarttime $imagefinishtime
$imagedurationstring = $imageduration.ToString()
}
Roughly you'd do this:
$durations = foreach ($sccmlogpath in $sccmlogpaths) {
# [snip]
$imageduration = New-TimeSpan $imagestarttime $imagefinishtime
$imageduration # the 'output' of the foreach () {}
}
# $durations is now an array of timespans
$measurements = $durations | Measure-Object -Average -Property TotalHours
$averageHours = $measurements.Average
if (2.5 -lt $averageHours) {
# code here
}
This does sum(n)/count(n) averaging.
NB: if you are querying the last 24 hours, New-TimeSpan won't work nicely if any of the durations cross midnight; since the start and finish values are time-of-day strings, it will see 23:01 -> 00:01 as -23 hours rather than 1 hour.
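One way to guard against that, assuming the start and finish values are time-of-day strings as in the question: if the span comes out negative, the finish rolled past midnight, so shift it forward a day. A sketch with hypothetical values:

```powershell
$imagestarttime  = '23:01'   # hypothetical values for illustration
$imagefinishtime = '00:01'

$imageduration = New-TimeSpan $imagestarttime $imagefinishtime
if ($imageduration -lt [TimeSpan]::Zero) {
    # Finish crossed midnight; move it one day forward
    $imageduration = New-TimeSpan $imagestarttime ([datetime]$imagefinishtime).AddDays(1)
}
# $imageduration is now 01:00:00
```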
I want to change the creation date on a various folders/files recursively. I have managed to get a simple powershell script to do this. However, the log file that is created only says true on several lines, depending on how many changes where made. What I would like is for the log file to list the file path and name of the file that was actually changed.
Below is the simple script I have that does the change but no details log file:
Get-ChildItem -recurse G:\ | % {$_.CreationTime = '10/10/2014 15:00'} | Out-File "c:\pslog.txt"
Please help as I am very new to powershell so the simpler the code the better.
Regards,
Mark
When you execute $_.CreationTime = '10/10/2014 15:00', the status of the operation is returned, so a bunch of Trues just means that the new CreationTime assignment succeeded.
To get the file path, hide the assignment and drop $_.FullName into the pipeline:
Get-ChildItem -recurse G:\ | % {($_.CreationTime = '10/10/2014 15:00')|Out-Null; $_.FullName } | Out-File "C:\pslog.txt"
However, it might be useful to get the FileName along with the result, so that you can assess whether some of the files failed the assignment
You say you'd like some simple code, but compact one-liners do not equal "simple", and you might find that you learn a lot more from verbose code.
Let's split it into multiple statements for a better overview:
# Get the files
$gFiles = Get-ChildItem -recurse G:\
# Loop through them all
$gFiles | ForEach-Object {
# Set the creation date without returning any output
($_.CreationTime = '10/10/2014 15:00') |Out-Null
# Test if the previous operation was successful:
if($?)
{
# Success, create an object containing the Path and status
New-Object PSObject -Property @{
"FilePath" = $_.FullName
"FileSize" = $_.Length
"Result" = "Success"
}
}
else
{
# Failure, create an object containing the Path and status
New-Object PSObject -Property @{
"FilePath" = $_.FullName
"FileSize" = $_.Length
"Result" = "Failed"
}
}
# Export the objects containing the result to a .CSV file
} |Export-Csv -LiteralPath "C:\pslog.csv" -Delimiter ";" -NoTypeInformation -Force
Now, C:\pslog.csv contains two semicolon-separated columns with appropriate headers for "Result" and "FilePath"
The FileSize property will be in number of bytes, but you could change it to KB or MB with:
"FileSize" = $_.Length / 1KB
"FileSize" = $_.Length / 1MB
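Since dividing by 1KB or 1MB gives a fraction, [math]::Round can tidy the value; a small sketch (not part of the original answer):

```powershell
Get-ChildItem -Recurse G:\ -File | ForEach-Object {
    New-Object PSObject -Property @{
        "FilePath"   = $_.FullName
        # Size in KB, rounded to two decimal places
        "FileSizeKB" = [math]::Round($_.Length / 1KB, 2)
    }
}
```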
I am parsing through a directory with multiple sub-directories and want to compare the LastAccessed time with the Get-Date time to see if the file has been accessed since yesterday; based on that I will either delete the file or leave it alone. I have tried piping the Get-Date results out to a text file and pulling them back as a string, I have tried wildcards, and I have even gone as far as using -like as opposed to -eq to get the comparison to work, but it is not properly comparing the data. Any help would be greatly appreciated.
Here is my current code:
$servers="servera","serverb"
$date3=get-date -Format d
foreach($a in $servers){
$CTXGPDir="\C$\ProgramData\Citrix\GroupPolicy"
$CTXGPDirFP="\\"+"$a"+"$CTXGPDir"
$CTXGPUserDirstoRM=Get-ChildItem "$CTXGPDirFP"|where-Object{$_.Name -notlike "*.gpf"}
foreach($i in $CTXGPUserDirstoRM){
$datestring="$date3"+" *"
$CTXUserGPPath="\C$\ProgramData\Citrix\GroupPolicy\$i"
$CTXUserGPFP="\\"+"$a"+"$CTXUserGPPath"
$file=get-item $CTXUserGPFP
$isFileInactive=$file|select-object -expandproperty LastAccessTime
write-host $file
write-host $isFileInactive
write-host $datestring
if($isFileInactive -like "$datestring *"){write-host "$CTXUserGPFP on $a has lastwritetime of $isFileInactive and should NOT BE deleted"}
if($isFileInactive -notlike "$datestring *"){write-host "$CTXUserGPFP on $a has lastwritetime of $isFileInactive and SHOULD BE deleted"}
}
}
Your date comparison is deeply flawed.
get-date -format d returns a String representing the current date based on your regional settings.
get-childitem <file> | select -expandproperty lastaccesstime returns a DateTime object, which gets formatted as a "long" date/time using your regional settings.
To compare these two dates effectively, you need to convert the latter to the same format.
$isFileInactive=($file|select-object -expandproperty LastAccessTime).ToShortDateString()
$isFileInactive is now a String formatted the same as you get with get-date -format d and you can make a proper comparison.
if($isFileInactive -eq $datestring){write-host "$CTXUserGPFP on $a has lastwritetime of $isFileInactive and should NOT BE deleted"}
If you have to deal with timezones, you may want to amend it to add .ToLocalTime() before ToShortDateString().
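Alternatively, you can avoid string formatting altogether and compare [DateTime] values directly; a sketch using the question's variable names:

```powershell
# Compare calendar dates as DateTime values instead of formatted strings
$lastAccess = ($file | Select-Object -ExpandProperty LastAccessTime).Date
$today      = (Get-Date).Date

if ($lastAccess -eq $today) {
    Write-Host "$CTXUserGPFP on $a has a last access of $lastAccess and should NOT BE deleted"
} else {
    Write-Host "$CTXUserGPFP on $a has a last access of $lastAccess and SHOULD BE deleted"
}
```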