I deploy custom code to thousands of computers and have been able to get the return code to function correctly for one or two objects in the tool I have to use to push out code. Now I am looking for a way to set up a file validator: because the Dev folks don't consistently use version numbering, I have been using the code below to check the date stamp of each object.
Code:
$foo1= Get-ChildItem "C:\path\file1.exe" | Where-Object {$_.LastWriteTime -gt "11/1/2013"} | Out-String
$foo2= Get-ChildItem "C:\path\file2.exe" | Where-Object {$_.LastWriteTime -gt "9/10/2013"} | Out-String
$foo3= Get-ChildItem "C:\path\file3.exe" | Where-Object {$_.LastWriteTime -gt "4/23/2013"} | Out-String
$foo4= Get-ChildItem "C:\path\file4.exe" | Where-Object {$_.LastWriteTime -gt "12/17/2012"} | Out-String
The above works but will show the object name and the last write time. I can write the exit code with this code:
if ($foo1) {
    Write-Host '0'
} else {
    Write-Host '5'
    Exit 5
}
Is there a way to treat $foo1 as a 0 when it exists (i.e. is not $null) and as a 1 when it is null, then compute $foochecksum = $foo1 + $foo2 + $foo3 + $foo4 and run the if/else above just once to write the exit code to my deployment tool?
Functionally what I am looking for is a way of checking multiple file date / time stamps and then if all are good passing one 0 to the If/Else statement that will write a pass or fail to my deployment tool.
I could use multiple if/elses if need be, but I will need to check something like 40 files and would rather not have 40 different if/else statements.
I would also love something that works in PS v2 and v3, as I have a mix of 2003 and 2008 servers in prod.
Thanks,
Dwight
Use a variable to hold the "error state" of your script, and a HashTable to hold the Path and LastWriteTime values for each file that you are "testing."
$ErrorExists = $false;
# Declare some file/lastwritetime pairs
$FileList = @{
    1 = @{ Path = 'C:\path\file1.exe';
           LastWriteTime = '11/1/2013'; };
    2 = @{ Path = 'C:\path\file2.exe';
           LastWriteTime = '9/10/2013'; };
    3 = @{ Path = 'C:\path\file3.exe';
           LastWriteTime = '4/23/2013'; };
    4 = @{ Path = 'C:\path\file4.exe';
           LastWriteTime = '12/17/2012'; };
};
# Iterate the hashtable's values (foreach over the hashtable itself yields a single item)
foreach ($File in $FileList.Values) {
    # If LastWriteTime is LESS than the value specified, raise an error
    if ((Get-Item -Path $File.Path).LastWriteTime -lt $File.LastWriteTime) {
        $ErrorExists = $true;
    }
}
if ($ErrorExists) {
    # Do something
}
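For the asker's deployment scenario, the "do something" placeholder could simply write the exit code, mirroring the 0/5 convention from the question (a minimal sketch):

if ($ErrorExists) {
    Write-Host '5'
    exit 5
} else {
    Write-Host '0'
    exit 0
}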
Maybe something like this?
$foos = &{
Get-ChildItem "C:\path\file1.exe" | Where-Object {$_.LastWriteTime -gt "11/1/2013"} | select -last 1
Get-ChildItem "C:\path\file2.exe" | Where-Object {$_.LastWriteTime -gt "9/10/2013"} | select -last 1
Get-ChildItem "C:\path\file3.exe" | Where-Object {$_.LastWriteTime -gt "4/23/2013"} | select -last 1
Get-ChildItem "C:\path\file4.exe" | Where-Object {$_.LastWriteTime -gt "12/17/2012"} | select -last 1
}
if ($foos.count -eq 4) {Write-Host '0'}
else {Write-Host '5'; exit 5}
Related
I have a folder which holds backups of SQL databases, with the backup date in the name, e.g. C:\Backup.
Example of backup files:
archive_1_01022022.bak
archive_1_02022022.bak
archive_1_03022022.bak
archive_2_01022022.bak
archive_2_02022022.bak
archive_2_03022022.bak
archive_3_01022022.bak
archive_3_02022022.bak
archive_3_03022022.bak
I need a PowerShell script that removes all files from this directory but keeps the recent ones (e.g. from the last 5 days); at the same time, I need to keep at least 3 copies of each database (in case no backups have been made for more than the last 5 days).
The script below removes all files but keeps the recent ones from the last 5 days:
$Folder = "C:\Backup"
$CurrentDate = Get-Date
$DateDel = $CurrentDate.AddDays(-5)
Get-ChildItem $Folder | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item
The above is working fine, but if there are no recent backups for the last 10 days and I run the code above, it will remove all files in C:\Backup. For such cases I need to keep at least 3 backup files of each database.
If I use the code below (say I have 9 different databases), then it does the job:
$Folder = "C:\Backup"
Get-ChildItem $Folder | ? { -not $_.PSIsContainer } |
Sort-Object -Property LastWriteTime -Descending |
Select-Object -Skip 27 |
Remove-Item -Force
But the implementation is awkward. For example, if I have backups of 9 databases, then I need to give "Select-Object -Skip" the value 27 (9 databases x 3 files kept for each). If I have more or fewer databases, I need to adjust this number each time. How can I make "Select-Object -Skip 3" a static value?
In that case, you need to test how many files in the folder have a date newer than or equal to the reference date. If there are fewer than 3, sort all files by the LastWriteTime property and keep the newest 3. If there are enough newer files left, you can delete the old ones:
$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
# get a list of all backup files
$allFiles = Get-ChildItem -Path $Folder -Filter 'archive*.bak' -File
# test how many of these are newer than 5 days ago
$latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
if ($latestFiles.Count -lt 3) {
# if less than three keep the latest 3 files and remove the rest
$allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
}
else {
# there are plenty of newer files, so we can remove the older ones
$allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
}
I have added the -WhatIf safety switch to both Remove-Item cmdlets, so you can first see what would happen before actually destroying files. Once you are satisfied with what the console shows, remove those -WhatIf switches and run again.
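With -WhatIf in place, each would-be deletion is reported to the console instead of being performed, with output along these lines (illustrative):

What if: Performing the operation "Remove File" on target "C:\Backup\archive_1_01022022.bak".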
If you have 9 databases and the number in the filename after archive_ makes the distinction between those database backup files, just put the above inside a loop and adjust the -Filter:
$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
# loop through the 9 database files
for ($i = 1; $i -le 9; $i++) {
# get a list of all backup files per database
$allFiles = Get-ChildItem -Path $Folder -Filter "archive_$($i)_*.bak" -File
# test how many of these are newer than 5 days ago
$latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
if ($latestFiles.Count -lt 3) {
# if less than three keep the latest 3 files and remove the rest
$allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
}
else {
# there are plenty of newer files, so we can remove the older ones
$allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
}
}
OK, so now that we know the example names you gave bear no resemblance to the real names, the code could be as simple as this:
$dbNames = 'archive', 'master', 'documents', 'rb' # the names used in the backup files each database creates
$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
# loop through the database files
foreach ($name in $dbNames) {
# get a list of all backup files per database
$allFiles = Get-ChildItem -Path $Folder -Filter "$($name)_*.bak" -File
# test how many of these are newer than 5 days ago
$latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
if ($latestFiles.Count -lt 3) {
# if less than three keep the latest 3 files and remove the rest
$allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
}
else {
# there are plenty of newer files, so we can remove the older ones
$allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
}
}
Based on the assumption that your backups follow the naming convention DBNAME_ddMMyyyy.bak, where the date corresponds to the backup date, I would do something like the below.
$Params = @{
    MinBackupThreshold = 3   # keep at least this many backups per database (per the question)
    MinBackupDays      = 5
    SimulateDeletion   = $False  # Set to $true to simulate deletion with Remove-Item -WhatIf
}
$Folder = "C:\temp\test"
$CurrentDate = Get-Date
$DateDel = $CurrentDate.AddDays(-$Params.MinBackupDays).Date # set to midnight
$Archives = Foreach ($File in Get-ChildItem $Folder) {
    # -13 comes from assuming the naming convention DBName_8CharBackupDate.ext (eg: Db1_01012022.bak)
    $DbNameEndIndex = $File.Name.Length - 13
    # +1 since our naming convention has an underscore between db name and date
    $RawDateStr = $File.Name.Substring($DbNameEndIndex + 1, 8)
    [PSCustomObject]@{
        Path          = $File.FullName
        LastWriteTime = $File.LastWriteTime
        DBName        = $File.Name.Substring(0, $DbNameEndIndex)
        BackupDate    = [datetime]::ParseExact($RawDateStr, 'ddMMyyyy', $null)
    }
}
# Here we group archives by their "dbname" so we can make sure to keep a minimum number of backups for each.
$GroupedArchives = $Archives | Group-Object DBName
Foreach ($Db in $GroupedArchives) {
    if ($Db.Count -gt $Params.MinBackupThreshold) {
        # Sort newest first so -Skip protects the most recent backups, then prune only old files
        $Db.Group | Sort-Object BackupDate -Descending |
            Select-Object -Skip $Params.MinBackupThreshold |
            Where-Object { $_.BackupDate -lt $DateDel } |
            ForEach-Object { Remove-Item -Path $_.Path -Force -WhatIf:$Params.SimulateDeletion }
    } else {
        # You could include additional checks here to verify the last backup, alert you if there should be more, etc.
    }
}
Note: using the date extracted from the filename will be more accurate than LastWriteTime, which could be updated for other reasons. (Since we have it, we might as well use it.)
Note 2: I added the -WhatIf option to $Params so you can easily switch between actual removal and simulation (Theo's answer gave me the idea of providing that switch), as well as his .Date trick to make sure the reference date is set to midnight instead of the current time of day.
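As a quick illustration of the [datetime]::ParseExact call used above, with the 'ddMMyyyy' format from the naming convention:

# '03022022' is day 03, month 02, year 2022 under 'ddMMyyyy'
[datetime]::ParseExact('03022022', 'ddMMyyyy', $null)  # -> 3 February 2022, 00:00:00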
I'm currently writing a script that checks each folder in a directory for the last time a file was written to it. I'm having trouble figuring out how to obtain the last time a file was written inside a folder, as opposed to just retrieving the folder's own timestamp.
I've tried using PowerShell's recursive approach, but couldn't figure out how to set it up properly. Right now, the script successfully prints the name of each folder to the Excel spreadsheet, but also prints the last write time of each folder itself, which is the incorrect information.
$row = 2
$column = 1
Get-ChildItem "C:\Users\Sylveon\Desktop\Test"| ForEach-Object {
#FolderName
$sheet.Cells.Item($row,$column) = $_.Name
$column++
#LastBackup
$sheet.Cells.Item($row,$column) = $_.LastWriteTime
$column++
#Increment to next Row and reset Column
$row++
$column = 1
}
The current state of the script prints each folder name to the report, but gives the folder's creation date rather than the last time a file was written to that folder.
The following should work to get the most recent edit date of any file in the current directory.
Get-ChildItem | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 -ExpandProperty "LastWriteTime"
Get-ChildItem gets items in your directory
Sort-Object -Property LastWriteTime -Descending sorts by write-time, latest first
Select-Object -first 1 -ExpandProperty "LastWriteTime" gets the first one in the list, then gets its write-time
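To run the same pipeline against the folder from the question, including files in subfolders, you can pass the path and add -Recurse (a sketch; adjust the path as needed):

Get-ChildItem "C:\Users\Sylveon\Desktop\Test" -Recurse |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -First 1 -ExpandProperty "LastWriteTime"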
I made this to get the data you're trying to get. The last line gives us an empty string if the directory is empty, which is probably what's safest for Excel, but you could also default to something other than an empty string, like the directory's creation date:
$ChildDirs = Get-ChildItem | Where-Object { $_ -is [System.IO.DirectoryInfo] }
$EditNames = $ChildDirs | ForEach-Object Name
$EditTimes = $EditNames | ForEach-Object { @( (Get-ChildItem $_ | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 LastWriteTime), '' -ne $null)[0] }
for($i=0; $i -lt $ChildDirs.Length; $i++) {
Write-Output $EditNames[$i]
Write-Output $EditTimes[$i]
}
To implement this for what you're doing, if I understand your question correctly, try the following:
$ChildDirs = Get-ChildItem | Where-Object { $_ -is [System.IO.DirectoryInfo] }
$EditNames = $ChildDirs | ForEach-Object Name
$EditTimes = $EditNames | ForEach-Object { @( (Get-ChildItem $_ | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 LastWriteTime), '' -ne $null)[0] }
for($i=0; $i -lt $ChildDirs.Length; $i++) {
#FolderName
$sheet.Cells.Item($row, $column) = $EditNames[$i]
$column++
#LastBackup
$sheet.Cells.Item($row, $column) = $EditTimes[$i]
$row++
$column = 1
}
If you're only looking at the first level of files in each folder, you can do it using a nested loop:
$row = 2
$column = 1
$folders = Get-ChildItem $directorypath
ForEach ($folder in $folders) {
# start off with LastEdited set to the last write time of the folder itself
$LastEdited = $folder.LastWriteTime
$folderPath = $directoryPath + '\' + $folder.Name
# this 'dynamically' sets each folder's path
$files = Get-Childitem $folderPath
ForEach ($file in $files) {
if ((Get-Date $file.LastWriteTime) -gt (Get-Date $LastEdited)) {
$LastEdited = $file.LastWriteTime
}
}
$sheet.Cells.Item($row,$column) = $folder.Name
$column++
$sheet.Cells.Item($row,$column) = $LastEdited
$row++
$column = 1
}
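As a side note, Join-Path would build $folderPath without the manual string concatenation used above:

$folderPath = Join-Path $directoryPath $folder.Name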
I'm using a PowerShell script to retrieve a file from a remote directory. I only want to retrieve a file if it was modified within the last hour. I was able to get the most recent file using the following code:
$directoryInfo = $session.ListDirectory($remotePath)
$latest =
$directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
I believe that I need to add another condition to the Where-Object clause, but I don't know the proper format. For example,
Where-Object { -Not $_.IsDirectory and <created/modified within the last hour> }
How do I do this? Is there a better/simpler way?
Extend your current Where-Object block to check whether LastWriteTime is greater than (newer than) a DateTime object representing one hour ago. Ex:
$lasthour = (Get-Date).AddHours(-1)
$directoryInfo = $session.ListDirectory($remotePath)
$latest = $directoryInfo.Files |
Where-Object { (-Not $_.IsDirectory) -and ($_.LastWriteTime -gt $lasthour) } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
If you want to download all files created/modified within the last hour, use:
$directoryInfo = $session.ListDirectory($remotePath)
$limit = (Get-Date).AddHours(-1)
$recentFiles =
$directoryInfo.Files |
Where-Object { (-Not $_.IsDirectory) -And ($_.LastWriteTime -Gt $limit) }
foreach ($fileInfo in $recentFiles)
{
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
$session.GetFiles($sourcePath, $localPath + "\*").Check()
}
Some official WinSCP .NET assembly examples used to make the code:
Downloading the most recent file
Listing files matching wildcard
I have logs that are getting written from various Linux servers to a central Windows NAS server. They're in E:\log in the format:
E:\log\process1\log20140901.txt,
E:\log\process2\20140901.txt,
E:\log\process3\log-process-20140901.txt,
etc.
Multiple files get copied on a weekly basis at the same time, so the created date isn't a good way to determine the newest file. Therefore I wrote a PowerShell function to parse the date out, and I'm attempting to iterate through and get the newest file in each folder, using the output of my function as the "date". I'm definitely doing something wrong.
Here's the Powershell I've written so far:
Function ReturnDate ($file)
{
$f = $file
$f = [RegEx]::Matches($f,"(\d{8})") | Select-Object -ExpandProperty Value
$sqlDate = $f.Substring(0,4) + "-" + $f.substring(4,2) + "-" + $f.substring(6,2)
return $sqlDate
}
Get-ChildItem E:\log\* |
Where {$_.PsIsContainer} |
foreach-object { Get-ChildItem $_ -Recurse |
Where {!$_.PsIsContainer} |
ForEach-Object { ReturnDate $_}|
Sort-Object ReturnDate -Descending |
Select-Object -First 1 | Select Name,ReturnDate
}
I seem to be confounding properties and causing "You cannot call a method on a null-valued expression" errors, but I'm uncertain what to do from here.
I suspect your $f variable is null and you're trying to invoke a method (Substring) on a null value. Try this instead:
Get-ChildItem E:\Log -File -Recurse | Where Name -Match '(\d{8})\.' |
Foreach {Add-Member -Inp $_ NoteProperty ReturnDate ($matches[1]) -PassThru} |
Group DirectoryName |
Foreach {$_.Group | Sort ReturnDate -Desc | Select -First 1}
This does require V3 or higher. If you're on V1 or V2 change it to this:
Get-ChildItem E:\Log -Recurse |
Where {!$_.PSIsContainer -and $_.Name -Match '(\d{8})\.'} |
Foreach {Add-Member -Inp $_ NoteProperty ReturnDate ($matches[1]) -PassThru} |
Group DirectoryName |
Foreach {$_.Group | Sort ReturnDate -Desc | Select -First 1}
Your code was OK for me when I tried it, up until the final select: you were requesting Name and ReturnDate when those properties did not exist. Creating a custom object with those values makes your code work. I also removed some of the logic from your pipeline; the end result should still work (I made some dummy files like your examples to test with).
Working with your original code, you could have something like this. It only works on v3 or higher; simple changes, mostly where [pscustomobject] is concerned, could make it work on lower versions if need be.
Function ReturnDate ($file)
{
$f = $file
$f = [RegEx]::Matches($f,"(\d{8})") | Select-Object -ExpandProperty Value
$sqlDate = $f.Substring(0,4) + "-" + $f.substring(4,2) + "-" + $f.substring(6,2)
[pscustomobject] @{
'Name' = $file.FullName
'ReturnDate' = $sqlDate
}
}
Get-ChildItem C:\temp\E\* -Recurse |
Where-Object {!$_.PSIsContainer} |
ForEach-Object{ReturnDate $_} |
Sort-Object ReturnDate -Descending |
Select-Object -First 1
The Sort-Object cmdlet supports sorting by a custom script block and will sort by whatever the script block returns. So, use a regular expression to grab the timestamp and return it.
Get-ChildItem E:\log\* -Directory |
ForEach-Object {
Get-ChildItem $_ -Recurse -File |
Sort-Object -Property {
if( $_.Name -match '(\d{8})' )
{
return $Matches[1]
}
Write-Error ('File ''{0}'' does not contain a timestamp in its name.' -f $_.FullName)
} |
Select-Object -Last 1 |
Select-Object Name
}
Note that Select-Object -First 1 was changed to Select-Object -Last 1, since dates would be sorted from oldest to newest.
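The sort on the captured string works here because the eight-digit yyyyMMdd stamps in these log names sort lexically in the same order as chronologically, for example:

'20140901', '20131231', '20140105' | Sort-Object
# -> 20131231, 20140105, 20140901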
I'll start by stating that I'm pretty new to PowerShell, but from what I hear it can be pretty powerful. With that said, I'll specify the problem.
I'm trying to write a PowerShell script, meant to run daily, that checks the total size of a number of specific folders; inside each of these folders there are folders sorted by, let's say, names.
What I'm aiming for as the final result is a script that checks these folders' sizes and, if they exceed a limit I define beforehand, moves their content to a pre-defined destination.
The files will be moved to a folder with the same name as the one they were located in before.
Here is where I've gotten so far:
$Folder_A = "C:\Users\location_A"
$Folder_B = "C:\Users\location_B"
$Folder_C = "C:\Users\location_C"
if((get-childitem $Folder_A , $Folder_B , $Folder_C | where {$_.name -eq name_1-or $_.name -eq $Name_2} | measure-object -property length -sum).sum -gt 30000) {write-host "success"}
Output:
You must provide a value expression on the right-hand side of the '-eq' operator.
At line:1 char:63
+ if((get-childitem $input , $Temp , $Tiffs | where {$_.name -eq <<<< I001 -or $_.name -eq I002} | measure-object -property length -sum).sum -gt 30000) {
write-host "success"}
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : ExpectedValueExpression
If anyone is able to help progress with this thing I would appreciate it a lot!
Thanks in advance.
You are missing a space in the Where-Object script block (and, like $Name_2, the first name needs a $ sigil or quotes). It should be:
where {$_.name -eq $name_1 -or $_.name -eq $Name_2}
To be sure it works correctly, use parentheses, so your statement becomes:
where {($_.name -eq $name_1) -or ($_.name -eq $Name_2)}
UPDATE
My full (tested) script to calculate the total size of selected directories:
$names = @("IIS", "IIS Express")
$folders = @("C:\Program Files", "C:\Program Files (x86)")
$x = (gci $folders | where {$names -contains $_.name})
$sum = (($x | %{(Get-ChildItem $_.FullName -recurse | Measure-Object -property length -sum)}) | Measure-Object -Property 'Sum' -Sum).sum
if($sum -gt 3000){write-host "success"}
In your case, replace $folders = @("C:\Program Files", "C:\Program Files (x86)") with $folders = @("C:\Users\location_A", "C:\Users\location_B", "C:\Users\location_C"), and put what you need in $names.
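In other words, for the folders from the question, the two setup lines become (with name_1 and name_2 standing in for your real names):

$names = @('name_1', 'name_2')
$folders = @('C:\Users\location_A', 'C:\Users\location_B', 'C:\Users\location_C')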
The canonical approach for checking whether a property matches any of a number of references is to put the reference strings into an array and check if the array contains the property value. The names of the folders you want to check should also go into an array rather than single variables.
Try something like this:
$src = @(
"C:\Users\location_A",
"C:\Users\location_B",
"C:\Users\location_C"
)
$ref = @(
"name_1",
"name_2",
...
)
$found = [bool](Get-ChildItem $src `
| ? { $ref -contains $_.Name } `
| Measure-Object -property Length -sum `
| ? { $_.sum -gt 30000 })
if ($found) { write-host "success" }
You can use an array of directories for the path.
If you use -Recurse (even though there aren't any subdirectories to recurse), you can specify the list of filenames as an array argument to the -Include parameter:
$Folders = 'C:\Users\location_A','C:\Users\location_B','C:\Users\location_C'
$Files = 'name_1','name_2'
if ((((get-childitem $Folders -Recurse -Include $files | select -ExpandProperty length) | measure -sum).sum) -gt 30000)
{write-host 'Success'}