Repeat foreach loop after all iterations completed in PowerShell

Can anyone here help me repeat the code from the beginning after all iterations in the foreach loop have completed? The code below gets all the files containing the pattern 'qwerty', feeds the list into a foreach loop, and displays the filename and the last 10 lines of each file. The code should terminate if there is no new or updated file within a certain amount of time.
$today = (Get-Date).Date
$FILES = Get-ChildItem -Path C:\Test\ |
    Where-Object { $_.LastWriteTime -ge $today } |
    Select-String -Pattern "qwerty" |
    Select-Object FileName -Unique
foreach ($i in $FILES) {
    Write-Host $i -ForegroundColor Red
    Get-Content -Path \\XXXXXX\$i -Tail 10
    Start-Sleep 1
}

You can use this:
For ($r = 0; $r -lt $NumberOfTimesToRepeat; $r++) {
    $today = (Get-Date).Date
    $FILES = Get-ChildItem -Path C:\Test\ |
        Where-Object { $_.LastWriteTime -ge $today } |
        Select-String -Pattern "qwerty" |
        Select-Object FileName -Unique
    foreach ($i in $FILES) {
        Write-Host $i -ForegroundColor Red
        Get-Content -Path \\XXXXXX\$i -Tail 10
        Start-Sleep 1
    }
}
PS: Replace $NumberOfTimesToRepeat with the number of times you want to repeat.

If I understand the question properly, you would like to test for files in a certain folder containing a certain string. For each of these files, the last 10 lines should be displayed.
The first difficulty comes from the fact that you want to do this inside a loop and test new or updated files.
That means you need to keep track of files you have already tested and only display new or updated files. The code below uses a Hashtable $alreadyChecked for that so we can test if a file is either new or updated.
If no new or updated files are found during a certain time, the code should end. To do that, I'm using two other variables: $endTime and $checkTime.
$checkTime gets updated on every iteration, making it the current time
$endTime only gets updated if files were found.
$today = (Get-Date).Date
$sourceFolder = 'D:\Test'
$alreadyChecked = @{}  # a Hashtable to keep track of files already checked
$maxMinutes = 5        # the max time in minutes to perform the loop when no new files or updates are added
$endTime = (Get-Date).AddMinutes($maxMinutes)
do {
    $checkTime = Get-Date
    $files = Get-ChildItem -Path $sourceFolder -File |
        # only files created today and that have not been checked already
        Where-Object { $_.LastWriteTime -ge $today -and
                       (!$alreadyChecked.ContainsKey($_.FullName) -or
                        $alreadyChecked[$_.FullName].LastWriteTime -ne $_.LastWriteTime) } |
        ForEach-Object {
            $filetime = $_.LastWriteTime
            $_ | Select-String -Pattern "qwerty" -SimpleMatch |  # -SimpleMatch if you don't use a Regex match
                 Select-Object Path, FileName, @{Name = 'LastWriteTime'; Expression = { $filetime }}
        }
    if ($files) {
        foreach ($item in $files) {
            Write-Host $item.FileName -ForegroundColor Red
            Write-Host (Get-Content -Path $item.Path -Tail 10)
            Write-Host
            # update the Hashtable to keep track of files already done
            $alreadyChecked[$item.Path] = $item | Select-Object FileName, LastWriteTime
            Start-Sleep 1
        }
        # files were found, so update the time to check for no updates/new files
        $endTime = (Get-Date).AddMinutes($maxMinutes)
    }
    # exit the loop if no new or updated files have been found during $maxMinutes time
} while ($checkTime -le $endTime)
For demo, I'm using 5 minutes to wait for the loop to expire if no new or updated files are found, but you can change that to suit your needs.


Advanced file search with PowerShell

I'm fairly new to PowerShell and programming in general. I want to search for files using PowerShell with multiple conditions. I have managed to write this code:
$Drives = Get-PSDrive -PSProvider 'FileSystem'
$Filename = 'Result'
$IncludeExt = '*csv,*docx'
$StartDate = '11/1/20'
$EndDate = '1/26/21'
Get-ChildItem -Path $Drives.Root -Recurse |
    Where-Object { $IncludeExt -match $_.Extension } |
    Where-Object { $_.BaseName -match $Filename } |
    Where-Object { $_.LastWriteTime -ge $StartDate -AND $_.LastWriteTime -le $EndDate } |
    foreach {
        $Item = $_.Basename
        $Path = $_.FullName
        $Type = $_.Extension
        $Age = $_.CreationTime
        $Path | Select-Object @{n="Name";e={$Item}},
                              @{n="Created";e={$Age}},
                              @{n="filePath";e={$Path}},
                              @{n="Folder/File";e={if($Folder){"Folder"}else{$Type}}}
    } | Export-Csv D:\FFNew.csv -NoTypeInformation
This works well when all the variables are supplied. But how do I get this to work when:
Case 1: If $Filename is empty, it gives all the files with the mentioned extensions that were modified in the range of dates.
Case 2: If $IncludeExt is left empty, it gives all files with the $Filename mentioned; currently it gives only the folders and files modified in the range of dates.
Case 3: If $Filename and $IncludeExt are both left empty, it gives all the files modified between $StartDate and $EndDate.
Pranay,
[EDITED]
Ok, here's the revised (exact) script with notes and sample output. Note: you'll have to change the items that are specific to my machine!
$Drives = Get-PSDrive -PSProvider 'FileSystem'
$Filename = "*"      # for all, or "*partial name*"
$IncludeExt = $Null  # for no ext., or "*.csv","*.docx", etc...
$StartDate = '01/1/2020'  # to ignore this use 1/1/1920
# For latest date use below otherwise specify date.
$EndDate = (Get-Date).ToShortDateString()
# Note: below uses only 3rd drive in the array; remove [2] for all.
$GCIArgs = @{
    Path    = $Drives[2].Root
    Recurse = $True
}
If ($Null -ne $IncludeExt) {
    $GCIArgs.Add("Include", $IncludeExt)
}
Get-ChildItem @GCIArgs |
    Where-Object { ($_.BaseName -Like $Filename) -and
                   ($_.LastWriteTime -ge $StartDate) -and
                   ($_.LastWriteTime -le $EndDate) } |
    foreach {
        $Item = $_.Basename
        $Path = $_.FullName
        $Type = & { if ($_.PSIsContainer) { "Folder" } else { $_.Extension } }
        $Age  = $_.CreationTime
        $Path | Select-Object @{n="Name"       ;e={$Item}},
                              @{n="Created"    ;e={$Age}} ,
                              @{n="filePath"   ;e={$Path}},
                              @{n="Folder/File";e={$Type}}
    } | Export-Csv -LiteralPath 'G:\BEKDocs\FFNew.csv' -NoTypeInformation
Notes:
$IncludeExt is specified as $Null if it is not used; if used, the list looks like this: "*.csv","*.docx"
$Filename is specified as "*" for all filenames. I also changed the test from -match to -like, so partial filenames should include *, e.g. "*partial name*".
Notice I moved the check for extensions to the -Include parameter of Get-ChildItem vs checking in the Where-Object.
Changed the piping of data to successive Where-Object clauses and replaced it with the -and operator; same effect and more efficient.
Changed the test for directories to use the PSIsContainer property; I couldn't see where you were getting the value for $Folder.
Removed the continuation characters from the Select-Object, as the comma serves that purpose and is cleaner.
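To illustrate the -match vs -like distinction in the notes above, a quick sketch (the sample string is hypothetical, not from the question):

```powershell
# -like uses wildcards and must cover the whole string;
# -match uses a regex and matches any substring by default
'Result_2021' -like  '*Result*'   # True: wildcard pattern covers the string
'Result_2021' -like  'Result'     # False: no wildcards, so no full-string match
'Result_2021' -match 'Result'     # True: regex substring match
```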
Sample output on a single drive (per the code shown above), with some lines hidden for space considerations, but notice the last line number.
Sample output on all drives (code edited as per the comment in the code), again with lines hidden for space, but showing multiple drives and the final line number.
HTH

How do I write a Powershell script that checks when the last time a file was added to a folder?

I'm currently writing a script that checks each folder in a directory for the last time a file was written to each folder. I'm having trouble figuring out how to obtain the last time a file was written to the folder, as opposed to just retrieving the folder's creation date.
I've tried using PowerShell's recursive method, but couldn't figure out how to properly set it up. Right now, the script successfully prints the name of each folder to the Excel spreadsheet, and also prints the last write time of each folder, which is the incorrect information.
$row = 2
$column = 1
Get-ChildItem "C:\Users\Sylveon\Desktop\Test" | ForEach-Object {
    # FolderName
    $sheet.Cells.Item($row,$column) = $_.Name
    $column++
    # LastBackup
    $sheet.Cells.Item($row,$column) = $_.LastWriteTime
    $column++
    # Increment to next Row and reset Column
    $row++
    $column = 1
}
The current state of the script prints each folder name to the report, but gives the folder's creation date rather than the last time a file was written to that folder.
The following should work to get the most recent edit date of any file in the current directory.
Get-ChildItem | Sort-Object -Property LastWriteTime -Descending | Select-Object -first 1 -ExpandProperty "LastWriteTime"
Get-ChildItem gets items in your directory
Sort-Object -Property LastWriteTime -Descending sorts by write-time, latest first
Select-Object -first 1 -ExpandProperty "LastWriteTime" gets the first one in the list, then gets its write-time
I made this to get the data you're trying to get. The last line gives us an empty string if the directory is empty, which is probably what's safest for Excel, but you could also default to something other than an empty string, like the directory's creation date:
$ChildDirs = Get-ChildItem | Where-Object { $_ -is [System.IO.DirectoryInfo] }
$EditNames = $ChildDirs | ForEach-Object Name
$EditTimes = $EditNames | ForEach-Object { @( (Get-ChildItem $_ | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1 LastWriteTime), '' -ne $null)[0] }
for ($i = 0; $i -lt $ChildDirs.Length; $i++) {
    Write-Output $EditNames[$i]
    Write-Output $EditTimes[$i]
}
To implement this for what you're doing, if I understand your question correctly, try the following:
$ChildDirs = Get-ChildItem | Where-Object { $_ -is [System.IO.DirectoryInfo] }
$EditNames = $ChildDirs | ForEach-Object Name
$EditTimes = $EditNames | ForEach-Object { @( (Get-ChildItem $_ | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1 LastWriteTime), '' -ne $null)[0] }
for ($i = 0; $i -lt $ChildDirs.Length; $i++) {
    # FolderName
    $sheet.Cells.Item($row, $column) = $EditNames[$i]
    $column++
    # LastBackup
    $sheet.Cells.Item($row, $column) = $EditTimes[$i]
    $row++
    $column = 1
}
If you're only looking at the first level of files in each folder, you can do it using a nested loop:
$row = 2
$column = 1
$folders = Get-ChildItem $directorypath
ForEach ($folder in $folders) {
    # start off with LastEdited set to the last write time of the folder itself
    $LastEdited = $folder.LastWriteTime
    # this 'dynamically' sets each folder's path
    $folderPath = $directoryPath + '\' + $folder.Name
    $files = Get-ChildItem $folderPath
    ForEach ($file in $files) {
        if ((Get-Date $file.LastWriteTime) -gt (Get-Date $LastEdited)) {
            $LastEdited = $file.LastWriteTime
        }
    }
    $sheet.Cells.Item($row,$column) = $folder.Name
    $column++
    $sheet.Cells.Item($row,$column) = $LastEdited
    $row++
    $column = 1
}

Delete massive amount of files without running out of memory

There is a COTS app we have that creates reports and never deletes them, so we need to start cleaning them up. I started doing a foreach and would run out of memory on the server (36 GB) when it got to around 50 million files. After searching, it seemed you could change it like so:
Get-ChildItem -path $Path -recurse | foreach {
and it won't hold everything in memory but will process each item one at a time. Even so, I can only get to 140 million files before I run out of memory.
Clear-Host
# Set age to look for
$TimeLimit = (Get-Date).AddMonths(-4)
$Path = "D:\CC\LocalStorage"
$TotalFileCount = 0
$TotalDeletedCount = 0
Get-ChildItem -Path $Path -Recurse | foreach {
    if ($_.LastWriteTime -le $TimeLimit) {
        $TotalDeletedCount += 1
        $_.Delete()
    }
    $TotalFileCount += 1
    $FileDiv = $TotalFileCount % 10000
    if ($FileDiv -eq 0 -and $TotalFileCount -ne 0) {
        $TF = [string]::Format('{0:N0}', $TotalFileCount)
        $TD = [string]::Format('{0:N0}', $TotalDeletedCount)
        Write-Host "Files Scanned : " -ForegroundColor Green -NoNewline
        Write-Host "$TF" -ForegroundColor Yellow -NoNewline
        Write-Host " Deleted: " -ForegroundColor Green -NoNewline
        Write-Host "$TD" -ForegroundColor Yellow
    }
}
Is there a better way to do this? My only other thought was not to use the -Recurse switch but to write my own function that calls itself for each directory.
EDIT:
I used the code provided in the first answer and it does not solve the issue. Memory is still growing.
$limit = (Get-Date).Date.AddMonths(-3)
$totalcount = 0
$deletecount = 0
$Path = "D:\CC\"
Get-ChildItem -Path $Path -Recurse -File | Where-Object { $_.LastWriteTime -lt $limit } | Remove-Item -Force
Using the ForEach-Object and the pipeline should actually prevent the code from running out of memory. If you're still getting OOM exceptions I suspect that you're doing something in your code that counters this effect, which you didn't tell us about.
With that said, you should be able to clean up your data directory with something like this:
$limit = (Get-Date).Date.AddMonths(-4)
Get-ChildItem -Path $Path -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $limit } |
    Remove-Item -Force -WhatIf
Remove the -WhatIf switch after you verified that everything is working.
If you need the total file count and the number of deleted files, add counters like this:
$totalcount = 0
$deletecount = 0
Get-ChildItem -Path $Path -Recurse -File |
    ForEach-Object { $totalcount++; $_ } |
    Where-Object { $_.LastWriteTime -lt $limit } |
    ForEach-Object { $deletecount++; $_ } |
    Remove-Item -Force -WhatIf
I don't recommend printing status information to the console when you're bulk-processing large numbers of files. The output could significantly slow down the processing. If you must have that information, write it to a log file and tail that file separately.
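If you do want periodic progress, here is a sketch of the log-file approach suggested above; the log path and the 100,000-file interval are placeholder choices, not from the original answer:

```powershell
$totalcount = 0
Get-ChildItem -Path $Path -Recurse -File |
    ForEach-Object {
        $totalcount++
        # append a progress line every 100,000 files;
        # follow it separately with: Get-Content D:\cleanup.log -Wait
        if ($totalcount % 100000 -eq 0) {
            Add-Content -Path 'D:\cleanup.log' -Value "$(Get-Date -Format s) scanned $totalcount files"
        }
        $_
    } |
    Where-Object { $_.LastWriteTime -lt $limit } |
    Remove-Item -Force -WhatIf
```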

Using Powershell to replace multiple strings in multiple files & folders

I have a list of strings in a CSV file. The format is:
OldValue,NewValue
223134,875621
321321,876330
....
and the file contains a few hundred rows (each OldValue is unique). I need to process changes over a number of text files in a number of folders & subfolders. My best guess of the number of folders, files, and lines of text are - 15 folders, around 150 text files in each folder, with approximately 65,000 lines of text in each folder (between 400-500 lines per text file).
I will make 2 passes at the data, unless I can do it in one. First pass is to generate a text file I will use as a check list to review my changes. Second pass is to actually make the change in the file. Also, I only want to change the text files where the string occurs (not every file).
I'm using the following Powershell script to go through the files & produce a list of the changes needed. The script runs, but is beyond slow. I haven't worked on the replace logic yet, but I assume it will be similar to what I've got.
# replace a string in a file with powershell
[reflection.assembly]::loadwithpartialname("Microsoft.VisualBasic") | Out-Null

Function Search {
    # Parameters $Path and $SearchString
    param ([Parameter(Mandatory=$true, ValueFromPipeline = $true)][string]$Path,
           [Parameter(Mandatory=$true)][string]$SearchString
    )
    try {
        # .NET FindInFiles method to look for the file
        [Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles(
            $Path,
            [Microsoft.VisualBasic.FileIO.SearchOption]::SearchAllSubDirectories,
            $SearchString
        )
    } catch { $_ }
}

if (Test-Path "C:\Work\ListofAllFilenamesToSearch.txt") { # if file exists
    Remove-Item "C:\Work\ListofAllFilenamesToSearch.txt"
}
if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}
$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames1 = Search $filefolder1 $ftype
$filenames1 | Out-File "C:\Work\ListofAllFilenamesToSearch.txt" -Width 2000
if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}
(Get-Content "C:\Work\NumberXrefList.CSV" | where {$_.readcount -gt 1}) | foreach {
    $OldFieldValue, $NewFieldValue = $_.Split("|")
    $filenamelist = (Get-Content "C:\Work\ListofAllFilenamesToSearch.txt" -ReadCount 5) #|
    foreach ($j in $filenamelist) {
        #$testvar = (Get-Content $j )
        #$testvar = (Get-Content $j -ReadCount 100)
        $testvar = (Get-Content $j -Delimiter "\n")
        Foreach ($i in $testvar) {
            if ($i -imatch $OldFieldValue) {
                $j + "|" + $OldFieldValue + "|" + $NewFieldValue | Out-File "C:\Work\FilesThatNeedToBeChanged.txt" -Width 2000 -Append
            }
        }
    }
}
$FileFolder = (Get-Content "C:\Work\FilesThatNeedToBeChanged.txt" -ReadCount 5)
Get-ChildItem $FileFolder -Recurse |
    select -ExpandProperty fullname |
    foreach {
        if (Select-String -Path $_ -SimpleMatch $OldFieldValue -Debug -Quiet) {
            (Get-Content $_) |
                ForEach-Object { $_ -replace $OldFieldValue, $NewFieldValue } |
                Set-Content $_ -WhatIf
        }
    }
In the code above, I've tried several things with Get-Content - default, with -ReadCount, and -Delimiter - in an attempt to avoid an out of memory error.
The only thing I have control over is the length of the old & new replacement strings file. Is there a way to do this in Powershell? Is there a better option/solution? I'm running Windows 7, Powershell version 3.0.
Your main problem is that you're reading the file over and over again to change each of the terms. You need to invert the looping of the replace terms and looping of the files. Also, pre-load the csv. Something like:
$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames = gci -Path $filefolder1 -Filter $ftype -Recurse
$replaceValues = Import-Csv -Path "C:\Work\NumberXrefList.CSV"
foreach ($file in $filenames) {
    $contents = Get-Content -Path $file
    foreach ($replaceValue in $replaceValues) {
        $contents = $contents -replace $replaceValue.OldValue, $replaceValue.NewValue
    }
    Copy-Item $file "$file.old"
    Set-Content -Path $file -Value $contents
}
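One caveat worth adding (my note, not part of the original answer): -replace treats its left-hand pattern as a regular expression. The numeric IDs in the question are safe, but if OldValue could ever contain regex metacharacters, escape it first:

```powershell
# escape the search term so it is matched literally, not as a regex
$contents = $contents -replace [regex]::Escape($replaceValue.OldValue), $replaceValue.NewValue
```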

Check files on remote computers for time stamp older than X hours and export results to CSV

We are trying to run a script against a pile of remote computers to check the date stamps of files in a fixed folder that are older than, say, 12 hours and return the results to a CSV. The date range needs to be flexible, as it's a set time of 6 pm yesterday, which will move as time moves on.
$computers = Get-Content -Path computers.txt
$filePath = "c:\temp\profile"
$numdays = 0
$numhours = 12
$nummins = 5
function ShowOldFiles($filepath, $days, $hours, $mins)
{
    $files = $computers #(get-childitem $filepath -include *.* -recurse | where {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.psIsContainer -eq $false)})
    if ($files -ne $NULL)
    {
        for ($idx = 0; $idx -lt $files.Length; $idx++)
        {
            $file = $files[$idx]
            write-host ("Old: " + $file.Name) -Fore Red
        }
    }
}
Write-Output $computers, $numdays, $numhours, $nummins >> computerlist.txt
Write-output $computers, $numdays, $numhours, $nummins >> computerlist.txt
You could run the follow script on all of your remote machines:
$computers = Get-Content -Path computers.txt
$logFile = "\\ServerName\C$\Logfile.txt"
$date = "12/03/2002 12:00"
$limit = Get-Date $date
$computers | %{
    $filePath = "\\$_\C$\temp\profile"
    $files = $null
    $files = Get-ChildItem -Path $filePath -Recurse -Force |
        Where-Object { $_.CreationTime -lt $limit }
    If ($files -ne $null) {
        "-------------------------[$($_)]------------------------" >> $logFile
        $files | Foreach { $_.FullName >> $logFile }
    }
}
This will check the given folder ($filePath) for files that are older than the limit given. Files older than the limit will have their full file path logged to the given network location, $logFile.
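One caution I'd add here (my note, not from the original answer): Get-Date "12/03/2002 12:00" parses the string with the machine's current culture, so it can mean March 12 or December 3 depending on locale. To pin the format explicitly:

```powershell
# parse the cutoff with an explicit format so the result is locale-independent
$limit = [datetime]::ParseExact('12/03/2002 12:00', 'dd/MM/yyyy HH:mm', [cultureinfo]::InvariantCulture)
```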
With a small alteration to @chard's earlier code, I managed to get a workable solution.
The output log file only returns the files that are older than the date in the code.
This can be manipulated in Excel with other outputs for our needs.
I will try the updated code above in a bit.
$computers = Get-Content -Path "C:\temp\computers.txt"
$logFile = "\\SERVER\logs\output.txt"
$numdays = 3
$numhours = 10
$nummins = 5
$limit = (Get-Date).AddDays(-$numdays).AddHours(-$numhours).AddMinutes(-$nummins)
$computers | %{
    $filePath = "\\$_\C$\temp\profile\runtime.log"
    Get-ChildItem -Path $filePath -Recurse -Force |
        Where-Object { $_.LastWriteTime -lt $limit } |
        foreach { "$($_)" >> $logFile }
}