I have the following code in use:
$Folder="C:\Perflogs\BBCRMLogs" # Change the bit in the quotation marks to whatever directory you want the log file stored in
$Computer = $env:COMPUTERNAME
$1GBInBytes = 1GB
$p = "LOTS OF COUNTERS";
# If you want to change the performance counters, change the above list. However, these are the recommended counters for a client machine.
$dir = test-path $Folder
IF($dir -eq $False)
{
New-Item $Folder -type directory
$num = 0
$file = "$Folder\SQL_log_${num}.csv"
Get-Counter -counter $p -SampleInterval 2 -Continuous |
Foreach {
if ((Get-Item $file).Length -gt 1MB) {
$num +=1;$file = "$Folder\SQL_log_${num}.csv"
}
$_
} |
Export-Counter $file -Force -FileFormat CSV
}
Else
{
$num = 0
$file = "$Folder\SQL_log_${num}.csv"
Get-Counter -counter $p -SampleInterval 2 -Continuous |
Foreach {
if ((Get-Item $file).Length -gt 1MB) {
$num +=1;$file = "$Folder\SQL_log_${num}.csv"
}
$_
} |
Export-Counter $file -Force -FileFormat CSV
}
However, even when ((Get-Item $file).Length -gt 1MB) is TRUE, the file number doesn't increment. My thought is that the Foreach loop isn't being called each time a sample is taken, since Get-Counter is only invoked once (and then keeps running). I'm not sure what construct I should be using to make sure each sample passes through that loop. Should I isolate that Foreach statement into a separate section, rather than relying on it being called as part of the Get-Counter pipeline? This PowerShell script is called by a batch file, and the Get-Counter part runs in the background, collecting information.
The problem is that the $file variable passed to Export-Counter is only evaluated once, when Export-Counter starts executing. You could pipe the results of Get-Counter to ForEach-Object and export inside it (forcing $file to be re-evaluated), but that would overwrite the output file on each iteration, and unfortunately Export-Counter doesn't have an -Append switch.
Off the top of my head, you could export to CSV with Export-Csv, which supports appending to the file in v3. That said, you won't get the same CSV structure.
Two more things. On the first execution of the script, the first file has not been created yet, and then you check for its length. That gives a "file not found" error; use the ErrorAction parameter to suppress it.
You also don't need to repeat the code twice. Check whether the output directory exists, create it if it doesn't, and then continue with a single copy of the rest of the script.
$Folder = 'D:\temp'
$num = 0
$file = "$Folder\SQL_log_${num}.csv"
if( !(test-path $folder)) {New-Item $Folder -type directory}
Get-Counter -counter $p -SampleInterval 2 -Continuous | Foreach {
if ((Get-Item $file -ErrorAction SilentlyContinue ).Length -gt 1mb)
{
$num +=1
$file = "$Folder\SQL_log_${num}.csv"
}
$_
} | Foreach-Object { $_ | Export-Csv $file -Append}
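If you want something closer to the Export-Counter layout (one column per counter rather than one row per raw sample), a rough sketch along these lines might help. The Timestamp/column-per-path shape is my assumption, not what Export-Counter emits exactly, and the file-rotation logic is omitted for brevity:
Get-Counter -Counter $p -SampleInterval 2 -Continuous | ForEach-Object {
    $sampleSet = $_
    # one output row per sample set: a Timestamp column plus one column per counter path
    $row = [ordered]@{ Timestamp = $sampleSet.Timestamp }
    foreach ($sample in $sampleSet.CounterSamples) {
        $row[$sample.Path] = $sample.CookedValue
    }
    [pscustomobject]$row | Export-Csv $file -Append -NoTypeInformation
}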
I am using Powershell 7.
We have the following PowerShell script that parses some very large files.
I no longer want to use 'Get-Content' as it is too slow.
The script below works, but it takes a very long time to process even a 10 MB file.
I have about 200 files of roughly 10 MB each, each with over 10,000 lines.
Sample Log:
#Fields:1
#Fields:2
#Fields:3
#Fields:4
#Fields: date-time,connector-id,session-id,sequence-number,local-endpoint,remote-endpoint,event,data,context
2023-01-31T13:53:50.404Z,EXCH1\Relay-EXCH1,08DAD23366676FF1,41,10.10.10.2:25,195.85.212.22:15650,<,DATA,
2023-01-31T13:53:50.404Z,EXCH1\Relay-EXCH1,08DAD23366676FF1,41,10.10.10.2:25,195.85.212.25:15650,<,DATA,
Script:
$Output = @()
$LogFilePath = "C:\LOGS\*.log"
$LogFiles = Get-Item $LogFilePath
$Count = @($logfiles).count
ForEach ($Log in $LogFiles)
{
$Int = $Int + 1
$Percent = $Int/$Count * 100
Write-Progress -Activity "Collecting Log details" -Status "Processing log File $Int of $Count - $Log" -PercentComplete $Percent
Write-Host "Processing Log File $Log" -ForegroundColor Magenta
Write-Host
$FileContent = Get-Content $Log | Select-Object -Skip 5
ForEach ($Line IN $FileContent)
{
$Socket = $Line | Foreach {$_.split(",")[5] }
$IP = $Socket.Split(":")[0]
$Output += $IP
}
}
$Output = $Output | Select-Object -Unique
$Output = $Output | Sort-Object
Write-Host "List of noted remove IPs:"
$Output
Write-Host
$Output | Out-File $PWD\Output.txt
As @iRon suggests, the assignment operator (+=) carries a lot of overhead, as does reading the entire file into a variable and then processing it. Instead, process it strictly as a pipeline. I achieved the same results, using your sample data, with the code written this way below.
$LogFilePath = "C:\LOGS\*.log"
$LogFiles = Get-ChildItem $LogFilePath
$Count = @($logfiles).count
$Output = ForEach($Log in $Logfiles) {
# Code for Write-Progress here
Get-Content -Path $Log.FullName | Select-Object -Skip 5 | ForEach-Object {
$Socket = $_.split(",")[5]
$IP = $Socket.Split(":")[0]
$IP
}
}
$Output = $Output | Select-Object -Unique
$Output = $Output | Sort-Object
Write-Host "List of noted remove IPs:"
$Output
Apart from the notable points in the comments, I believe this question is more suitable to Code Review. Nonetheless, here's my take on this using the StreamReader class:
$LogFilePath = "C:\LOGS\*.log"
$LogFiles = Get-Item -Path $LogFilePath
$OutPut = [System.Collections.ArrayList]::new()
foreach ($log in $LogFiles)
{
$skip = 0
$stop = $false
$stream = [System.IO.StreamReader]::new($log.FullName)
while (($line = $stream.ReadLine()) -ne $null)
{
    if (-not $stop)
    {
        # skip the 5 header lines before processing data rows
        if (++$skip -eq 5)
        {
            $stop = $true
        }
        continue
    }
    # remote-endpoint column (6th comma-separated field), IP part only
    elseif ($OutPut.Contains(($IP = $line.Split(',')[5].Split(':')[0])))
    {
        continue
    }
    $null = $OutPut.Add($IP)
}
$stream.Close()
$stream.Dispose()
}
# Display OutPut and save to file
Write-Host -Object "List of noted remote IPs:"
$OutPut | Sort-Object | Tee-Object -FilePath "$PWD\Output.txt"
This way you output unique IPs, since the if statement checks against what's already in $OutPut, essentially replacing Select-Object -Unique. You should see a speed increase since you're no longer adding to a fixed-size array (+=) or piping to other cmdlets.
You can combine File.ReadLines with Enumerable.Skip to read your files and skip their first 5 lines. This method is much faster than Get-Content. Then for sorting and getting unique strings at the same time you can use a SortedSet<T>.
You should avoid using Write-Progress as this will slow your script down in Windows PowerShell (this has been fixed in newer versions of PowerShell Core).
Do note that because you're looking to sort the result, all strings must be contained in memory before outputting to a file. This would be much more efficient if sorting were not needed; in that case you would use a HashSet<T> instead to get unique values (a variant is sketched after the example below).
Get-Item C:\LOGS\*.log | & {
begin { $set = [Collections.Generic.SortedSet[string]]::new() }
process {
foreach($line in [Linq.Enumerable]::Skip([IO.File]::ReadLines($_.FullName), 5)) {
$null = $set.Add($line.Split(',')[5].Split(':')[0])
}
}
end {
$set
}
} | Set-Content $PWD\Output.txt
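If sorting isn't required, a minimal variant of the same pipeline using HashSet[string] instead of SortedSet[string] (same assumptions as above, i.e. PowerShell 7):
Get-Item C:\LOGS\*.log | & {
    begin { $set = [Collections.Generic.HashSet[string]]::new() }
    process {
        foreach($line in [Linq.Enumerable]::Skip([IO.File]::ReadLines($_.FullName), 5)) {
            # same parsing as above: remote-endpoint column, IP part only
            $null = $set.Add($line.Split(',')[5].Split(':')[0])
        }
    }
    end {
        $set
    }
} | Set-Content $PWD\Output.txt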
I looked into some of the posts here but didn't find an answer to my situation. I have the following script in which I go through multiple servers checking the hash of certain files. I use foreach to do this and would like to pass each server as a job that runs the inner foreach loop; otherwise it takes an awfully long time.
Here's the code:
$Modulos = Get-Content -Path .\mod.txt |sort-object -unique
$Orig = "H:\Redsys_Client"
$Machines = Get-Content -Path .\servers.txt |sort-object -unique
function CheckFileHash ($file, $file2 ,$cLogFile) {
$hashSrc = Get-FileHash $file -Algorithm "SHA256"
$hashDest = Get-FileHash $file2 -Algorithm "SHA256"
if ((test-path -Path $cLogFile ) -eq $False){
Add-Content -Path $cLogFile -Value "Algor; Hash_Orig; Orig; Algor; Hash_Dest; Dest; Result"
}
If ($hashSrc.Hash -ne $hashDest.Hash)
{
Add-Content -Path $cLogFile -Value "$hashSrc; $hashDest; the files are NOT EQUAL."
}
elseif ($hashSrc.Hash -eq $hashDest.Hash){
Add-Content -Path $cLogFile -Value "$hashSrc; $hashDest; the files are EQUAL."
}
}
Foreach ($Server in $Machines){
Foreach ($name in $Modulos){
CheckFileHash "$Orig\$name\$name.exe" "\\$Server\Redsys\$name\$name.exe" ".\test.csv"
}
}
Does anyone have any idea how to do this kind of job? I'm afraid it would just make a mess of the CSV, so I'd like to try storing the results in a variable instead.
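One way to approach it, as a rough sketch only: the per-server log name .\test_$Server.csv is a hypothetical choice so that parallel jobs don't all write to the same CSV, the function has to be redefined inside the job because jobs run in a separate process, and mapped drives such as H: (and the job's working directory) may not be what the caller sees.
$jobs = foreach ($Server in $Machines) {
    Start-Job -Name $Server -ScriptBlock {
        param($Server, $Modulos, $Orig)

        # redefine the helper inside the job; the parent scope is not available here
        function CheckFileHash ($file, $file2, $cLogFile) {
            $hashSrc  = Get-FileHash $file  -Algorithm SHA256
            $hashDest = Get-FileHash $file2 -Algorithm SHA256
            if (-not (Test-Path -Path $cLogFile)) {
                Add-Content -Path $cLogFile -Value "Algor; Hash_Orig; Orig; Algor; Hash_Dest; Dest; Result"
            }
            if ($hashSrc.Hash -ne $hashDest.Hash) {
                Add-Content -Path $cLogFile -Value "$hashSrc; $hashDest; the files are NOT EQUAL."
            }
            else {
                Add-Content -Path $cLogFile -Value "$hashSrc; $hashDest; the files are EQUAL."
            }
        }

        foreach ($name in $Modulos) {
            # hypothetical per-server log file to avoid concurrent writes to one CSV
            CheckFileHash "$Orig\$name\$name.exe" "\\$Server\Redsys\$name\$name.exe" ".\test_$Server.csv"
        }
    } -ArgumentList $Server, $Modulos, $Orig
}
$jobs | Wait-Job | Receive-Job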
Can anyone here help me repeat the code from the beginning after all iterations in the foreach loop have completed? The code below gets all files containing the 'qwerty' pattern, feeds the list into a foreach loop, and displays the filename and last 10 lines of each file; the code should terminate if there is no new or updated file within a certain amount of time.
$today=(Get-date).Date
$FILES=Get-ChildItem -Path C:\Test\ | `
Where-Object {$_.LastWriteTime -ge $today} | `
Select-String -pattern "qwerty" | `
Select-Object FileName -Unique
foreach ($i in $FILES) {
Write-host $i -foregroundcolor red
Get-content -Path \\XXXXXX\$i -tail 10
Start-Sleep 1
}
You can use this:
For ($r = 0; $r -lt $NumberOfTimesToRepeat; $r++) {
$today=(Get-date).Date
$FILES=Get-ChildItem -Path C:\Test\ | `
Where-Object {$_.LastWriteTime -ge $today} | `
Select-String -pattern "qwerty" | `
Select-Object FileName -Unique
foreach ($i in $FILES) {
Write-host $i -foregroundcolor red
Get-content -Path \\XXXXXX\$i -tail 10
Start-Sleep 1
}
}
PS: Set $NumberOfTimesToRepeat to the number of times you want to repeat.
If I understand the question properly, you would like to test for files in a certain folder containing a certain string. For each of these files, the last 10 lines should be displayed.
The first difficulty comes from the fact that you want to do this inside a loop and test new or updated files.
That means you need to keep track of files you have already tested and only display new or updated files. The code below uses a Hashtable $alreadyChecked for that so we can test if a file is either new or updated.
If no new or updated files are found during a certain time, the code should end. To do that, I'm using two other variables: $endTime and $checkTime.
$checkTime gets updated on every iteration, making it the current time
$endTime only gets updated if files were found.
$today = (Get-Date).Date
$sourceFolder = 'D:\Test'
$alreadyChecked = @{} # a Hashtable to keep track of files already checked
$maxMinutes = 5 # the max time in minutes to perform the loop when no new files or updates are added
$endTime = (Get-Date).AddMinutes($maxMinutes)
do {
$checkTime = Get-Date
$files = Get-ChildItem -Path $sourceFolder -File |
# only files created today and that have not been checked already
Where-Object {$_.LastWriteTime -ge $today -and
(!$alreadyChecked.ContainsKey($_.FullName) -or
$alreadyChecked[$_.FullName].LastWriteTime -ne $_.LastWriteTime) } |
ForEach-Object {
$filetime = $_.LastWriteTime
$_ | Select-String -Pattern "qwerty" -SimpleMatch | # -SimpleMatch if you don't use a Regex match
Select-Object Path, FileName, #{Name = 'LastWriteTime'; Expression = { $filetime }}
}
if ($files) {
foreach ($item in $files) {
Write-Host $item.Filename -ForegroundColor Red
Write-Host (Get-content -Path $item.Path -Tail 10)
Write-Host
# update the Hashtable to keep track of files already done
$alreadyChecked[$item.Path] = $item | Select-Object FileName, LastWriteTime
Start-Sleep 1
}
# files were found, so update the time to check for no updates/new files
$endTime = (Get-Date).AddMinutes($maxMinutes)
}
# exit the loop if no new or updated files have been found during $maxMinutes time
} while ($checkTime -le $endTime)
For demo, I'm using 5 minutes to wait for the loop to expire if no new or updated files are found, but you can change that to suit your needs.
I am getting a memory exception while running this code. Is there a way to filter one file at a time, write the output, and append after processing each file? It seems the code below loads everything into memory.
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Get-ChildItem $inputFolder -File -Filter '*.csv' |
ForEach-Object { Import-Csv $_.FullName } |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Maybe you can import and filter your files one by one and append the result to your output file like this:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Remove-Item $outputFile -Force -ErrorAction SilentlyContinue
Get-ChildItem $inputFolder -Filter "*.csv" -file | %{import-csv $_.FullName | where machine_type -eq 'workstations' | export-csv $outputFile -Append -notype }
Note: The reason for not using Get-ChildItem ... | Import-Csv ... - i.e., for not directly piping Get-ChildItem to Import-Csv and instead having to call Import-Csv from the script block ({ ... }) of an auxiliary ForEach-Object call - is a bug in Windows PowerShell that has since been fixed in PowerShell Core; see the bottom section for a more concise workaround.
However, even output from ForEach-Object script blocks should stream to the remaining pipeline commands, so you shouldn't run out of memory - after all, a salient feature of the PowerShell pipeline is object-by-object processing, which keeps memory use constant, irrespective of the size of the (streaming) input collection.
You've since confirmed that avoiding the aux. ForEach-Object call does not solve the problem, so we still don't know what causes your out-of-memory exception.
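To illustrate the streaming point, a minimal side-by-side sketch using the same cmdlets as the question (illustrative only):
# Streaming: each CSV row flows straight through to Export-Csv and can be released.
Get-ChildItem $inputFolder -File -Filter '*.csv' |
    ForEach-Object { Import-Csv $_.FullName } |
    Where-Object { $_.machine_type -eq 'workstations' } |
    Export-Csv $outputFile -NoTypeInformation

# Collecting: every matching row is materialized in $rows before anything is written.
$rows = Get-ChildItem $inputFolder -File -Filter '*.csv' |
    ForEach-Object { Import-Csv $_.FullName } |
    Where-Object { $_.machine_type -eq 'workstations' }
$rows | Export-Csv $outputFile -NoTypeInformation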
Update:
This GitHub issue contains clues as to the reason for excessive memory use, especially with many properties that contain small amounts of data.
This GitHub feature request proposes using strongly typed output objects to help the issue.
The following workaround, which uses the switch statement to process the files as text files, may help:
$header = ''
Get-ChildItem $inputFolder -Filter *.csv | ForEach-Object {
$i = 0
switch -Wildcard -File $_.FullName {
'*workstations*' {
# NOTE: If no other columns contain the word `workstations`, you can
# simplify and speed up the command by omitting the `ConvertFrom-Csv` call
# (you can make the wildcard matching more robust with something
# like '*,workstations,*')
if ((ConvertFrom-Csv "$header`n$_").machine_type -ne 'workstations') { continue }
$_ # row whose 'machine_type' column value equals 'workstations'
}
default {
if ($i++ -eq 0) {
if ($header) { continue } # header already written
else { $header = $_; $_ } # header row of 1st file
}
}
}
} | Set-Content $outputFile
Here's a workaround for the bug of not being able to pipe Get-ChildItem output directly to Import-Csv, by passing it as an argument instead:
Import-Csv -LiteralPath (Get-ChildItem $inputFolder -File -Filter *.csv) |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Note that in PowerShell Core you could more naturally write:
Get-ChildItem $inputFolder -File -Filter *.csv | Import-Csv |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Solution 2:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8 # modify encoding if necessary
$Delimiter=','
# find the header for your files => take the first row of the first file that has data
$Header = Get-ChildItem -Path $inputFolder -Filter *.csv | Where length -gt 0 | select -First 1 | Get-Content -TotalCount 1
# if no header was found then there is no file with size > 0 => quit
if(! $Header) {return}
#create array for header
$HeaderArray=$Header -split $Delimiter -replace '"', ''
#open output file
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
# write the header that was found
$w.WriteLine($Header)
# loop over the csv files
Get-ChildItem $inputFolder -File -Filter "*.csv" | %{
#open file for read
$r = New-Object System.IO.StreamReader($_.fullname, $encoding)
$skiprow = $true
while (($line = $r.ReadLine()) -ne $null)
{
#exclude header
if ($skiprow)
{
$skiprow = $false
continue
}
# get an object for the current row using the header found above
$Object=$line | ConvertFrom-Csv -Header $HeaderArray -Delimiter $Delimiter
# write the row to the output file if it matches the requested condition
if ($Object.machine_type -eq 'workstations') { $w.WriteLine($line) }
}
$r.Close()
$r.Dispose()
}
$w.close()
$w.Dispose()
You have to read and write to the .csv files one row at a time, using StreamReader and StreamWriter:
$filepath = "C:\Change\2019\October"
$outputfile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8
$files = Get-ChildItem -Path $filePath -Filter *.csv
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
$headerWritten = $false
foreach ($file in $files)
{
    $r = New-Object System.IO.StreamReader($file.fullname, $encoding)
    $header = $r.ReadLine()               # first line of each file is the header
    if (!$headerWritten)
    {
        $w.WriteLine($header)             # write the header only once
        $headerWritten = $true
    }
    while (($line = $r.ReadLine()) -ne $null)
    {
        # keep only rows whose machine_type column is 'workstations'
        # (simple comma split; assumes unquoted header names)
        $row = $line | ConvertFrom-Csv -Header ($header -split ',')
        if ($row.machine_type -eq 'workstations') { $w.WriteLine($line) }
    }
    $r.Close()
    $r.Dispose()
}
$w.Close()
$w.Dispose()
get-content *.csv | add-content combined.csv
Make sure combined.csv doesn't exist when you run this, or it's going to go full Ouroboros.
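A minimal guard, assuming combined.csv sits in the same folder as the input files (so the wildcard would otherwise pick it up):
# delete any previous combined file first so it isn't fed back in as input
Remove-Item .\combined.csv -ErrorAction SilentlyContinue
Get-Content *.csv | Add-Content combined.csv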
Ok, I have a script I am writing in powershell that will delete old files in the recycle bin. I want it to delete all files from the recycle bin that were deleted more than 2 days ago. I have done lots of research on this and have not found a suitable answer.
This is what I have so far (found the script online, I don't know much PowerShell):
$Path = 'C' + ':\$Recycle.Bin'
Get-ChildItem $Path -Force -Recurse -ErrorAction SilentlyContinue |
#Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-3) } |
Remove-Item -Recurse -exclude *.ini -ErrorAction SilentlyContinue
It is working great with one exception: it checks the file's "LastWriteTime" property. That is fine if the user deletes the file the same day they modified it; otherwise it fails.
How can I modify this code so that it checks when the file was deleted, not when it was written?
On a side note, if I run this script from an administrator account on Microsoft Server 2008, will it work for all users' recycle bins or just mine?
Answer:
the code that worked for me is:
$Shell = New-Object -ComObject Shell.Application
$Global:Recycler = $Shell.NameSpace(0xa)
foreach($item in $Recycler.Items())
{
$DeletedDate = $Recycler.GetDetailsOf($item,2) -replace "\u200f|\u200e",""
$dtDeletedDate = get-date $DeletedDate
If($dtDeletedDate -lt (Get-Date).AddDays(-3))
{
Remove-Item -Path $item.Path -Confirm:$false -Force -Recurse
}#EndIF
}#EndForeach item
It works great for me; however, two questions remain: how do I do this with multiple drives, and will this apply to all users or just me?
WMF 5 includes the new "Clear-RecycleBin" cmdlet.
PS > Clear-RecycleBin -DriveLetter C
These two lines will empty all the files recycle bin:
$Recycler = (New-Object -ComObject Shell.Application).NameSpace(0xa)
$Recycler.items() | foreach { rm $_.path -force -recurse }
This article has answers to all your questions
http://baldwin-ps.blogspot.be/2013/07/empty-recycle-bin-with-retention-time.html
Code for posterity:
# -----------------------------------------------------------------------
#
# Author : Baldwin D.
# Description : Empty Recycle Bin with Retention (Logoff Script)
#
# -----------------------------------------------------------------------
$Global:Collection = @()
$Shell = New-Object -ComObject Shell.Application
$Global:Recycler = $Shell.NameSpace(0xa)
$csvfile = "\\YourNetworkShare\RecycleBin.txt"
$LogFailed = "\\YourNetworkShare\RecycleBinFailed.txt"
function Get-recyclebin
{
[CmdletBinding()]
Param
(
$RetentionTime = "7",
[Switch]$DeleteItems
)
$User = $env:USERNAME
$Computer = $env:COMPUTERNAME
$DateRun = Get-Date
foreach($item in $Recycler.Items())
{
$DeletedDate = $Recycler.GetDetailsOf($item,2) -replace "\u200f|\u200e","" #Invisible Unicode Characters
$DeletedDate_datetime = get-date $DeletedDate
[Int]$DeletedDays = (New-TimeSpan -Start $DeletedDate_datetime -End $(Get-Date)).Days
If($DeletedDays -ge $RetentionTime)
{
$Size = $Recycler.GetDetailsOf($item,3)
$SizeArray = $Size -split " "
$Decimal = $SizeArray[0] -replace ",","."
If ($SizeArray[1] -contains "bytes") { $Size = [int]$Decimal /1024 }
If ($SizeArray[1] -contains "KB") { $Size = [int]$Decimal }
If ($SizeArray[1] -contains "MB") { $Size = [int]$Decimal * 1024 }
If ($SizeArray[1] -contains "GB") { $Size = [int]$Decimal *1024 *1024 }
$Object = New-Object Psobject -Property @{
Computer = $computer
User = $User
DateRun = $DateRun
Name = $item.Name
Type = $item.Type
SizeKb = $Size
Path = $item.path
"Deleted Date" = $DeletedDate_datetime
"Deleted Days" = $DeletedDays }
$Object
If ($DeleteItems)
{
Remove-Item -Path $item.Path -Confirm:$false -Force -Recurse
if ($?)
{
$Global:Collection += @($object)
}
else
{
Add-Content -Path $LogFailed -Value $error[0]
}
}#EndIf $DeleteItems
}#EndIf($DeletedDays -ge $RetentionTime)
}#EndForeach item
}#EndFunction
Get-recyclebin -RetentionTime 7 #-DeleteItems #Remove the comment if you wish to actually delete the content
if (@($collection).count -gt "0")
{
$Collection = $Collection | Select-Object "Computer","User","DateRun","Name","Type","Path","SizeKb","Deleted Days","Deleted Date"
$CsvData = $Collection | ConvertTo-Csv -NoTypeInformation
$Null, $Data = $CsvData
Add-Content -Path $csvfile -Value $Data
}
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($shell)
#ScriptEnd
I had to do a bit of research on this myself. On Windows 10, the recycle bin contains two files for every file deleted on every drive (on Windows 7 the files are stored as-is, so this script is more than is needed there and would have to be cut down, especially for PowerShell 2.0; Windows 8 is untested): an info file created at the time of deletion, $I (perfect for ascertaining the date of deletion), and the original file, $R. I found the COM-object method ignored more files than I liked, but on the upside it had information I was interested in about the original deleted file. After a bit of exploring I found that a simple Get-Content of the info files includes the original file location, and after cleaning that up with a bit of regex I came up with this:
# Refresh Desktop Ability
$definition = @'
[System.Runtime.InteropServices.DllImport("Shell32.dll")]
private static extern int SHChangeNotify(int eventId, int flags, IntPtr item1, IntPtr item2);
public static void Refresh() {
SHChangeNotify(0x8000000, 0x1000, IntPtr.Zero, IntPtr.Zero);
}
'@
Add-Type -MemberDefinition $definition -Namespace WinAPI -Name Explorer
# Set how many days' worth of recently deleted files to keep, and get physical drive letters
$ignoreDeletedWithinDays = 2
$drives = (gwmi -Class Win32_LogicalDisk | ? {$_.drivetype -eq 3}).deviceid
# Process discovered drives
$drives | % {$drive = $_
gci -Path ($drive+'\$Recycle.Bin\*\$I*') -Recurse -Force | ? {($_.LastWriteTime -lt [datetime]::Now.AddDays(-$ignoreDeletedWithinDays)) -and ($_.name -like "`$*.*")} | % {
# Just a few calcs
$infoFile = $_
$originalFile = gi ($drive+"\`$Recycle.Bin\*\`$R$($infoFile.Name.Substring(2))") -Force
$originalLocation = [regex]::match([string](gc $infoFile.FullName -Force -Encoding Unicode),($drive+'[^<>:"/|?*]+\.[\w\-_\+]+')).Value
$deletedDate = $infoFile.LastWriteTime
$sid = $infoFile.FullName.split('\') | ? {$_ -like "S-1-5*"}
$user = try{(gpv "HKLM:\Software\Microsoft\Windows NT\CurrentVersion\ProfileList\$($sid)" -Name ProfileImagePath).replace("$(gpv 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\ProfileList' -Name ProfilesDirectory)\",'')}catch{$Sid}
# Various info
$originalLocation
$deletedDate
$user
$sid
$infoFile.Fullname
((gi $infoFile -force).length / 1mb).ToString('0.00MB')
$originalFile.fullname
((gi $originalFile -force).length / 1mb).ToString('0.00MB')
""
# Blow it all Away
#ri $InfoFile -Recurse -Force -Confirm:$false -WhatIf
#ri $OriginalFile -Recurse -Force -Confirm:$false -WhatIf
# remove the comment from the two lines above and the '-WhatIf' switch to actually delete the files
}
}
# Refresh desktop icons
[WinAPI.Explorer]::Refresh()
This also works well as a script run from the Task Scheduler.
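For example, a hedged sketch of registering it as a daily task (the script path C:\Scripts\Empty-RecycleBin.ps1 and the 3 AM trigger are assumptions):
# register a daily 3 AM task that runs the cleanup script (assumed path)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Empty-RecycleBin.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Empty Recycle Bin' -Action $action -Trigger $trigger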
Clear-RecycleBin -Force