I am trying to read a CSV with a list of files, including the folder path, and then delete them if they are older than x days.
I can do this if I list folders in the CSV, but I cannot get it to work for just files.
CSV
FullName,,,,,,,,,,,,,
E:\$RECYCLE.BIN\S-1-5-21-352280589-691296097-1232828436-9414\$RCUCS3H.txt,,,,,,,,,,,,,
E:\$RECYCLE.BIN\S-1-5-21-352280589-691296097-1232828436-9414\$RWF5KKJ.txt,,,,,,,,,,,,,
E:\Account Lockout Files\Alockout.zip,,,,,,,,,,,,,
E:\Account Lockout Files\AlockoutXP.zip,,,,,,,,,,,,,
My PowerShell script contains:
$DatetoDelete = (Get-Date).AddDays(-3650)
Get-Content 'C:\delete\1.csv' | ForEach-Object { $_.Trim() } | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item -Force
When I use the script above it just deletes the files regardless of whether they are older. Please can anyone assist? Thanks.
You are almost there with your code, but you are missing the actual check on the last write time. If you use Get-Item to get the properties of the file, you will then have a LastWriteTime property to use in your Where-Object.
The below should do what you need with just one extra command and a pipe:
Get-Content 'C:\delete\1.csv' | ForEach-Object {
    Get-Item -Path $_.Trim() | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item -Force
}
Thanks for sharing an example of the input CSV file. We can now be sure it actually is a CSV with headers, and therefore Get-Content is the wrong cmdlet.
Try Import-Csv instead:
$DatetoDelete = (Get-Date).AddDays(-3650).Date  # set to midnight
Import-Csv -Path 'C:\delete\1.csv' | ForEach-Object {
    $file = Get-Item -Path $_.FullName -ErrorAction SilentlyContinue
    if ($file -and $file.LastWriteTime -lt $DatetoDelete) {
        $file | Remove-Item -Force
    }
}
For some files in the CSV you may not have permission to delete them, like perhaps the ones inside E:\$RECYCLE.BIN\S-1-5-21-352280589-691296097-1232828436-9414.
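If you want to see which deletions fail because of permissions, a minimal sketch (untested; the error log path is just an example) could wrap the removal in try/catch:
$DatetoDelete = (Get-Date).AddDays(-3650).Date
Import-Csv -Path 'C:\delete\1.csv' | ForEach-Object {
    $file = Get-Item -Path $_.FullName -ErrorAction SilentlyContinue
    if ($file -and $file.LastWriteTime -lt $DatetoDelete) {
        try {
            $file | Remove-Item -Force -ErrorAction Stop
        }
        catch {
            # record files that could not be removed (e.g. access denied)
            "Could not delete $($file.FullName): $_" | Out-File 'C:\delete\errors.log' -Append
        }
    }
}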
Hello awesome community :)
I have a list containing a bunch of SKUs. The filenames of the files that I need to copy to a new location all start with the corresponding SKU, like so
B6BC004-022_10_300_f.jpg
In this case "B6BC004" is the SKU, and my txt list contains "B6BC004" along with many other SKUs.
Somewhere in the code below I know I have to define that it should search for files beginning with the SKUs from the txt file, but I have no idea how to define it.
Get-Content .\photostocopy.txt | Foreach-Object { copy-item -Path $_ -Destination "Z:\Photosdestination\"}
Thanks in advance :)
If all files start with one of the SKUs, followed by a dash like in your example, this should work:
$sourceFolder = 'ENTER THE PATH WHERE THE FILES TO COPY ARE'
$destination = 'Z:\Photosdestination'

# get an array of all SKUs
$sku = Get-Content .\photostocopy.txt | Select-Object -Unique

# loop through the list of files in the source folder and copy all that have a name beginning with one of the SKUs
Get-ChildItem -Path $sourceFolder -File -Recurse |
    Where-Object { $sku -contains ($_.Name -split '\s*-')[0] } |
    ForEach-Object { $_ | Copy-Item -Destination $destination }
I haven't tested this so please proceed with caution!
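To illustrate the Where-Object clause: -split '\s*-' breaks the name at each dash (swallowing any whitespace before it), and [0] takes everything before the first dash, which is the SKU:
# the part before the first dash is the SKU
('B6BC004-022_10_300_f.jpg' -split '\s*-')[0]   # -> B6BC004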
What it does is loop through all the items in your photostocopy.txt file and search the $source location for files whose names start with the current item from your file. It then checks whether any were found before writing something to the console and possibly copying the file(s).
$source = '#PATH_TO_SOURCE'
$destination = '#PATH_TO_DESTINATION'
$photosToCopy = Get-Content -Path '#PATH_TO_TXT_FILE'

$photosToCopy | ForEach-Object {
    $sku = $_  # capture the current SKU; inside Where-Object, $_ refers to the file being tested
    $filesToCopy = Get-ChildItem -Path $source -File | Where-Object { $_.Name -like "$sku*" }
    if ($filesToCopy.Count -le 0) {
        Write-Host "No files could be found for: $sku"
    }
    else {
        $filesToCopy | ForEach-Object {
            Write-Host "Copying: " $_.Name
            Copy-Item -Path $_.FullName -Destination $destination
        }
    }
}
Let me know if this helps you :)
I have a script that takes a backup and removes files in generations (groups). I need to add some logging of which files it copies and also which ones it deletes. In all my previous scripts I have been using Out-File, but in this case, for the copy, I can't get it to work.
If I add it to the Copy-Item part it creates the file, but it simply won't write any input. What am I missing?
#$a = Get-Date
#$a.ToUniversalTime()

foreach ($file in (Get-ChildItem -File $localpath -Recurse | Where {$_.LastWriteTime -gt (Get-Date).AddDays(-1)})) {
    Copy-Item -Path $file.FullName -Destination "C:\qlikview Storage\privatedata\backup\$file.$(get-date -f yyyy-MM-dd)"
}

$Groups = Get-ChildItem -Path "C:\qlikview Storage\privatedata\backup" |
    Group-Object -Property Basename |
    Where-Object {$_.Count -gt 2}

foreach ($g in $Groups) {
    $g.Group |
        sort LastWriteTime -Descending |
        select -Skip 2 |
        foreach {del $_.FullName -Force}
}
The commented-out $a is for adding timestamps to the logging later, to see how long it takes.
Am I thinking wrong in assuming Out-File is the way to go?
Add the -Verbose switch to your Copy-Item and Remove-Item commands. This will send the copied/removed file names to the verbose stream.
Afterwards you can redirect the verbose stream to the output stream (4>&1) and log it to a file.
Example:
Copy-Item... -Verbose 4>&1 | Out-File log.txt
Additional info can be found in about_Redirection.
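Applied to the copy loop from the question, it might look like this (a sketch; the log file path is just an example):
$log = 'C:\qlikview Storage\privatedata\backup\copy.log'
foreach ($file in (Get-ChildItem -File $localpath -Recurse | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-1)})) {
    # -Verbose reports each copied file; 4>&1 moves that report to the output stream so Out-File can capture it
    Copy-Item -Path $file.FullName -Destination "C:\qlikview Storage\privatedata\backup\$file.$(get-date -f yyyy-MM-dd)" -Verbose 4>&1 |
        Out-File $log -Append
}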
I am using the code below to filter out files depending on the headers in the file.
It works like a charm, but I have a problem in that it takes all the files in the $InputDirectory.
I would like to limit it so it only takes files that are 1-2 hours old.
There are two ways I can get the date for this process:
The filename contains a timestamp = XXXXXXXXXXX_XXXXXXXX_valuereport_YYYYMMDDhhmmss.csv
The timestamp the file was created (please note we are talking about 800K-1M files in the directory, and more are added every hour, so the fastest way would be appreciated).
So how do I insert something in my code, so that besides the header check, it only takes files that are 1-2 hours old?
Sorry about the code example; I am new to this site and not sure how to get it in the right order.
foreach ($FilePath in (Get-ChildItem $InputDirectory -File) | Select-Object -ExpandProperty FullName) {
    $Header = Get-Content $FilePath -First 1
    # test for a string in the header line that distinguishes it from the other files
    if ($Header -match ';energy,Wh,') {
        # the substring ';energy,Wh,' defines this file as a 'HeatMeter' file
        Copy-Item -Path $FilePath -Destination $OutputPathHeat
    } elseif ($Header -match ';fabrication-no,,inst-value,0,0,0;datetime,,inst-value,0,0,0;volume,m3') {
        # the substring ';datetime,,inst-value,0,0,0;volume,m3' defines this file as a 'WaterMeter' file
        Copy-Item -Path $FilePath -Destination $OutputPathWater
    } else {
        # if all key substrings above did not match, move to the 'Other' directory
        Copy-Item -Path $FilePath -Destination $OutputPathOther
    }
}
There are several ways to filter a directory listing. The easiest is to pipe the result of Get-ChildItem through Where-Object, like:
Get-ChildItem -Path $InputDirectory -File |
    Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-2) } |
    Select-Object -ExpandProperty FullName |
    ForEach-Object {
        $FilePath = $_
        $Header = Get-Content $FilePath -First 1
        # test for a string in the header line that distinguishes it from the other files
        if ($Header -match ';energy,Wh,') {
            # the substring ';energy,Wh,' defines this file as a 'HeatMeter' file
            Copy-Item -Path $FilePath -Destination $OutputPathHeat
        }
        elseif ($Header -match ';fabrication-no,,inst-value,0,0,0;datetime,,inst-value,0,0,0;volume,m3') {
            # the substring ';datetime,,inst-value,0,0,0;volume,m3' defines this file as a 'WaterMeter' file
            Copy-Item -Path $FilePath -Destination $OutputPathWater
        }
        else {
            # if all key substrings above did not match, move to the 'Other' directory
            Copy-Item -Path $FilePath -Destination $OutputPathOther
        }
    }
It checks that the CreationTime is greater than now minus two hours. Note that the last modified (LastWriteTime) timestamp may also be suitable for your use case.
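If you would rather use the timestamp embedded in the filename (your first option), you could parse it instead of reading filesystem metadata; a sketch, assuming the names always end in _YYYYMMDDhhmmss.csv with a 24-hour timestamp:
$cutoff = (Get-Date).AddHours(-2)
Get-ChildItem -Path $InputDirectory -File -Filter '*.csv' |
    Where-Object {
        # take the 14 digits before the extension and parse them as a timestamp
        $_.BaseName -match '_(\d{14})$' -and
        [datetime]::ParseExact($Matches[1], 'yyyyMMddHHmmss', $null) -gt $cutoff
    } |
    Select-Object -ExpandProperty FullName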
I am wondering if there is a better way to make a PowerShell script for these instructions:
Search in 3 paths. Ex.
$LOGDIRS="C:\NETiKA\GED\Production\RI\log";"C:\NETiKA\GED\Test\RI\log";"C:\NETiKA\Tomcat-8.0.28\logs"
Find all files that are older than 7 days and write them to a file that I will call file.list. Ex. > C:\Test\file.list
Once they are in my file.list, I need to take all the file names and delete them.
Apparently when you have more than thousands of files, this is the fastest way to delete.
$LOGDIRS=C:/NETiKA/GED/Production/RI/log;C:/NETiKA/GED/Test/RI/log;C:/NETiKA/Tomcat-8.0.28/logs
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
Select-Object FullName > files.list |
Foreach-Object {
if ($_.LastAccessTime -le (get-date).adddays($KEEP)) {
remove-item -recurse -force $_
}
};
Something like this should help you get started.
$path1 = "E:\Code\powershell\myPS\2018\Jun"
$path2 = "E:\Code\powershell\myPS\2018\Jun\compareTextFiles"
$path3 = "E:\Code\powershell\myPS\2018\May"
$allFiles = dir $path1, $path2, $path3 -File
$fileList = New-Item -type file file.list -Force
$keep = -7
$allFiles | foreach {
if ($_.LastAccessTime -le (Get-Date).AddDays($keep)) {
"$($_.FullName) is older than 7 days"
$_.FullName.ToString() | Out-File $fileList -Append
}
else {
"$($_.FullName) is new"
}
}
You can add the deletion inside the IF block if you wish, or check the file and do it later on. Your code has many issues which are very basic to PowerShell, e.g. once you use Select-Object, the next pipeline segment only receives the properties you selected; you tried using LastAccessTime in a later pipe when you had only selected the FullName property.
Also, redirecting to a file and then continuing the pipeline looks very messy.
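To see that property stripping in action (a quick illustration; any folder with files works):
# after Select-Object FullName, the remaining object only carries that one property
Get-ChildItem $path1 -File | Select-Object FullName -First 1 | Get-Member -MemberType Properties
# lists FullName only; LastAccessTime is no longer available to a later pipe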
Remove-Item accepts piped input, and a Where-Object will filter on the age.
To first check what would be deleted, I appended -WhatIf to the Remove-Item:
$LOGDIRS="C:\NETiKA\GED\Production\RI\log","C:\NETiKA\GED\Test\RI\log","C:\NETiKA\Tomcat-8.0.28\logs"
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
Where-Object LastAccessTime -le ((get-date).AddDays($KEEP))
Remove-Item -Recurse -Force $_ -Whatif
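If you also want the deleted names written to C:\Test\file.list first, as the question describes, one way is to tee the filtered files into a variable before removing them (a sketch, untested):
Get-ChildItem -Path $LOGDIRS -Recurse -File -Force -ErrorAction SilentlyContinue |
    Where-Object LastAccessTime -le ((Get-Date).AddDays($KEEP)) |
    Tee-Object -Variable oldFiles |
    Remove-Item -Force -WhatIf

# write the names of the (to be) deleted files to file.list
$oldFiles | Select-Object -ExpandProperty FullName | Set-Content 'C:\Test\file.list'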
I am trying to delete files older than x days and would like to know which files are being deleted.
I am using the PowerShell script below; it doesn't work:
$limit = (Get-Date).AddDays(-365)
$path = $args[0]
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force | select Name,LastWriteTime | Export-CSV -NoTypeInformation -Path $args[1]
I am passing the path where the files are as the first argument.
The second argument is the output file, which should contain the names and last modified dates of the files that get deleted.
The above code works fine for the deletion, but doesn't record the file names and last modified values of what got deleted.
If I use the code below, it only records the file names and last modified values, but the files don't get deleted:
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | select Name,LastWriteTime | Export-CSV -NoTypeInformation -Path $args[1] | Remove-Item -Force
I am using the below command to run it:
./OlderFiles_Cleansing.ps1 'C:\Dev\PS' 'C:\dev\CleanedFiles_01062016.csv'
What am I missing?
Neither the Export-Csv nor the Remove-Item cmdlet returns the objects you pipe in, which makes it impossible to work with the items further along the pipeline.
You can do the following though: split the command:
$filesToDelete = Get-ChildItem -Path $path -Recurse -Force -Attributes !Directory | Where-Object CreationTime -lt $limit
$filesToDelete | select Name,LastWriteTime | Export-CSV -NoTypeInformation -Path $args[1]
$filesToDelete | Remove-Item -Force
Note that I have improved the way of detecting that an item is a file by using the -Attributes parameter, which simplifies the Where-Object part.
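Put together as the script from the question, it might look like this (a sketch; $limit and the argument handling follow the original code):
# OlderFiles_Cleansing.ps1 (sketch)
$limit = (Get-Date).AddDays(-365)
$path = $args[0]

# collect the old files once, so the same set can be logged and then deleted
$filesToDelete = Get-ChildItem -Path $path -Recurse -Force -Attributes !Directory |
    Where-Object CreationTime -lt $limit

# log name and last modified time, then delete
$filesToDelete | Select-Object Name, LastWriteTime | Export-Csv -NoTypeInformation -Path $args[1]
$filesToDelete | Remove-Item -Force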