remove-item -force not working on download - powershell

So I had written this script that should clear out the files in the Downloads folder, but it doesn't work:
$DaysToDelete = 1
# Downloads
Get-ChildItem "C:\users\*\Downloads\*" -Recurse -Force -ErrorAction SilentlyContinue |
Where-Object { ($_.CreationTime -lt $(Get-Date).AddDays(-$DaysToDelete))} |
remove-item -force -Verbose -recurse -ErrorAction SilentlyContinue

This code is valid for what it is supposed to do. If there are any errors with file access or anything else, you can inspect them via the $Error automatic variable.
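For example, a minimal way to review what -ErrorAction SilentlyContinue suppressed, assuming you run this in the same session (the count of 5 is just an example):
$DaysToDelete = 1
# Start with a clean slate so only this run's failures are captured
$Error.Clear()
Get-ChildItem "C:\users\*\Downloads\*" -Recurse -Force -ErrorAction SilentlyContinue |
    Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-$DaysToDelete) } |
    Remove-Item -Force -Verbose -Recurse -ErrorAction SilentlyContinue
# The most recent error is $Error[0]; show the first few messages
$Error | Select-Object -First 5 | ForEach-Object { $_.Exception.Message }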

Related

I'm writing my first powershell script to remove old TMP/LOG files on exchange

I'm writing a custom script to keep our Exchange servers clean. It consists of several parts.
The last part is to clean TEMP folders, and it's working with no problems.
The first part is where my problem is. I want to select all .BAK .TMP and .XML files and delete them if they are over 3 days old, and select and delete all .log files if they are over 30 days old. But no files are being selected.
$Path ="$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\EmailSecurity\DebugLogs\", "$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\AntiSpam\DebugLogs\", "$env:SystemDrive\inetpub\logs", "$env:windir\System32\LogFiles"
# How long do you want to keep files by default?
$Daysback = "3"
# How long do you want to keep .log files? (Recommended 30 days at least)
$DaysbackLog = "30"
$DatetoDelete = (Get-Date).AddDays(-$Daysback)
$DatetoDeleteLog = (Get-Date).AddDays(-$DaysbackLog)
Get-ChildItem $Path -Recurse -Hidden | Where-Object {($_.extension -like ".log" -and $_.LastWriteTime -lt $DatetoDeleteLog)} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Get-ChildItem $Path -Recurse -Hidden | Where-Object {($_.extension -like ".bak", "tmp", "xml" -and $_.LastWriteTime -lt $DatetoDelete)} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
# The following lines clears temp folder and empty folders in the temp folder.
Get-ChildItem "$env:windir\Temp", "$env:TEMP" -recurse | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Get-ChildItem "$env:windir\Temp", "$env:TEMP" -recurse | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Where {$_.PSIsContainer -and #(Get-ChildItem -LiteralPath:$_.fullname).Count -eq 0} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
There are a few ways to do this, but much of it comes down to personal preference and/or performance, and the latter is not likely to be a big design factor here.
$Path = @(
"$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\EmailSecurity\DebugLogs\"
"$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\AntiSpam\DebugLogs\"
"$env:SystemDrive\inetpub\logs"
"$env:windir\System32\LogFiles"
)
# Extensions
$Extensions = "*.bak", "*.tmp", "*.xml"
# Temp folders to clean up
$Temps = "$env:windir\Temp", "$env:TEMP"
# How long do you want to keep files by default?
$Daysback = "3"
# How long do you want to keep .log files? (Recommended 30 days at least)
$DaysbackLog = "30"
$DatetoDelete = (Get-Date).AddDays(-$Daysback)
$DatetoDeleteLog = (Get-Date).AddDays(-$DaysbackLog)
Get-ChildItem $Path -Filter "*.log" -Recurse -Hidden |
Where-Object { $_.LastWriteTime -le $DatetoDeleteLog } |
Remove-Item -Force -ErrorAction SilentlyContinue -WhatIf
# > Move filtering left, which works because you are only looking for a single
# extension.
# > Change to -le to accommodate edge case where $_.LastWriteTime is right on
# the boundary.
$Extensions |
ForEach-Object{
Get-ChildItem $Path -Filter $_ -Recurse -Hidden
} |
Where-Object { $_.LastWriteTime -le $DatetoDelete } |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
# Set up extensions as an array of wild card filters.
# -Filter is much faster than -Include which may be another alternative approach
Get-ChildItem $Temps -File -Recurse |
Where-Object { $_.LastWriteTime -le $DatetoDelete } |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Get-ChildItem $Temps -Directory -Recurse |
Where-Object { !$_.GetFileSystemInfos() } |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
I haven't tested any of the refactor. However, the approach is to simply rerun the Get-ChildItem cmdlet for each needed scenario. In my experience that's faster than trying to use the -Include parameter to grab all the extensions in one shot, while still being faster and easier to read than adding a Where{} clause to filter on extension.
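For reference, untested sketches of the single-pass alternatives that paragraph compares against, using the same variables as above:
# -Include variant: one pass, but generally slower than -Filter
Get-ChildItem $Path -Include $Extensions -Recurse -Hidden |
    Where-Object { $_.LastWriteTime -le $DatetoDelete } |
    Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
# Where-Object variant: filter on extension client-side
Get-ChildItem $Path -Recurse -Hidden |
    Where-Object { $_.Extension -in ".bak", ".tmp", ".xml" -and $_.LastWriteTime -le $DatetoDelete } |
    Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf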
In the part for clearing the temp folders, I use the .NET method .GetFileSystemInfos() on the [System.IO.DirectoryInfo] objects returned from Get-ChildItem. The method returns an array of all child objects, so if it comes back empty we know the folder is empty. That sounds complicated, but as you can see it significantly shrinks the code and will likely perform better. I use the -File and -Directory parameters respectively to make sure I've got the right object types.
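As a quick illustration of that empty-folder test (the folder here is just an example):
$dir = Get-Item $env:TEMP                  # a [System.IO.DirectoryInfo] object
if (-not $dir.GetFileSystemInfos()) {      # an empty array evaluates to $false
    "Folder $($dir.FullName) is empty"
}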
This is a little more advanced, but another way I played with to clean up the temp folders is to use a ForEach-Object loop with 2 process blocks.
$Temps |
ForEach-Object -Process {
# 1st process block: get empty directories
Get-ChildItem $_ -Directory -Recurse |
Where-Object{ !$_.GetFileSystemInfos() }
}, {
# 2nd process block: get files older than the boundary date
Get-ChildItem $_ -File -Recurse |
Where-Object { $_.LastWriteTime -le $DatetoDelete }
} |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Again untested, and I'm not sure how this will perform. Nevertheless, since I developed it, I thought I'd share.
Note: the -Process argument is necessary so that ForEach-Object assigns both blocks to the process phase.
Check out ForEach-Object with Multiple Script Blocks for more information.

How to search inside three paths and copy the names to a file.list

I am wondering if there is a better way to write a PowerShell script for these instructions:
Search in 3 paths, e.g.
$LOGDIRS="C:\NETiKA\GED\Production\RI\log";"C:\NETiKA\GED\Test\RI\log";"C:\NETiKA\Tomcat-8.0.28\logs"
Find all files that are older than 7 days and write their names to a file that I will call file.list, e.g. C:\Test\file.list.
Once the names are in my file.list, I need to go through all of them and delete those files.
Apparently, when you have many thousands of files, this is the
fastest way to delete them.
$LOGDIRS=C:/NETiKA/GED/Production/RI/log;C:/NETiKA/GED/Test/RI/log;C:/NETiKA/Tomcat-8.0.28/logs
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
Select-Object FullName > files.list |
Foreach-Object {
if ($_.LastAccessTime -le (get-date).adddays($KEEP)) {
remove-item -recurse -force $_
}
};
Something like this should help you get started.
$path1 = "E:\Code\powershell\myPS\2018\Jun"
$path2 = "E:\Code\powershell\myPS\2018\Jun\compareTextFiles"
$path3 = "E:\Code\powershell\myPS\2018\May"
$allFiles = dir $path1, $path2, $path3 -File
$fileList = New-Item -type file file.list -Force
$keep = -7
$allFiles | foreach {
if ($_.LastAccessTime -le (Get-Date).AddDays($keep)) {
"$($_.FullName) is older than 7 days"
$_.FullName.ToString() | Out-File $fileList -Append
}
else {
"$($_.FullName) is new"
}
}
You can add the deletion inside the IF block if you wish, or check the file and do it later on. Your code has a few issues that are very basic to PowerShell, e.g.: once you use Select-Object, the next pipeline stage only receives the properties you selected. You tried to use LastAccessTime in a later pipe when you had only selected the FullName property.
Also, redirecting to a file and then continuing the pipeline looks very messy.
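A quick illustration of that Select-Object behavior (the folder is just an example):
Get-ChildItem $env:TEMP -File |
    Select-Object FullName |
    Get-Member -MemberType NoteProperty   # only FullName remains; LastAccessTime is gone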
Remove-Item accepts piped input, and a Where-Object will filter on the age.
To first check what would be deleted, I appended a -WhatIf to the Remove-Item.
$LOGDIRS="C:\NETiKA\GED\Production\RI\log","C:\NETiKA\GED\Test\RI\log","C:\NETiKA\Tomcat-8.0.28\logs"
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
Where-Object LastAccessTime -le ((Get-Date).AddDays($KEEP)) |
Remove-Item -Recurse -Force -WhatIf

Scan C disk and copy files

I would appreciate some help here.
The PowerShell script should close the Outlook process, which works,
as well as scan the C: disk for .pst files, which also works.
Copy these files to "\fileserver01\temp\test\"
Export to a csv/excel list where these files were located and their last write time.
Possibly hide error messages from the user when running the script, since it complains about not having full access to a few folders during the scan.
Code:
Get-Process outlook | Foreach-Object { $_.CloseMainWindow() }
Get-ChildItem -path c:\ -recurse -include *.pst | `
Copy-Item -destination "\\fileserver01\temp\test\" | `
Select-object fullname,lastwritetime|export-csv "\\fileserver01\temp\test\"
How should I fix the last couple of things on my list?
Thanks
First you have to use double backslash for UNC paths.
Second, Copy-Item does not output anything to the pipeline; you have to use the -PassThru parameter.
Get-ChildItem -path z:\ -recurse -include *.pst -PipelineVariable source |
Copy-Item -Destination "\\path\temp" -Verbose -PassThru |
Select-Object @{n="Source";e={$source.versioninfo.filename}},fullname,lastwritetime | export-csv "\\path\temp\copy_result.csv" -Append -Verbose
I believe the issue is that after the files are copied, the object is gone from the pipeline.
This works:
Get-ChildItem -Path C:\ -Include *.pst -ErrorAction SilentlyContinue | Select-Object FullName, LastWriteTime | Export-Csv -Path "\\fileserver01\temp\test\MyCSV.csv"
This doesn't directly answer the question you've asked, as @Adamar's answer appears to do just that.
However, your issue could also be resolved by querying the ost/pst files from the registry using a snippet like this:
(Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
which will return all of the ost/pst files the logged-in user has open in Outlook.
A snippet like this will then copy them all to a network share and print the log to a file:
$FilesToCopy = (Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
$FilesToCopy | ForEach { Copy-Item -Path $_ -Destination "\\network\share" ; $_ | Out-File "\\network\share\log.txt" -Append }
This saves a LOT of time over indexing through the entire C: drive. There is also an issue where very long paths (greater than 260 characters) are not enumerated properly by Get-ChildItem, which makes this method a bit more reliable and appealing for a production script.
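If you also need the last write time for a CSV, the registry only stores paths, so you would resolve them back to file objects first. A rough sketch, assuming the same registry query and an example share path:
$FilesToCopy = (Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
$FilesToCopy |
    Where-Object { Test-Path $_ } |       # skip stale registry entries
    Get-Item |
    Select-Object FullName, LastWriteTime |
    Export-Csv "\\network\share\pst_list.csv" -NoTypeInformation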
This is the final code.
Thanks everyone for your support.
#Kill Outlook process
Get-Process outlook -ErrorAction SilentlyContinue | Foreach-Object { $_.CloseMainWindow() }
#Scan for .pst files on the C disk
Get-ChildItem -path c:\ -recurse -include *.pst -ErrorAction SilentlyContinue |
#Copy located .pst files to the destination
Copy-Item -Destination "\\networkpath\home\$env:username\ComputerChange\" -Verbose -PassThru -ErrorAction SilentlyContinue |
#Log where files were located and when they were last written to.
Select-Object fullname,lastwritetime | export-csv \\networkpath\home\$env:username\ComputerChange\PSTlog.csv -Verbose
Write-Host "PST Files have successfully been copied, press any key to close" -ErrorAction SilentlyContinue
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
end
So I have created a much faster script, as I have excluded some system folders and Program Files folders where .PST files aren't saved.
I bet some of you experts can figure out why this code doesn't work?
#Exclude systemfolders etc
$folders=get-childitem c:\ | where-object{$_.mode -like "d-*" -AND $_.name -notlike "windows" -AND $_.name -notlike "drivers" -AND $_.name -notlike "program files*"}
#Search thru each root folder for PST-files
$allfiles=$folders | ForEach-Object{get-childitem -Path $_.fullname -include "*.pst" -recurse -ErrorAction silentlycontinue};
$env:username
$foldertocreate="\\destination\$env:username\"
#Check if folder with username exists in the \\destination folder otherwise create folder with username.
if((Test-Path -Path $foldertocreate -PathType Container)) {write-host "Folder already created"}
else {write-host "Creating Folder", New-Item -ItemType Directory -Force -Path $foldertocreate }
#Copy .PST files which is in $allfiles to the folder created in fileshare> $foldertocreate.
#Copy .PST files in $allfiles to the destination folder created.
robocopy $allfiles $foldertocreate
Write-Host "Press any key to close" -ErrorAction SilentlyContinue $x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
end

Getting error after using -ErrorAction SilentlyContinue in my command

I am still getting an error in my output after using -ErrorAction SilentlyContinue. Here is the command I am using:
get-childitem c:\ -include *.bak -recurse | foreach ($_) {remove-item $_.fullname } -ErrorAction SilentlyContinue -ErrorVariable a
You probably get the error from the Get-ChildItem cmdlet itself, so you should add the parameter there too (-ea 0 is the alias for -ErrorAction SilentlyContinue).
Also, the use of the Foreach-Object cmdlet in your code is unnecessary, since the Remove-Item cmdlet accepts pipeline input:
Get-ChildItem c:\ -include *.bak -recurse -ea 0 | Remove-Item -ea 0
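If you still want to capture the suppressed errors, as the -ErrorVariable a in your original command suggests, a sketch like this should work (the variable name is just an example; the + prefix appends rather than overwrites):
Get-ChildItem c:\ -include *.bak -recurse -ea 0 -ErrorVariable gciErrors |
    Remove-Item -ea 0 -ErrorVariable +gciErrors
# $gciErrors now holds the errors suppressed by both cmdlets
$gciErrors.Count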

How to add print to console in PowerShell script?

I am new to PowerShell and I have created the following code to delete specific files and folders:
$myFolderPath = "C:\Test"
$myLimit = (Get-Date).AddDays(-14)
# Delete files according to filter.
Get-ChildItem -Path $myFolderPath -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit} | Remove-Item -Force
# Delete empty folders.
Get-ChildItem -Path $myFolderPath -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse
Is it possible to print out the full path of each item that will be removed to the console before the actual Remove-Item operation will be performed?
I guess something has to be added here...
... | Remove-Item -Force
and here...
... | Remove-Item -Force -Recurse
but I cannot find out how to implement that in an elegant way (without code duplication).
You can replace the Remove-Item parts with
Foreach-Object { $_.FullName; Remove-Item -Path $_.FullName -Force }
(add -Recurse for the folder case).
LotPings' comment might be a better idea, if that is what you want.
It does not get a lot of attention, but Tee-Object could be a simple addition to the pipeline here. Redirect the output to a variable that you can print later.
...Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit} |
Tee-Object -Variable removed | Remove-Item -Force
$removed | Write-Host
All of the file objects piped will be sent to $removed and then on to Remove-Item. Since you have more than one delete pipeline, note that -Variable overwrites the variable on each use; if you want everything in one place, Tee-Object's -Append parameter works with -FilePath, so you could log all of the files to a single file instead, if you so desire.
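A sketch of that -FilePath variant against the first delete pipeline from the question (the log path is just an example):
# Files pipeline: log each object before deleting it
Get-ChildItem -Path $myFolderPath -Recurse -Force |
    Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit } |
    Tee-Object -FilePath "C:\Logs\deleted.log" -Append |
    Remove-Item -Force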
However, this does not mean they were deleted, just that they made it past that point in the pipe. If you really want to be sure, you should use another utility like robocopy, which has logging features.