PowerShell: releasing the file lock created by the last criteria test

I need to select images based on size, width, and date modified, then move and rename them. Here is my script so far:
Add-Type -Assembly System.Drawing
Get-ChildItem -Path C:\temp\images |
Where-Object { $_.Length -ge 250Kb -and $_.LastWriteTime -gt (Get-Date).AddDays(-5) -and [Drawing.Image]::FromFile($_.FullName).Width -eq 1920 } |
Move-Item -PassThru | Rename-Item -NewName {-join @($_.Name,'.jpg')}
The problem is that the FromFile method locks the file, preventing the move, with this error message:
move-item : The process cannot access the file because it is being
used by another process.

You need to find the process that has the file open, for instance:
$lockedFile = "path/to/locked/file"
$lockingProcessID = Get-Process | ForEach-Object {
    $process = $_
    $process.Modules | ForEach-Object {
        if ($_.FileName -eq $lockedFile) { $process.Id }
    }
}
and then kill the process:
Stop-Process -ID $lockingProcessID -Force
then try again to move your file.
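In this particular script, though, the lock is held by the PowerShell session itself: [Drawing.Image]::FromFile keeps the file open until the Image object is disposed. A minimal sketch that tests the width and releases the handle before the move (the -Destination value is hypothetical, since the original command didn't specify one):
Add-Type -AssemblyName System.Drawing

Get-ChildItem -Path C:\temp\images |
    Where-Object {
        $_.Length -ge 250KB -and
        $_.LastWriteTime -gt (Get-Date).AddDays(-5) -and
        $(
            # Open the image, test the width, then dispose so the handle is released.
            $img = [System.Drawing.Image]::FromFile($_.FullName)
            try { $img.Width -eq 1920 } finally { $img.Dispose() }
        )
    } |
    Move-Item -Destination C:\temp\sorted -PassThru |
    Rename-Item -NewName { $_.Name + '.jpg' }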


I'm writing my first PowerShell script to remove old TMP/LOG files on Exchange

I'm writing a custom script to keep our Exchange servers clean. It consists of several parts.
The last part is to clean TEMP folders, and it's working with no problems.
The first part is where my problem is. I want to select all .BAK, .TMP, and .XML files and delete them if they are over 3 days old, and select and delete all .log files if they are over 30 days old. But no files are being selected.
$Path ="$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\EmailSecurity\DebugLogs\", "$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\AntiSpam\DebugLogs\", "$env:SystemDrive\inetpub\logs", "$env:windir\System32\LogFiles"
# How long do you want to keep files by default?
$Daysback = "3"
# How long do you want to keep .log files? (Recommended 30 days at least)
$DaysbackLog = "30"
$DatetoDelete = (Get-Date).AddDays(-$Daysback)
$DatetoDeleteLog = (Get-Date).AddDays(-$DaysbackLog)
Get-ChildItem $Path -Recurse -Hidden | Where-Object {($_.extension -like ".log" -and $_.LastWriteTime -lt $DatetoDeleteLog)} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Get-ChildItem $Path -Recurse -Hidden | Where-Object {($_.extension -like ".bak", "tmp", "xml" -and $_.LastWriteTime -lt $DatetoDelete)} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
# The following lines clear the temp folder and empty folders in the temp folder.
Get-ChildItem "$env:windir\Temp", "$env:TEMP" -recurse | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Get-ChildItem "$env:windir\Temp", "$env:TEMP" -recurse | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Where {$_.PSIsContainer -and #(Get-ChildItem -LiteralPath:$_.fullname).Count -eq 0} | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
There are a few ways to do this, but much of it comes down to personal preference and/or performance, and the latter is not likely to be a big design factor here.
$Path = @(
"$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\EmailSecurity\DebugLogs\"
"$env:SystemDrive\Program Files (x86)\GFI\MailEssentials\AntiSpam\DebugLogs\"
"$env:SystemDrive\inetpub\logs"
"$env:windir\System32\LogFiles"
)
# Extensions
$Extensions = "*.bak", "*.tmp", "*.xml"
# Temp folders to clean up
$Temps = "$env:windir\Temp", "$env:TEMP"
# How long do you want to keep files by default?
$Daysback = "3"
# How long do you want to keep .log files? (Recommended 30 days at least)
$DaysbackLog = "30"
$DatetoDelete = (Get-Date).AddDays(-$Daysback)
$DatetoDeleteLog = (Get-Date).AddDays(-$DaysbackLog)
Get-ChildItem $Path -Filter "*.log" -Recurse -Hidden |
Where-Object { $_.LastWriteTime -le $DatetoDeleteLog } |
Remove-Item -Force -ErrorAction SilentlyContinue -WhatIf
# > Move filtering left, which works because you are only looking for a single
# extension.
# > Change to -le to accommodate edge case where $_.LastWriteTime is right on
# the boundary.
$Extensions |
    ForEach-Object {
        Get-ChildItem $Path -Filter $_ -Recurse -Hidden
    } |
Where-Object { $_.LastWriteTime -le $DatetoDelete } |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
# Set up extensions as an array of wild card filters.
# -Filter is much faster than -Include which may be another alternative approach
Get-ChildItem $Temps -File -Recurse |
Where-Object { $_.LastWriteTime -le $DatetoDelete } |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Get-ChildItem $Temps -Directory -Recurse |
Where-Object { !$_.GetFileSystemInfos() } |
Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
I haven't tested any of the refactor. However, the approach is simply to rerun the Get-ChildItem cmdlet for each needed scenario. In my experience that's faster than trying to use the -Include parameter to grab all the extensions in one shot, while still being faster and easier to read than adding a Where-Object clause to filter on extension.
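For reference, an untested sketch of the single-pass -Include alternative mentioned above (reusing the $Path, $Extensions, and $DatetoDelete variables from this answer):
Get-ChildItem $Path -Include $Extensions -Recurse -Hidden |
    Where-Object { $_.LastWriteTime -le $DatetoDelete } |
    Remove-Item -Force -ErrorAction SilentlyContinue -WhatIf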
In the part that clears the temp folders, I use the .NET method .GetFileSystemInfos() on the [System.IO.DirectoryInfo] objects returned by Get-ChildItem. The method returns an array of all child objects, so if it's empty we know the folder is empty. That sounds complicated, but as you can see it significantly shrinks the code and will likely perform better. I use the -File and -Directory parameters respectively to make sure I've got the right object types.
This is a little more advanced, but another way I played with to clean up the temp folders is to use a ForEach-Object loop with 2 process blocks.
$Temps |
    ForEach-Object -Process {
        # 1st process block: get empty directories.
        Get-ChildItem $_ -Directory -Recurse |
            Where-Object { !$_.GetFileSystemInfos() }
    }, {
        # 2nd process block: get files older than the boundary date.
        Get-ChildItem $_ -File -Recurse |
            Where-Object { $_.LastWriteTime -le $DatetoDelete }
    } |
    Remove-Item -Recurse -Force -ErrorAction SilentlyContinue -WhatIf
Again untested, and I'm not sure how this will perform. Nevertheless, since I developed it, I thought I'd share.
Note: the -Process argument is necessary so that ForEach-Object assigns both blocks to Process.
Check out ForEach-Object with Multiple Script Blocks for more information.

Delete all files and folders working very slow

I need to delete all the archived files and folders older than 15 days.
I have implemented a solution using a PowerShell script, but it is taking more than a day to delete all the files. The total size of the folder is less than 100 GB.
$StartFolder = "\\Guru\Archive\"
$deletefilesolderthan = "15"
#Get Foldernames for ForEach Loop
$SubFolders = Get-ChildItem -Path $StartFolder |
Where-Object {$_.PSIsContainer -eq "True"} |
Select-Object Name
#Loop through folders
foreach ($Subfolder in $SubFolders) {
    Write-Host "Processing Folder:" $Subfolder
    # For each folder, recurse and delete files older than the specified number of days
    # while the folder structure is left intact.
    Get-ChildItem -Path $StartFolder$($Subfolder.name) -Include *.* -File -Recurse |
        Where LastWriteTime -lt (Get-Date).AddDays(-$deletefilesolderthan) |
        foreach {$_.Delete()}
    # $dirs will be an array of empty directories returned after filtering; loop until
    # $dirs is empty, excluding the "Inbound" and "Outbound" folders.
    do {
        $dirs = gci $StartFolder$($Subfolder.name) -Exclude Inbound,Outbound -Directory -Recurse |
            Where {(gci $_.FullName).Count -eq 0} |
            select -ExpandProperty FullName
        $dirs | ForEach-Object {Remove-Item $_}
    } while ($dirs.Count -gt 0)
}
Write-Host "Completed" -ForegroundColor Green
#Read-Host -Prompt "Press Enter to exit"
Please suggest some way to optimise the performance.
If you have many small files, the long delete time is not abnormal, because each file's descriptor has to be processed. Some improvements can be made depending on your version; I'm going to assume you're on at least v4.
#requires -Version 4
param(
    [string]
    $start = '\\Guru\Archive',
    [int]
    $thresholdDays = 15
)
# Getting just the name wasn't useful; keep objects as objects.
foreach ($folder in Get-ChildItem -Path $start -Directory) {
    "Processing Folder: $folder"
    # Get all items once.
    $folders, $files = ($folder | Get-ChildItem -Recurse).
        Where({ $_.PSIsContainer }, 'Split')
    # Process files.
    $files.Where{
        $_.LastWriteTime -lt (Get-Date).AddDays(-$thresholdDays)
    } | Remove-Item -Force
    # Process folders.
    $folders.Where{
        $_.Name -notin 'Inbound', 'Outbound' -and
        ($_ | Get-ChildItem).Count -eq 0
    } | Remove-Item -Force
}
"Complete!"
The reason it takes so much time is that you are deleting files/folders over the network, which means additional network communication for every file and folder. You can easily verify that with a network analyzer. The best approach here is to use a method that runs the file operations on the remote machine itself; for example, you can try to use:
WinRM
psexec (first copy the code to the remote machine, then execute it using psexec)
remote WMI (using CIM_Datafile)
or even adding the needed task to Task Scheduler
I would prefer to use WinRM, but psexec is also a good choice (if you don't want to perform additional configuration of WinRM).
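A minimal WinRM sketch, assuming remoting is enabled on the file server and that \\Guru\Archive corresponds to a local path such as D:\Archive (both the computer name and the local path here are assumptions):
Invoke-Command -ComputerName Guru -ScriptBlock {
    param($Start, $ThresholdDays)
    $cutoff = (Get-Date).AddDays(-$ThresholdDays)
    # Running locally on the server avoids a network round-trip per file.
    Get-ChildItem -Path $Start -File -Recurse |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        Remove-Item -Force
} -ArgumentList 'D:\Archive', 15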

Compare-Object Delete File if file does not exist on source

I have this PowerShell code that compares 2 directories and removes files if the files no longer exist in the source directory.
For example say I have Folder 1 & Folder 2. I want to compare Folder 1 with Folder 2, If a file doesn't exist anymore in Folder 1 it will remove it from Folder 2.
This code works OK, but it also picks up differences in date/time. I only want it to report a difference when the file no longer exists in Folder 1.
Compare-Object $source $destination -Property Name -PassThru |
    Where-Object { $_.SideIndicator -eq "=>" } |
    ForEach-Object {
        if (-not $_.PSIsContainer) {
            UPDATE-LOG "File: $($_.FullName) has been removed from source"
            Remove-Item -Path $_.FullName -Force -ErrorAction SilentlyContinue
        }
    }
Is there an extra Where-Object { $file1 <> $file2 } or something like that?
I am not sure how you are getting the information for $source and $destination; I am assuming you are using Get-ChildItem.
What I would do to eliminate the issue with date/time is to not capture it in these variables at all. For example:
$source = Get-ChildItem C:\temp\Folder1 -Recurse | select -ExpandProperty FullName
$destination = Get-ChildItem C:\temp\Folder2 -Recurse | select -ExpandProperty FullName
By doing this you only get the FullName property for each child item, not the date/time.
You would need to change some of the script after doing this for it to still work.
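For example, one way to finish the adjustment is to compare by file name only (an untested sketch using the example folders above; the plain log line stands in for the asker's UPDATE-LOG function):
$sourceNames = Get-ChildItem C:\temp\Folder1 -Recurse -File | Select-Object -ExpandProperty Name
Get-ChildItem C:\temp\Folder2 -Recurse -File |
    Where-Object { $_.Name -notin $sourceNames } |
    ForEach-Object {
        "File: $($_.FullName) has been removed from source"
        Remove-Item -Path $_.FullName -Force
    }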
If I am not getting it wrong, the issue is that your code deletes files whose time-stamps differ from the source.
Did you try -ExcludeProperty?
$source = Get-ChildItem "E:\New folder" -Recurse | select -ExcludeProperty Date
The following script can serve your purpose
$Item1 = Get-ChildItem 'SourcePath'
$Item2 = Get-ChildItem 'DestinationPath'
$DifferenceItem = Compare-Object $Item1 $Item2
$ItemToBeDeleted = $DifferenceItem | Where-Object { $_.SideIndicator -eq "=>" }
foreach ($item in $ItemToBeDeleted) {
    $FullPath = $item.InputObject.FullName
    Remove-Item $FullPath -Force
}
Try something like this
In PowerShell V5:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1 = (gci $yourdir1 -file).Name
gci $yourdir2 -file | where Name -notin $filesnamedir1 | remove-item
In old PowerShell:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1=(gci $yourdir1 | where {$_.psiscontainer -eq $false}).Name
gci $yourdir2 | where {$_.psiscontainer -eq $false -and $_.Name -notin $filesnamedir1} | remove-item
If you want to compare files in multiple directories, use the -Recurse option on every gci command.

How to add print to console in PowerShell script?

I am new to PowerShell and I have created the following code to delete specific files and folders:
$myFolderPath = "C:\Test"
$myLimit = (Get-Date).AddDays(-14)
# Delete files according to filter.
Get-ChildItem -Path $myFolderPath -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit} | Remove-Item -Force
# Delete empty folders.
Get-ChildItem -Path $myFolderPath -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse
Is it possible to print out the full path of each item that will be removed to the console before the actual Remove-Item operation will be performed?
I guess something has to be added here...
... | Remove-Item -Force
and here...
... | Remove-Item -Force -Recurse
but I cannot find out how to implement that in an elegant way (without code duplication).
You can replace the Remove-Item parts with:
ForEach-Object { $_.FullName; Remove-Item -Path $_.FullName -Force }
(add -Recurse for the folder case).
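For concreteness, the replacement applied to the first pipeline from the question (the folder variant is the same with -Recurse added to Remove-Item):
Get-ChildItem -Path $myFolderPath -Recurse -Force |
    Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit } |
    ForEach-Object { $_.FullName; Remove-Item -Path $_.FullName -Force }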
LotPings' comment might be a better idea, if that is what you want.
It does not get a lot of attention, but Tee-Object could be a simple addition to the pipeline here. Redirect the output to a variable that you can print later.
...Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit} |
Tee-Object -Variable removed | Remove-Item -Force
$removed | Write-Host
All of the file objects piped will be sent to $removed and then on to Remove-Item. Since you have more than one delete pipeline, you could also log to a file with Tee-Object -FilePath and its -Append parameter if you want everything saved in one place.
However, this does not mean the files were deleted, just that they made it past that point of the pipe. If you really want to be sure, use another utility like robocopy, which has logging features.
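Another duplication-free option, if all you need is console output, is Remove-Item's built-in -Verbose switch, which prints each operation as it performs it:
Get-ChildItem -Path $myFolderPath -Recurse -Force |
    Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $myLimit } |
    Remove-Item -Force -Verbose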

PowerShell: Script to search all user profiles and copy the most recent to all user profiles

I am looking to write a powershell script to search all user profiles on a server for a specific file, compare the files by the lastmodifieddate, and then copy the newest file to all user profiles. The script will also create a backup of the last three versions of the file.
I previously wrote this script for our pilot environment where only two people were accessing the app (this is for a XenApp), but the user base has now expanded and I would like to create the prod version of the script to cover future growth.
Any help would be very much appreciated. Thanks! Script below...
$SRC1 = "\\Server\c$\Users\XXXX1\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$SRC2 = "\\Server\c$\Users\XXXX2\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$SRC3 = "\\Server\c$\Users\XXXX3\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$BKU = "\\storage\IT\EMSLM\Backup"
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC2).LastWriteTime ) {Copy-Item $SRC1 $SRC2}
else {Copy-Item $SRC2 $SRC1}
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC3).LastWriteTime ) {Copy-Item $SRC1 $SRC3}
else {Copy-Item $SRC3 $SRC1}
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC2).LastWriteTime ) {Copy-Item $SRC1 $SRC2}
Remove-Item $BKU\sitelist_old_2.xml
Rename-Item $BKU\sitelist_old_1.xml $BKU\sitelist_old_2.xml
Rename-Item $BKU\sitelist.xml $BKU\sitelist_old_1.xml
Copy-Item $SRC1 $BKU
& 'C:\Program Files (x86)\Enterprise Mode Site List Manager\EMIESiteListManager.exe'
Exit
This isn't everything, but it should be a good place to start:
$users = dir "\\Server\c$\Users" -Directory | select -ExpandProperty fullname
$newest = dir "\\Server\c$\Users\*\AppData\Roaming\EMIESiteListManager\sitelist.xml" | sort lastwritetime -Descending | select -First 1 -ExpandProperty fullname
$files = @()
$users | % {
    $files += $newest -replace [regex]::Escape($_)
}
$newestEnd = $files | sort {$_.length} | select -f 1
$users | % {
    $dest = Join-Path $_ $newestEnd
    copy $newest $dest -force
}
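The trick in the middle is -replace with no replacement argument: it deletes the matched user root, leaving the profile-relative path, which Join-Path then reattaches to every profile. A toy illustration (the paths are made up):
$newest = '\\Server\c$\Users\alice\AppData\Roaming\EMIESiteListManager\sitelist.xml'
$user   = '\\Server\c$\Users\alice'
$newest -replace [regex]::Escape($user)
# Output: \AppData\Roaming\EMIESiteListManager\sitelist.xml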
Working off of Anthony Stringer's response, I was able to build a script that meets my exact needs. Anthony's script would have worked but was missing a couple of things that I wanted:
1.) Identify all profiles with an existing sitelist.xml file and place them in an array or hash table.
2.) Copy only to those user profiles where the sitelist.xml file exists (my fault; I never requested this in my original question).
Thank you Anthony for the starting point. Updated script below:
$Users = dir "\\server\c$\Users" -Directory -Exclude Public, Default, Administrator* | select -ExpandProperty fullname
$FilePath = "AppData\Roaming\EMIESiteListManager\sitelist.xml"
$UserPath = Join-Path -path $Users $filePath
$NewestFile = dir "\\server\c$\Users\*\AppData\Roaming\EMIESiteListManager\sitelist.xml" | sort lastwritetime -Descending | select -First 1 -ExpandProperty fullname
$BackUp = "\\storage\ctxvol01\appdata\IT\EMSLM\Backup"
$BackUpFile = "\\storage\ctxvol01\appdata\IT\EMSLM\Backup\sitelist.xml"
$EMSLM_Users = @()
$UserPath | ForEach {
    If ((Test-Path -Path $_) -eq $true) {
        $EMSLM_Users += $_
    }
}
$EMSLM_Users | ForEach-Object {
    Copy-Item $NewestFile $_ -Force -ErrorAction SilentlyContinue
}
If ((Get-Item $NewestFile).LastWriteTime -gt (Get-Item $BackUpFile).LastWriteTime)
{
    Remove-Item $BackUp\sitelist_old_2.xml
    Rename-Item $BackUp\sitelist_old_1.xml $BackUp\sitelist_old_2.xml
    Rename-Item $BackUp\sitelist.xml $BackUp\sitelist_old_1.xml
    Copy-Item $NewestFile $BackUp
}
& 'C:\Program Files (x86)\Enterprise Mode Site List Manager\EMIESiteListManager.exe'
Exit