Compare-Object: delete file if file does not exist on source - PowerShell

I have this PowerShell code that compares two directories and removes files if they no longer exist in the source directory.
For example, say I have Folder 1 and Folder 2. I want to compare Folder 1 with Folder 2; if a file doesn't exist anymore in Folder 1, it should be removed from Folder 2.
This code works OK, but I have a problem where it also picks up differences in the file date/time. I only want it to pick up a difference if the file doesn't exist anymore in Folder 1.
Compare-Object $source $destination -Property Name -PassThru | Where-Object {$_.SideIndicator -eq "=>"} | % {
    if (-not $_.PSIsContainer) {
        UPDATE-LOG "File: $($_.FullName) has been removed from source"
        Remove-Item -Path $_.FullName -Force -ErrorAction SilentlyContinue
    }
}
Is there an extra Where-Object {$file1 <> $file2} or something like that?

I am not sure how you are getting the information for $source and $destination; I am assuming you are using Get-ChildItem.
What I would do to eliminate the issue with date/time would be to not capture it in these variables. For example:
$source = Get-ChildItem C:\temp\Folder1 -Recurse | select -ExpandProperty FullName
$destination = Get-ChildItem C:\temp\Folder2 -Recurse | select -ExpandProperty FullName
By doing this you only get the FullName property for each child item, not the date/time.
You would need to change some of the script after doing this for it to still work; a sketch of one way to adjust it is below.
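For instance, here is a minimal sketch of the adjusted script, assuming the folder roots are stripped from the FullName strings so the two path lists are actually comparable (UPDATE-LOG is the logging function from your original code):

$sourceRoot      = 'C:\temp\Folder1'
$destinationRoot = 'C:\temp\Folder2'

# Relative paths only, so neither date/time nor the differing roots enter the comparison
$source      = Get-ChildItem $sourceRoot -Recurse -File      | ForEach-Object { $_.FullName.Substring($sourceRoot.Length) }
$destination = Get-ChildItem $destinationRoot -Recurse -File | ForEach-Object { $_.FullName.Substring($destinationRoot.Length) }

Compare-Object $source $destination |
    Where-Object { $_.SideIndicator -eq '=>' } |
    ForEach-Object {
        $target = Join-Path $destinationRoot $_.InputObject
        UPDATE-LOG "File: $target has been removed from source"
        Remove-Item -Path $target -Force -ErrorAction SilentlyContinue
    }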

If I am not getting it wrong, the issue is that your code is deleting files whose time stamps differ from the source:
Did you try -ExcludeProperty?
$source = Get-ChildItem "E:\New folder" -Recurse | select -ExcludeProperty Date

The following script can serve your purpose
$Item1 = Get-ChildItem 'SourcePath'
$Item2 = Get-ChildItem 'DestinationPath'
$DifferenceItem = Compare-Object $Item1 $Item2
$ItemToBeDeleted = $DifferenceItem | Where-Object {$_.SideIndicator -eq "=>"}
foreach ($item in $ItemToBeDeleted)
{
    $FullPath = $item.InputObject.FullName
    Remove-Item $FullPath -Force
}
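To preview the deletions before committing, the standard -WhatIf switch can be appended to the Remove-Item line inside the loop, e.g.:

Remove-Item $FullPath -Force -WhatIf   # dry run; drop -WhatIf once the output looks right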

Try something like this
In PowerShell V5:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1=(gci $yourdir1 -file).Name
gci $yourdir2 -file | where Name -notin $filesnamedir1| remove-item
In old PowerShell:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1=(gci $yourdir1 | where {$_.psiscontainer -eq $false}).Name
gci $yourdir2 | where {$_.psiscontainer -eq $false -and $_.Name -notin $filesnamedir1} | remove-item
If you want to compare files in multiple directories, use the -Recurse option on every gci command, for example:
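A recursive sketch of the V5 version (note it still compares by Name only, so it assumes file names are unique across the subfolders):

$filesnamedir1 = (gci $yourdir1 -file -Recurse).Name
gci $yourdir2 -file -Recurse | where Name -notin $filesnamedir1 | remove-item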

Related

Copy folders from server to another - Powershell

I am trying to come up with a script to copy folders from one server to another. I might be going about this wrong, but I'm trying to copy the directories from one server into an array, copy the directories from the second server into another array, compare them, and then create the folders that are missing on the server that doesn't have them:
[array]$folders = Get-ChildItem -Path \\spesety01\TGT\TST\XRM\Test -Recurse -Directory -Force -ErrorAction SilentlyContinue | Select-Object -ExpandProperty FullName
[array]$folders2 = Get-ChildItem -Path \\sutwove02\TGT\TST\XRN -Recurse -Directory -Force -ErrorAction SilentlyContinue | Select-Object -ExpandProperty FullName
$folders | ForEach-Object {
    if ($folders2 -notcontains "$_") {
        New-Item "$_" -type directory
    }
}
The issue is that the "$_" (in the ForEach loop) refers to the server in "$folders", and when I run the script, I get an error that the folder already exists. Is there some way to specify copying the folders to the new server? I accept that my approach might be completely off on this and I might be making it harder than it needs to be.
<#
.SYNOPSIS
    Using path A as reference, make any sub-directories that are missing in path B.
#>
Param(
    [string]$PathA,
    [string]$PathB
)
$PathADirs = (Get-ChildItem -Path $PathA -Recurse -Directory).FullName
$PathBDirs = (Get-ChildItem -Path $PathB -Recurse -Directory).FullName
$PreList = Compare-Object -ReferenceObject $PathADirs -DifferenceObject $PathBDirs.Replace($PathB, $PathA) |
    Where-Object -Property SideIndicator -EQ "<=" |
    Select-Object -ExpandProperty 'InputObject'
$TargetList = $PreList.Replace($PathA, $PathB)
New-Item -Path $TargetList -ItemType 'Directory'
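A hypothetical invocation, assuming the snippet above is saved as Sync-Folders.ps1 (the script name is a placeholder; the UNC paths are the ones from the question):

.\Sync-Folders.ps1 -PathA '\\spesety01\TGT\TST\XRM\Test' -PathB '\\sutwove02\TGT\TST\XRN'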

Powershell test-path

Newbie help required here - I'm trying to search for the newest .csv file in F drive, then use Test-Path to check if that file is in the E drive. My script outputs the latest file name to screen which is correct - what I'm now trying to do is append $_latestFile.name to a Test-Path to see if this file is found in the folder in E drive.
Am I going about this the wrong way?
Thanks in advance.
$_sourcePath = "F:\"
$_destinationPath = "E:\"
$_FileType = @("*.*csv")
$_latestFile = Get-ChildItem -Recurse ($_sourcePath) -Include ($_FileType) | Sort-Object -Property $_.CreationTime | Select-Object -Last 1
$_latestFile.name
If your aim is to find a file in the $_destinationPath with the same name and modified date as the one you found on the $_sourcePath, you might do this:
$sourcePath = "F:\"
$destinationPath = "E:\"
$latestFile = Get-ChildItem -Path $sourcePath -Filter '*.csv' -File -Recurse | Sort-Object -Property LastWriteTime | Select-Object -Last 1
Write-Host "Latest CSV file in $sourcePath is '$($latestFile.Name)'"
$destFile = Get-ChildItem -Path $destinationPath -Filter "$($latestFile.Name)" -File -Recurse | Where-Object { $_.LastWriteTime -eq $latestFile.LastWriteTime }
if ($destFile) {
    Write-Host "Copy file found at '$($destFile.FullName)'" -ForegroundColor Green
}
else {
    Write-Host "Could not find a file '$($latestFile.Name)' with the same modified date in '$destinationPath'" -ForegroundColor Red
}
I have changed the property CreationTime to LastWriteTime in order to get the most recently updated file; CreationTime gets changed when a file is copied to another disk.
Also (thanks Steven) I changed the variable names from $_varname to $varname to avoid confusion with PowerShell's $_ Automatic Variable
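If all you need is the simple exists-by-name check from the original question (ignoring timestamps), Test-Path can be used directly on a path built from the file name. A minimal sketch using the variables above; note it only checks the root of the destination drive, not its subfolders:

Test-Path -Path (Join-Path $destinationPath $latestFile.Name)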

powershell - Replace only old files with new files in destination directory

Hello All,
I wish to replace only the old files with new files in the destination directory.
I tried:
Set-Location C:\contains_newfolder_contents\Old Folder
Get-ChildItem | ForEach-Object {
if ((Test-Path 'C:\contains_newfolder_contents\Sample Folder\$_' ) -and
(.$_.LastWriteTime -gt C:\contains_newfolder_contents\Sample Folder\$_.LastWriteTime' )) {
Copy-Item .\$_ -destination 'C:\contains_newfolder_contents\Sample Folder'
}
}
Kindly correct me!
Here's a one-line solution. I used different folder names to make the example easier to read.
Get-ChildItem C:\temp\destination | ForEach-Object { $sourceItem = (Get-Item "c:\temp\source\$($_.Name)" -ErrorAction Ignore); if ($sourceItem -and $sourceItem.LastWriteTime -gt $_.LastWriteTime) { Copy-Item -Path $sourceItem -Destination $_.FullName -Verbose } }
For each existing file, it finds the matching file in the source folder. $sourceItem will be null if there is no matching source item. It then compares the dates and copies if the source date is newer.
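For readability, the same logic can also be spread over several lines (a sketch using the same example folder names):

Get-ChildItem C:\temp\destination | ForEach-Object {
    # Look up the matching file in the source folder; $sourceItem is $null when there is none
    $sourceItem = Get-Item "C:\temp\source\$($_.Name)" -ErrorAction Ignore
    if ($sourceItem -and $sourceItem.LastWriteTime -gt $_.LastWriteTime) {
        Copy-Item -Path $sourceItem.FullName -Destination $_.FullName -Verbose
    }
}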
You can also do it like this:
Get-ChildItem "C:\contains_newfolder_contents\Old Folder" -file | sort LastWriteTime -Descending | select -First 1 | Copy-Item -Destination 'C:\contains_newfolder_contents\Sample Folder'
Instead of making several reads to the source, I propose you build a lookup table; then these simple commands will achieve the desired result.
$source = 'C:\temp\Source'
$destination = 'C:\temp\Destination'
$lookup = Get-ChildItem $destination | Group-Object -Property Name -AsHashTable
Get-ChildItem -Path $source |
    Where-Object { $_.LastWriteTime -gt $lookup[$_.Name].LastWriteTime } |
    Copy-Item -Destination $destination
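With Group-Object -AsHashTable the destination folder is read once, and each source file is then matched by a name lookup instead of another disk read, e.g. (hypothetical file name):

$lookup['report.csv'].LastWriteTime   # $null when no such file exists in the destination yet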

Find the oldest file in each subdirectory with Powershell

My company recently moved to outlook365. We are entirely VDI based so our user profiles are stored on a single server. As a result our users all now have 2+ .ost files taking up storage space on the server. I'd like to write a script to find and delete the extraneous .ost files. In addition I'd like to schedule the script to run on a monthly basis to clean up any orphaned .ost's that occur for any other reason.
I've tried a few different solutions but can't seem to find the right syntax to identify just the oldest/original .ost in each subdirectory, all attempts have identified the oldest file from the whole directory or all .ost files in the directory.
$Path = "<path>"
$SubFolders = dir $Path -Recurse | Where-Object {$_.PSIsContainer} | ForEach-Object -Process {$_.FullName}
ForEach ($Folder in $SubFolders)
{
    $FullFileName = dir $Folder | Where-Object {!$_.PSIsContainer} | Sort-Object {$_.LastWriteTime} -Descending | Select-Object -First 1
}
Inside of your loop, you could use the following to list the .ost file that has the oldest LastWriteTime value. Just add the -Descending flag to Sort-Object to list the newest file.
$FullFileName = foreach ($folder in $Subfolders) {
    Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
        Sort-Object -Property LastWriteTime |
        Select-Object -Property FullName -First 1
}
$FullFileName
If there is only one .ost file found in the $folder path, it will still find that file. So you will need logic to not delete when there is only one file. This does not guarantee it is the oldest file. You probably want a combination of the oldest CreationTime and newest LastWriteTime. The following will list the oldest .ost file based on CreationTime.
$FullFileName = foreach ($folder in $Subfolders) {
    Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
        Sort-Object -Property CreationTime |
        Select-Object -Property FullName -First 1
}
$FullFileName
Another issue is setting the $FullFileName variable inside of the foreach loop. This means it will be overwritten through each loop iteration. Therefore, if you retrieve the value after the loop completes, it will only have the last value found. Setting the variable to be the result of the foreach loop output will create an array with multiple values.
To only output an OST file path when there are multiple OST files, you can do something like the following:
$FullFileName = foreach ($folder in $Subfolders) {
    $files = Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
        Sort-Object -Property LastWriteTime -Descending
    if ($files.Count -ge 2) {
        $files | Select-Object -Property FullName -First 1
    }
}
$FullFileName
This one-liner should do the job, keeping the .ost file with the newest LastWriteTime:
gci -Path $Path -directory | where {(gci -Path $_\*.ost).count -gt 1} | %{gci -Path $_\*.ost | Sort-Object LastWriteTime -Descending | Select-Object -Skip 1 | Remove-Item -WhatIf}
Longer variant follows.
$Path = '<path>'
$Ext = '*.ost'
Get-ChildItem -Path $Path -Directory -Recurse |
    Where-Object {(Get-ChildItem -Path "$_\$Ext" -File -EA 0).Count -gt 1} |
    ForEach-Object {
        Get-ChildItem -Path "$_\$Ext" -File -EA 0 | Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 | Remove-Item -WhatIf
    }
The Get-ChildItem | Where-Object stage selects the folders that contain more than one .ost file.
The ForEach-Object block then sorts each folder's .ost files descending by LastWriteTime, skips the first (newest) and pipes the rest to Remove-Item with the -WhatIf parameter, so while testing it only shows what would be deleted.
You can of course also move them to a backup location instead, as sketched below.
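A minimal sketch of that backup variant, swapping Remove-Item for Move-Item (the backup folder path is a placeholder):

$Backup = 'D:\ost-backup'
Get-ChildItem -Path $Path -Directory -Recurse |
    Where-Object {(Get-ChildItem -Path "$_\$Ext" -File -EA 0).Count -gt 1} |
    ForEach-Object {
        # Note: identically named .ost files from different folders will collide in $Backup
        Get-ChildItem -Path "$_\$Ext" -File -EA 0 | Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 | Move-Item -Destination $Backup -WhatIf
    }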

How to search inside three paths and copy the name on a file.list

I am wondering if there is a better way to script these instructions in PowerShell:
Search in 3 paths, e.g.:
$LOGDIRS="C:\NETiKA\GED\Production\RI\log";"C:\NETiKA\GED\Test\RI\log";"C:\NETiKA\Tomcat-8.0.28\logs"
Find all files that are older than 7 days and write their names to a file that I will call file.list, e.g. C:\Test\file.list.
Once they are in my file.list, I need to go through all the file names and delete those files.
Apparently, when you have many thousands of files, this is the fastest way to delete them.
$LOGDIRS=C:/NETiKA/GED/Production/RI/log;C:/NETiKA/GED/Test/RI/log;C:/NETiKA/Tomcat-8.0.28/logs
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
Select-Object FullName > files.list |
Foreach-Object {
if ($_.LastAccessTime -le (get-date).adddays($KEEP)) {
remove-item -recurse -force $_
}
};
Something like this should help you get started.
$path1 = "E:\Code\powershell\myPS\2018\Jun"
$path2 = "E:\Code\powershell\myPS\2018\Jun\compareTextFiles"
$path3 = "E:\Code\powershell\myPS\2018\May"
$allFiles = dir $path1, $path2, $path3 -File
$fileList = New-Item -type file file.list -Force
$keep = -7
$allFiles | foreach {
    if ($_.LastAccessTime -le (Get-Date).AddDays($keep)) {
        "$($_.FullName) is older than 7 days"
        $_.FullName.ToString() | Out-File $fileList -Append
    }
    else {
        "$($_.FullName) is new"
    }
}
You can add deletion inside the IF block if you wish, or check the file and do it later on. Your code has some issues that are very basic to PowerShell; e.g., once you use Select-Object, the next pipeline stage only receives the property you selected. You tried to use LastAccessTime in a later pipe when you had only selected the FullName property.
Also, redirecting to a file and then continuing the pipeline looks very messy.
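To illustrate the Select-Object point (hypothetical path):

Get-ChildItem C:\Temp | Select-Object FullName | Get-Member -MemberType NoteProperty
# only a FullName property survives; LastAccessTime is no longer on the objects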
Remove-Item accepts piped input, and a Where-Object will filter on the age.
To first check what would be deleted, I appended -WhatIf to the Remove-Item.
$LOGDIRS = "C:\NETiKA\GED\Production\RI\log","C:\NETiKA\GED\Test\RI\log","C:\NETiKA\Tomcat-8.0.28\logs"
$KEEP = -7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
    Where-Object LastAccessTime -le ((Get-Date).AddDays($KEEP)) |
    Remove-Item -Recurse -Force -WhatIf