I've been given the task of comparing 2 folders, FolderA and FolderB and noting any files that exist in A but not in B.
Sorry for not explaining myself fully. Maybe it would help if I explain our situation. A company sales employee has left our company to go to a competitor. He has files on his work laptop's local hard drive. We are trying to establish whether there are any files that exist on his computer but not on the shared network folder.
I need to produce a list of any files (along with their paths) that are present on his laptop but not on the shared network location. The file structure on the laptop's local hard drive and on the shared network location is different. What's the best way to go about this?
$folderAcontent = "C:\temp\test1"
$folderBcontent = "C:\temp\test2"
$FolderAContents = Get-ChildItem $folderAcontent -Recurse | where-object {!$_.PSIsContainer}
$FolderBContents = Get-ChildItem $folderBcontent -Recurse | where-object {!$_.PSIsContainer}
$FolderList = Compare-Object -ReferenceObject ($FolderAContents ) -DifferenceObject ($FolderBContents) -Property name
$FolderList | fl *
Use the Compare-Object cmdlet:
Compare-Object (gci $folderAcontent) (gci $folderBcontent)
If you want to list the files that are only in $folderAcontent, select the results with the <= SideIndicator:
Compare-Object (gci $folderAcontent) (gci $folderBcontent) | where {$_.SideIndicator -eq "<="}
Assuming that the filenames in both directories are the same, you can do something like the following:
$folderAcontent = "C:\temp\test1"
$folderBcontent = "C:\temp\test2"
ForEach ($File in Get-ChildItem -Recurse -LiteralPath $folderAcontent | where {$_.PSIsContainer -eq $false} | Select-Object -ExpandProperty Name)
{
    if (!(Test-Path "$folderBcontent\$File"))
    {
        Write-Host "Missing File: $folderBcontent\$File"
    }
}
The above will only work for files (not subdirectories) present in folder A, and because only the file name is kept, files found in subfolders of A are checked against the root of B.
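If files can sit in subfolders, here is a rough sketch (reusing the same example variables) that keeps the relative path when probing folder B; it assumes the two trees are meant to mirror each other:
Get-ChildItem -LiteralPath $folderAcontent -Recurse |
    Where-Object { -not $_.PSIsContainer } |
    ForEach-Object {
        # path of the file relative to folder A, e.g. "Sub\Dir\file.txt"
        $relative = $_.FullName.Substring($folderAcontent.Length).TrimStart('\')
        if (-not (Test-Path -LiteralPath (Join-Path $folderBcontent $relative))) {
            Write-Host "Missing File: $(Join-Path $folderBcontent $relative)"
        }
    }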
Try:
#Set locations
$laptopfolder = "c:\test1"
$serverfolder = "c:\test2"
#Get contents
$laptopcontents = Get-ChildItem $laptopfolder -Recurse | where {!$_.PSIsContainer}
$servercontents = Get-ChildItem $serverfolder -Recurse | where {!$_.PSIsContainer}
#Compare on name and length and find changed files on laptop
$diff = Compare-Object $laptopcontents $servercontents -Property name, length -PassThru | where {$_.sideindicator -eq "<="}
#Output differences
$diff | Select-Object FullName
If you add LastWriteTime after Length in the Compare-Object cmdlet it will compare the modified date too (useful if a file was updated but is still the same size). Just be aware that it only looks for different dates, not whether a file is newer or older. :)
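For instance, a minimal sketch extending the snippet above (Name, Length and LastWriteTime all have to match for a file to be considered unchanged):
$diff = Compare-Object $laptopcontents $servercontents -Property Name, Length, LastWriteTime -PassThru |
        where {$_.SideIndicator -eq "<="}
$diff | Select-Object FullName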
I am trying to compare files and directories by hash, and it is working, but I now need an easier way to figure out which files are different.
I originally started without comparing the hash, and it worked for files and folders, but it would not tell me anything other than the fact that they exist.
$Source = Get-ChildItem -Recurse -Path E:\path | foreach {Get-FileHash -Path $_.FullName}
$Destination = Get-ChildItem -Recurse -Path "\\server\e$\path" | foreach {Get-FileHash -Path $_.FullName}
Compare-Object -ReferenceObject $Source.hash -DifferenceObject $Destination.hash
Now this works great, but I want to also list the files that are associated with the hash. After I get the hash, I then need to go back to the files and compare the hash to the original directories to figure out which one it came from.
InputObject SideIndicator
----------- -------------
CFD1DF3C08A9F7C4D81E22DA7D1CBB35FA12220C3CB85777EBA9BD89362AEDA3 =>
2B098B7FC189A87B41A7706EA7ABFFDB343B8B5AF3712BA6614E04BD3032A977 =>
D8CBDD03564C3547D8189D11A9BAE078FBD70986DBFB485EAEE5170C13113798 =>
F5D7AE29DB432EC3421EE956B70927AE394C0F27CE00FF855666DBC3E14084DB <=
85795253C6CCDC3CC2A4CAE055CC7478946CDB33D35EAE2BB5796C55954205B2 <=
9CE2A42C8FFA2D8001BA2874324987DCEF601173CB2ED8B654A76598F90B126E <=
If you are going for the hash, why not use Group-Object instead of Compare-Object? Something like this:
$Source = Get-ChildItem -Recurse -Path E:\path
$Destination = Get-ChildItem -Recurse -Path "\\server\e$\path"
$Source + $Destination | Group-Object @{Expression={(Get-FileHash $_.FullName).Hash}} | ? {$_.Count -gt 1}
Output would be something like this:
Count Name Group
----- ---- -----
2 DF7E70E5021544F4834BBE... {b.txt, c.txt}
Compare-Object outputs differences by default. If you want to compare Hash and Name (without the path), there is the problem that Get-FileHash only outputs Algorithm, Hash, and the complete Path. You can pipe Get-ChildItem output directly to Get-FileHash, but you need to attach the Name (here using a calculated property). I'd use the -PassThru parameter and work with the whole objects, specifying the properties Hash and Name for the comparison.
## Q:\Test\2019\06\12\SO_565666700.ps1
$SourceDir = 'E:\path' # 'C:\Bat' #
$TargetDir = '\\server\e$\path' # 'K:\Bat' #
$Source = Get-ChildItem -Path $SourceDir -Recurse -PipelineVariable Item |
    Get-FileHash | Select-Object *,@{n='Name';e={$Item.Name}}
$Target = Get-ChildItem -Path $TargetDir -Recurse -PipelineVariable Item |
    Get-FileHash | Select-Object *,@{n='Name';e={$Item.Name}}
Compare-Object -ReferenceObject $Source -Property Name,Hash `
               -DifferenceObject $Target -PassThru |
    Sort-Object Name | Select-Object Hash,Path
I have many JPGs on a hard drive which are the same pictures.
Now I'd like to find them. Therefore I need to compare 2 folders and find the pictures which are the same.
I want to find the pictures with the same name AND the same LastWriteTime.
One of the criteria alone is not enough.
So I need PowerShell code which can do that.
Here is what I have, but it doesn't work well; I also got results which weren't correct.
That's what I tried:
Get-ChildItem -Path $Pfad1 -Recurse -Filter *jpg |
Where-Object {(Get-ChildItem -Path $Pfad2 -Recurse -Filter *jpg) -match $_.Name -and $_.LastWriteTime} |
ForEach-Object {$_.FullName}
Without recursion you can compare the folders' files with Compare-Object:
$Folder1 = 'X:\path'
$Folder2 = 'Y:\path'
$Files1 = Get-ChildItem -Path $Folder1 -Filter *.jpg
$Files2 = Get-ChildItem -Path $Folder2 -Filter *.jpg
Compare-Object -Ref $Files1 -Diff $Files2 -Property Name,LastWriteTime `
-IncludeEqual -ExcludeDifferent -PassThru
If recursion is required and the subfolders have to be compared as well, you'll have to build a calculated property with the relative path so the files are comparable; see the sketch below.
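A minimal sketch of that idea, assuming the same $Folder1/$Folder2 variables as above; the RelPath property is a hypothetical calculated property (the relative path includes the file name), compared together with LastWriteTime:
$Files1 = Get-ChildItem -Path $Folder1 -Filter *.jpg -Recurse |
          Select-Object *,@{n='RelPath';e={$_.FullName.Substring($Folder1.Length)}}
$Files2 = Get-ChildItem -Path $Folder2 -Filter *.jpg -Recurse |
          Select-Object *,@{n='RelPath';e={$_.FullName.Substring($Folder2.Length)}}

# equal relative path and LastWriteTime = same picture in both trees
Compare-Object -Ref $Files1 -Diff $Files2 -Property RelPath,LastWriteTime `
               -IncludeEqual -ExcludeDifferent -PassThru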
I am seeking help creating a PowerShell script which will search a specified path for multiple .xml files within the same folder.
The script should provide the full path of the file(s) if found.
The script should also provide a date.
Here's my code:
$Dir = Get-ChildItem C:\windows\system32 -Recurse
#$Dir | Get-Member
$List = $Dir | where {$_.Extension -eq ".xml"}
$List | Format-Table Name
$folder = "C:\Windows\System32"
$results = Get-ChildItem -Path $folder -File -Filter "*.xml" | Select Name, FullName, LastWriteTime
This will return only XML files and display the file name, the full path to the file, and the last time it was written to. The -File switch is only available in PowerShell 3 and up, so if you are doing this on Windows 7 or Windows Server 2008 R2 you will have to make sure you have updated your WMF to 3 or higher. Without -File, the line will look like this:
#Powershell 2.0
$results = Get-ChildItem -Path $folder -Filter "*.xml" | Where {$_.PSIsContainer -eq $false} | Select Name, FullName, LastWriteTime
I like the Select method mentioned above for its simpler syntax, but if for some reason you just want the file names with their absolute paths and without the column header that comes with piping to Select (perhaps because the output will be used as input to another script, or piped to another function), you could do the following:
$folder = 'C:\path\to\folder'
Get-ChildItem -Path $folder -Filter *.xml -File -Name | ForEach-Object {
    # -Name emits paths relative to $folder, so join with $folder before resolving
    [System.IO.Path]::GetFullPath((Join-Path $folder $_))
}
I'm not sure if Select lets you leave out the header.
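For what it's worth, here is a small sketch (reusing the same $folder variable) of one way to get bare, header-less full paths out of Select-Object, by expanding the property instead of selecting it:
# -ExpandProperty returns plain strings, so there is no column header
Get-ChildItem -Path $folder -Filter *.xml -File |
    Select-Object -ExpandProperty FullName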
You could also take a look at this answer to give you some more ideas or things to try if you need the results sorted, or the file extension removed:
https://stackoverflow.com/a/31049571/10193624
I was able to make a few changes, exporting the results to a .txt file, and though it provides results, I only want to isolate folders that contain more than one .xml file.
$ParentFolder = "C:\software"
$FolderHash = @{}
$Subfolders = Get-ChildItem -Path $ParentFolder
foreach ($EventFolder in $Subfolders) {
    $XMLFiles = Get-ChildItem -Path $EventFolder.FullName -Filter *.xml*
    if ($XMLFiles.Count -gt 1) {
        $FolderHash += @{$EventFolder.FullName = $EventFolder.LastWriteTime}
    }
}
$FolderHash
Judging from your self-answer you want a list of directories that contain more than one XML file without recursively searching those directories. In that case your code could be simplified to something like this:
Get-ChildItem "${ParentFolder}\*\*.xml" |
Group-Object Directory |
Where-Object { $_.Count -ge 2 } |
Select-Object Name, @{n='LastWriteTime';e={(Get-Item $_.Name).LastWriteTime}}
I have this PowerShell code that compares 2 directories and removes files if the files no longer exist in the source directory.
For example, say I have Folder 1 and Folder 2. I want to compare Folder 1 with Folder 2; if a file no longer exists in Folder 1, it should be removed from Folder 2.
This code works OK, but I have a problem where it also picks up file differences on the date/time. I only want it to pick up a difference if the file no longer exists in Folder 1.
Compare-Object $source $destination -Property Name -PassThru | Where-Object {$_.SideIndicator -eq "=>"} | % {
    if (-not $_.PSIsContainer) {
        UPDATE-LOG "File: $($_.FullName) has been removed from source"
        Remove-Item -Path $_.FullName -Force -ErrorAction SilentlyContinue
    }
}
Is there an extra Where-Object {$file1 <> $file2} or something like that?
I am not sure how you are getting the information for $source and $destination; I am assuming you are using Get-ChildItem.
What I would do to eliminate the issue with date/time would be to not capture it in these variables. For example:
$source = Get-ChildItem C:\temp\Folder1 -Recurse | select -ExpandProperty FullName
$destination = Get-ChildItem C:\temp\Folder2 -Recurse | select -ExpandProperty FullName
By doing this you only get the FullName property for each child item, not the date/time.
You would need to change some of the script after doing this for it to still work; a rough sketch follows below.
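Here is a rough sketch of how the rest of the script could look after that change, using the hypothetical Folder1/Folder2 paths from the question. Note it captures relative paths rather than the raw FullName values, since the full paths differ between the two roots, and the date/time never enters the comparison:
$sourceRoot      = 'C:\temp\Folder1'
$destinationRoot = 'C:\temp\Folder2'

# capture only the path relative to each root, nothing date/time related
$source      = Get-ChildItem $sourceRoot -Recurse -File |
               ForEach-Object { $_.FullName.Substring($sourceRoot.Length).TrimStart('\') }
$destination = Get-ChildItem $destinationRoot -Recurse -File |
               ForEach-Object { $_.FullName.Substring($destinationRoot.Length).TrimStart('\') }

# anything only on the destination side no longer exists in the source, so remove it
Compare-Object $source $destination | Where-Object {$_.SideIndicator -eq "=>"} | ForEach-Object {
    $path = Join-Path $destinationRoot $_.InputObject
    Remove-Item -LiteralPath $path -Force
}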
If I am not getting it wrong, the issue is that your code is deleting files whose timestamp differs from the source:
Did you try -ExcludeProperty?
$source = Get-ChildItem "E:\New folder" -Recurse | select -ExcludeProperty Date
The following script can serve your purpose:
$Item1 = Get-ChildItem 'SourcePath'
$Item2 = Get-ChildItem 'DestinationPath'
$DifferenceItem = Compare-Object $Item1 $Item2
$ItemToBeDeleted = $DifferenceItem | where {$_.SideIndicator -eq "=>"}
foreach ($item in $ItemToBeDeleted)
{
    $FullPath = $item.InputObject.FullName
    Remove-Item $FullPath -Force
}
Try something like this:
In PowerShell V5:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1=(gci $yourdir1 -file).Name
gci $yourdir2 -file | where Name -notin $filesnamedir1 | remove-item
In old PowerShell:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
# uses foreach/-notcontains so it also works on PowerShell 2
$filesnamedir1 = gci $yourdir1 | where {$_.psiscontainer -eq $false} | foreach {$_.Name}
gci $yourdir2 | where {$_.psiscontainer -eq $false -and $filesnamedir1 -notcontains $_.Name} | remove-item
If you want to compare files across multiple directories, add the -Recurse switch to every gci command, as sketched below.
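For example, a minimal sketch of the recursive variant of the first snippet, reusing the same example paths (the comparison is still by file name only, regardless of which subfolder a file sits in):
$filesnamedir1 = (gci $yourdir1 -file -Recurse).Name
gci $yourdir2 -file -Recurse | where Name -notin $filesnamedir1 | remove-item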
I have seen some similar posts, but unfortunately nothing is working for me so far. I would like to get the latest backup file (the names are randomly generated) and then copy and rename the file in the same folder with a fixed name so that I can schedule the restore job.
Backup files are like:
Fin123.bak
Fin125.bak
Sales456.bak
HRF100.bak
I would like to get the latest file (by creation date) and then copy and rename it to Fin.bak. In the files above, Fin125.bak is the latest backup file, which I would like to copy and rename as Fin.bak. The PowerShell script should ignore all other files like Sales, HR, etc.; only files starting with Fin and having the latest creation date matter.
Any help is greatly appreciated.
Thanks
Muhammad
This should do the trick.
$bak_path = "path_to_backup_folder"
get-childitem -path $bak_path -Filter "Fin?*.bak" |
where-object { -not $_.PSIsContainer } |
sort-object -Property CreationTime |
select-object -last 1 |
copy-item -Destination (join-path $bak_path "FIN.BAK")
Going through it:
$bak_path = "path_to_backup_folder" declares the path to the backup directory.
get-childitem -path $bak_path -Filter "Fin?*.bak" returns a set of objects in folder $bak_path that match the filter "Fin?*.bak". The '?' is used to ensure only files with at least one character between 'Fin' and '.bak' match.
where-object { -not $_.PSIsContainer } removes any directory objects (just in case).
sort-object -Property $_.CreationTime sorts the list so that the newest file is last in the collection.
select-object -last 1 picks off the last item (ie newest file) and finally
copy-item -Destination (join-path $bak_path "FIN.BAK") copies the last item as FIN.BAK to the same directory
This will work for grabbing the most recent file with the .bak extension.
$OriginalDir = "E:\"
$BackupDir = "H:"
#The pattern after -Name can be changed to whatever you need to back up.
$LatestFile = Get-ChildItem -Path $dir -Name *.bak | Sort-Object LastAccessTime -Descending | Select-Object -First 1
Copy-Item -path "$OriginalDir\$LatestFile" "$BackupDir\$LatestFile"
Wesley's script works OK but has a little error: it uses $dir where it must be $OriginalDir:
$LatestFile = Get-ChildItem -Path $OriginalDir -Name *.bak | Sort-Object LastAccessTime -Descending | Select-Object -First 1
Note that because the sort is -Descending, the newest file comes first, so Select-Object -First 1 is already correct; you would only switch to Select-Object -Last 1 if you sorted in ascending order.
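Putting those corrections together, a rough consolidated sketch (using the example paths from the answer; -Filter and FullName are used instead of -Name so that the sort on LastAccessTime operates on real file objects):
$OriginalDir = "E:\"
$BackupDir   = "H:\"

# newest .bak by last access time; swap in CreationTime or LastWriteTime if preferred
$LatestFile = Get-ChildItem -Path $OriginalDir -Filter *.bak |
              Sort-Object LastAccessTime -Descending |
              Select-Object -First 1

Copy-Item -Path $LatestFile.FullName -Destination (Join-Path $BackupDir $LatestFile.Name)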