Compare a folder of images to a CSV file in PowerShell

Okay, so we are setting up a card access system that looks at the Active Directory users' thumbnailPhoto attribute. I am creating an audit system that exports the users and compares them with the JPG images. If an image exists but there isn't a correlating user, it moves the image into an archive to be reviewed. The goal is to move old employee photos into a folder in case of a later rehire. I can't get the image to move into another folder if it matches a name in the CSV. Here is the entire code:
<#Write Users to a CSV File #>
$adUsers = get-aduser -filter * -properties displayname | select displayname | export-csv -path PATHWAY.CSV -notypeinformation -encoding unicode
$keepImages = @()
$removeImages = @()
[System.Collections.ArrayList]$arrA = (Get-Childitem -Filter * -path PATHWAY).Basename
[System.Collections.ArrayList]$arrB = Get-Content PATHWAY.CSV
foreach ($itemA in $arrA) {
if ($arrB -ne $itemA) {
$arrB.Remove($itemA)
$removeImages += $itemA }}
$removeImages |out-file -FilePath PATH.csv
<# PUT THE FILES INTO AN ARCHIVE #>
--Can't get it to move here; note I am brand new to PowerShell, it's not like Python at all--

You can try this. I have added inline comments to hopefully explain how it works:
$ImagesFolder = 'D:\UserImages'
$OldUserImages = 'D:\UserImages\OldUsers'
# test if the path to move old images exists and if not create it
if (!(Test-Path -Path $OldUserImages -PathType Container)) {
$null = New-Item -Path $OldUserImages -ItemType Directory
}
# get a list of ADUser display names
$adUsers = Get-ADUser -Filter * -Properties DisplayName | Select-Object -ExpandProperty DisplayName
# get an array of FileInfo objects of the user images currently in the $ImagesFolder.
# filter out only those that do not have a basename that correlates to any of the users DisplayName
# and move these to the $OldUserImages folder.
# Tip: if for instance all are of type JPG, add -Filter '*.jpg' to the Get-ChildItem cmdlet.
Get-ChildItem -Path $ImagesFolder -File |
Where-Object { $adUsers -notcontains $_.BaseName } |
Move-Item -Destination $OldUserImages -Force
If you want to keep track of the images you have moved, you can extend the above like:
$moved = Get-ChildItem -Path $ImagesFolder -File |
Where-Object { $adUsers -notcontains $_.BaseName } |
ForEach-Object {
$file = $_.FullName
$_ | Move-Item -Destination $OldUserImages -Force
[PsCustomObject]@{
'File' = $file
'MovedTo' = $OldUserImages
}
}
# show result on screen
$moved | Format-Table -AutoSize
# write to CSV file
$out = '{0:yyyy-MM-dd}_MovedImages.csv' -f (Get-Date)
$moved | Export-Csv -Path (Join-Path -Path $ImagesFolder -ChildPath $out) -NoTypeInformation
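If you would rather see what is going to be moved before anything actually happens, Move-Item supports the common -WhatIf switch, so a dry run could look like this (same variables as above):
# preview the moves without touching any files; drop -WhatIf to perform them for real
Get-ChildItem -Path $ImagesFolder -File |
Where-Object { $adUsers -notcontains $_.BaseName } |
Move-Item -Destination $OldUserImages -WhatIf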

Related

Powershell Find all empty folders and subfolders in a given Folder name

I'm trying to get:
a) a list of all empty folders and subfolders if the folder is named "Archiv"
b) I'd like to delete all those empty folders. My current approach doesn't check the subfolders.
It would also be great if the results were exported to a .csv =)
$TopDir = 'C:\Users\User\Test'
$DirToFind = 'Archiv'
$EmptyDirList = @(
Get-ChildItem -LiteralPath $TopDir -Directory -Recurse |
Where-Object {
#[System.IO.Directory]::GetFileSystemEntries($_.FullName).Count -eq 0
$_.GetFileSystemInfos().Count -eq 0 -and
$_.Name -match $DirToFind
}
).FullName
$EmptyDirList
Any ideas how to adjust the code? Thanks in advance
You need to reverse the order in which Get-ChildItem lists the items, so that you remove the most deeply nested empty folders first.
$LogFile = 'C:\Users\User\RemovedEmptyFolders.log'
$TopDir = 'C:\Users\User\Test'
# first get a list of all folders below the $TopDir directory that are named 'Archiv' (FullNames only)
$archiveDirs = (Get-ChildItem -LiteralPath $TopDir -Filter 'Archiv' -Recurse -Directory -Force).FullName |
# sort on the FullName.Length property in Descending order to get 'deepest-nesting-first'
Sort-Object -Property Length -Descending
# next, remove all empty subfolders in each of the $archiveDirs
$removed = foreach ($dir in $archiveDirs) {
(Get-ChildItem -LiteralPath $dir -Directory -Force) |
# sort on the FullName.Length property in Descending order to get 'deepest-nesting-first'
Sort-Object @{Expression = {$_.FullName.Length}} -Descending |
ForEach-Object {
# if this folder is empty, remove it and output its FullName for the log
if (@($_.GetFileSystemInfos()).Count -eq 0) {
$_.FullName
Remove-Item -LiteralPath $_.FullName -Force
}
}
# next remove the 'Archiv' folder that is now possibly empty too
if (@(Get-ChildItem -LiteralPath $dir -Force).Count -eq 0) {
# output this folder's FullName and delete it
$dir
Remove-Item -LiteralPath $dir -Force
}
}
$removed | Set-Content -Path $LogFile -PassThru # write your log file. -PassThru also writes the output on screen
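And if you do prefer the CSV output mentioned in the question over a plain log file, the collected paths can be wrapped in objects first (a minimal sketch reusing the $removed variable from above; the .csv path is just an example):
# wrap each removed path in an object so Export-Csv produces a proper header and column
$removed | ForEach-Object { [PsCustomObject]@{ RemovedFolder = $_ } } |
Export-Csv -Path 'C:\Users\User\RemovedEmptyFolders.csv' -NoTypeInformation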
Not sure a CSV is needed, I think a simple text file will suffice as it's just a list.
Anyway, here's a solution (although not the most elegant) which will also delete nested empty directories, meaning that if a directory only contains empty directories, it will also get deleted.
$TopDir = "C:\Test" #Top level directory to scan
$EmptyDirListReport = "C:\EmptyDirList.txt" #Text file location to store a file with the list of deleted directories
if (Test-Path -Path $EmptyDirListReport -PathType Leaf)
{
Remove-Item -Path $EmptyDirListReport -Force
}
$EmptyDirList = ""
Do
{
$EmptyDirList = Get-ChildItem -Path $TopDir -Recurse |
Where-Object -FilterScript { $_.PSIsContainer } |
Where-Object -FilterScript { (Get-ChildItem -Path $_.FullName).Count -eq 0 } |
Select-Object -ExpandProperty FullName
if ($EmptyDirList)
{
$EmptyDirList | Out-File -FilePath $EmptyDirListReport -Append
$EmptyDirList | Remove-Item -Force
}
} while ($EmptyDirList)
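One caveat: as written, this removes every empty directory under $TopDir, not just the ones named "Archiv". If that name restriction from the question matters, the same loop can be run against each Archiv folder separately (a sketch reusing the variables above; note the Archiv folders themselves are left in place even when they end up empty):
# find every 'Archiv' folder first, then clean up empty directories inside each one
$archivDirs = Get-ChildItem -Path $TopDir -Recurse |
Where-Object { $_.PSIsContainer -and $_.Name -eq 'Archiv' } |
Select-Object -ExpandProperty FullName
foreach ($dir in $archivDirs)
{
Do
{
$EmptyDirList = Get-ChildItem -Path $dir -Recurse |
Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName).Count -eq 0 } |
Select-Object -ExpandProperty FullName
if ($EmptyDirList)
{
$EmptyDirList | Out-File -FilePath $EmptyDirListReport -Append
$EmptyDirList | Remove-Item -Force
}
} while ($EmptyDirList)
}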
This should do the trick; it should work with nested folders too.
$result = (Get-ChildItem -Filter "Archiv" -Recurse -Directory $topdir | Sort-Object @{Expression = {$_.FullName.Length}} -Descending | ForEach-Object {
if ((Get-ChildItem -Attributes d,h,a $_.fullname).count -eq 0){
$_
rmdir $_.FullName
}
})
$result | Select-Object FullName | ConvertTo-Csv | Out-File $LogFile
You can do this with a one-liner:
Get-ChildItem -Recurse dir -filter Archiv |
Where-Object {($_ | Get-ChildItem).count -eq 0} |
Remove-Item
Although, if you have nested Archiv folders like Archiv/Archiv, you need to run the line several times, because a parent folder only becomes empty after its child has been removed in an earlier pass.
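If rerunning it by hand is not acceptable, the same one-liner can be wrapped in a loop that repeats until a pass finds nothing left to delete (a small sketch of the same idea, keeping the placeholder path dir from above):
# keep sweeping until a pass finds no empty 'Archiv' folders left to remove
do {
$empty = Get-ChildItem -Recurse dir -Filter Archiv |
Where-Object { ($_ | Get-ChildItem).Count -eq 0 }
$empty | Remove-Item
} while ($empty)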

Add NTFS access to files with subfolders from CSV file

I am trying to add NTFS access rights to the shortcuts.
I have a CSV file that contains:
Name,AD
Steps Recorder,Group_312312
Snipping Tool,Group_545345
$FolderPath = "C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Accessories\"
$file = "C:\Users\adm\Desktop\Group.csv"
$groups = Get-Content $file | ConvertFrom-Csv
foreach ($group in $groups){
Add-NTFSAccess -Path (Join-Path -Path $FolderPath -ChildPath ($group.Name+".lnk")) `
-Account $group.AD `
-AccessRights ReadAndExecute
}
I have a lot of subfolders with *.lnk files in $FolderPath, but this way the script only looks in $FolderPath itself, not in its subfolders. How can I change the script to find all *.lnk files including subfolders?
For example:
C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Accessories\My_programs\OneDrive.lnk
For this, I think you need a different approach, where you get a collection of *.lnk files recursively and filter to get only those which have a BaseName property that can be found in the CSV.
Next, use Group-Object to group (make sub-collections) of these FileInfo objects, based on their BaseName.
According to the docs, the Path parameter on the Add-NTFSAccess cmdlet can take an array of paths (FullName properties) and these can be piped through to it, so we can send each subcollection all at once:
$FolderPath = "C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Accessories"
$file = "C:\Users\adm\Desktop\Group.csv"
$groups = Import-Csv -Path $file
# get a list of *.lnk FileInfo objects where the file's BaseName can be found in the
# CSV column 'Name'. Group these files on their BaseName properties
$linkfiles = Get-ChildItem -Path $FolderPath -Filter '*.lnk' -File -Recurse -Force |
Where-Object { $groups.Name -contains $_.BaseName } |
Group-Object BaseName
# iterate through the grouped *.lnk files
$linkfiles | ForEach-Object {
$baseName = $_.Name # the name of the Group is the BaseName of the files in it
$adGroup = ($groups | Where-Object { $_.Name -eq $baseName }).AD
# pipe all items in the group through to the Add-NTFSAccess cmdlet
# see parameter Path at https://ntfssecurity.readthedocs.io/en/latest/Cmdlets/Add-NTFSAccess/
$_.Group | Add-NTFSAccess -Account $adGroup -AccessRights ReadAndExecute
}
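Note that Add-NTFSAccess is not a built-in cmdlet; it comes from the third-party NTFSSecurity module linked in the comment above. If it is not available yet, something along these lines should install it (assuming access to the PowerShell Gallery):
# one-time install of the module that provides Add-NTFSAccess
Install-Module -Name NTFSSecurity -Scope CurrentUser
Import-Module -Name NTFSSecurity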
UPDATE
# this is where the output 'log' csv file goes
$outputFile = "C:\Users\adm\Desktop\GroupReport.csv"
# get a list of *.lnk FileInfo objects where the file's BaseName can be found in the
# CSV column 'Name'. Group these files on their BaseName properties
$linkfiles = Get-ChildItem -Path $FolderPath -Filter '*.lnk' -File -Recurse -Force |
Where-Object { $groups.Name -contains $_.BaseName } |
Group-Object BaseName
# iterate through the grouped *.lnk files and add the group permission
# output one object per file so the results can be piped to Export-Csv for the log
$linkfiles | ForEach-Object {
$baseName = $_.Name # the name of the Group is the BaseName of the files in it
$adGroup = ($groups | Where-Object { $_.Name -eq $baseName }).AD
# create a new access rule
# see: https://learn.microsoft.com/en-us/dotnet/api/system.security.accesscontrol.filesystemaccessrule
$rule = [System.Security.AccessControl.FileSystemAccessRule]::new($adGroup, "ReadAndExecute", "Allow")
$_.Group | ForEach-Object {
# get the current ACL of the file
$acl = Get-Acl -Path $_.FullName
# add the new rule to the ACL
$acl.SetAccessRule($rule)
$acl | Set-Acl $_.FullName
# output for logging csv
[PsCustomObject]@{
'Group' = $adGroup
'File' = $_.FullName
}
}
} | Export-Csv -Path $outputFile -NoTypeInformation
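To confirm the permissions actually landed, one of the shortcuts from the sample CSV can be spot-checked afterwards (a quick sketch; the file name and group are taken from the example data above):
# list the access rules on one shortcut and keep only the group that was just added
(Get-Acl -Path (Join-Path -Path $FolderPath -ChildPath 'Snipping Tool.lnk')).Access |
Where-Object { $_.IdentityReference -like '*Group_545345*' }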

PowerShell Test If a Filename From a List Exists Somewhere In a Directory and Export Missing

I've researched this and haven't been able to come up with a solid solution. Basically, I have a separate hard drive containing thousands of music files. I have a CSV list with the names of all the files that should be in the hard drive. Example:
My List
I want to be able to test if each of the files on my list exist in the hard drive, and if not, export it to a separate "missing files" list. The thing is each of the files in the hard drive exist under multiple folders.
As my script is now, I am trying to test whether the path exists by using Join-Path. Here is my code right now - it's returning all of the files in the directory instead of just the missing files:
$documents = 'C:\Users\Me\Documents\ScriptTest'
$CSVexport = 'C:\Users\Me\Documents\ScriptTest\TestResults.csv'
$obj = @()
Write-host "`n_____Results____`n" #Write the results and export to a CSV file
$NewCsv = Import-CSV -Path 'C:\Users\Me\Documents\ScriptTest\Test.csv' |
Select-Object ID,'File Path' |
ForEach-Object {
if (!(Test-Path (Join-Path $documents $_.'File Path'))){
write-host "`n$($_.'File Path') is missing from the folder.`n"
$ObjectProperties = @{
ID = $_.ID
'File Path' = $_.'File Path'
}
$obj += New-Object PSObject -Property $ObjectProperties
}
}
$obj | export-csv $CSVexport -NoTypeInformation
How do I account for the sub-directories that vary with each file?
Edit - Resolved
$myFolder = 'C:\Users\Me\Documents\ScriptTest'
$CSVexport = 'C:\Users\Me\Documents\ScriptTest\Results.csv'
$csvPath = 'C:\Users\Me\Documents\ScriptTest\Test.csv'
$FileList = Get-ChildItem $myFolder -Recurse *.wav | Select-Object -ExpandProperty Name -Unique
Import-CSV -Path $csvPath |
Where-Object {$FileList -notcontains $_.'File Path'} |
export-csv $CSVexport -NoTypeInformation
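One thing to be aware of in this resolved version: $FileList holds bare file names, so the 'File Path' column in Test.csv must also contain bare names (not full paths) for -notcontains to match; the comparison is case-insensitive by default. If that column does hold paths, comparing on just the leaf name is a safer sketch:
# compare only the file name portion of the CSV value against the names found on disk
Import-CSV -Path $csvPath |
Where-Object { $FileList -notcontains (Split-Path -Path $_.'File Path' -Leaf) } |
Export-Csv $CSVexport -NoTypeInformation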
You could generate a list of filenames from the recursed folders, then check if the file is in that list.
$documents = 'C:\Users\Me\Documents\ScriptTest'
$CSVexport = 'C:\Users\Me\Documents\ScriptTest\TestResults.csv'
$FileList = Get-ChildItem $documents -Recurse |
Where-Object { -not $_.PSIsContainer } |
Select-Object -ExpandProperty Name -Unique
Import-CSV -Path 'C:\Users\Me\Documents\ScriptTest\Test.csv' |
Where-Object {$FileList -notcontains $_.File} |
Export-CSV $CSVexport -NoTypeInformation
Edit: Answer updated to work with PowerShell 2.0 with suggestions from Bruce Payette and mklement0
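On PowerShell 3.0 or later, the container check can be dropped in favour of the -File switch; a shorter version of the same file list would be:
# -File (PowerShell 3.0+) returns only files, so no PSIsContainer filter is needed
$FileList = Get-ChildItem $documents -Recurse -File | Select-Object -ExpandProperty Name -Unique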

Import .csv to create a list of filenames and corresponding owners

I am working on creating a script that will read a .csv document containing a single column of filenames (one per cell), search a larger folder for each file matching the filenames provided, and identify the 'owner' using:
(get-acl $file).owner
Currently I have several bits of code that can do the individual parts, but I am having a hard time tying it all together. Ideally, a user can simply input file names into the .csv file, then run the script to output a second .csv or .txt identifying each file name and its owner.
The CSV formatting will appear as below (ASINs is the header):
ASINs
B01M8N1D83.MAIN.PC_410
B01M14G0JV.MAIN.PC_410
Pull file names without header:
$images = Get-Content \\path\ASINs.csv | Select -skip 1
Find images in larger folder to pull full filename/path (not working):
ForEach($image in $images) {
$images.FullName | ForEach-Object
{
$ASIN | Get-ChildItem -Path $serverPath -Filter *.jpg -Recurse -ErrorAction SilentlyContinue -Force | Set-Content \\path\FullNames.csv
}
}
At that point I would like to use the full file paths provided by FullNames.csv to pull the owners from the files in their native location using the above mentioned:
(get-acl $file).owner
Does anyone have any ideas how to tie these together into one fluid script?
EDIT
I was able to get the following to work without the loop, reading one of the filenames, but I need it to loop as there are multiple filenames.
New CSV Format:
BaseName
B01LVVLSCM.MAIN.PC_410
B01LVY65AN.MAIN.PC_410
B01MAXORH6.MAIN.PC_410
B01MTGEMEE.MAIN.PC_410
New Script:
$desktopPath = [System.Environment]::GetFolderPath([System.Environment+SpecialFolder]::Desktop)
$images = $desktopPath + '\Get_Owner'
Get-ChildItem -Path $images | Select BaseName | Export-Csv $desktopPath`\Filenames.csv -NoTypeInformation
$serverPath = 'C:\Users\tuggleg\Desktop\Archive'
$files = Import-Csv -Path $desktopPath`\Filenames.csv
While($true) {
ForEach ($fileName in $files.BaseName)
{
Get-ChildItem -Path $serverPath -Filter "*$fileName*" -Recurse -ErrorAction 'SilentlyContinue' |
Select-Object -Property @{
Name='Owner'
Expression={(Get-Acl -Path $_.FullName).Owner}
},'*' |
Export-Csv -Path $desktopPath`\Owners.csv -NoTypeInformation
}
}
Any ideas on the loop issue? Thanks everyone!
This example assumes your CSV contains partial filenames. It will search the file path and filter for those partials.
Example.csv
"ASINs"
"B01M8N1D83.MAIN.PC_410"
"B01M14G0JV.MAIN.PC_410"
Code.ps1
$Files = Import-Csv -Path '.\Example.csv'
# collect the results of every search first and export once at the end,
# otherwise Export-Csv would overwrite the output file on each loop iteration
$results = ForEach ($FileName in $Files.ASINs)
{
Get-ChildItem -Path $serverPath -Filter "*$FileName*" -Recurse -ErrorAction 'SilentlyContinue' |
Select-Object -Property @{
Name='Owner'
Expression={(Get-Acl -Path $_.FullName).Owner}
},'*'
}
$results | Export-Csv -Path '\\path\FullNames.csv' -NoTypeInformation

Powershell: Script to search all user profiles and copy the most recent to all user profiles

I am looking to write a powershell script to search all user profiles on a server for a specific file, compare the files by the lastmodifieddate, and then copy the newest file to all user profiles. The script will also create a backup of the last three versions of the file.
I previously wrote this script for our pilot environment where only two people were accessing the app (this is for a XenApp), but the user base has now expanded and I would like to create the prod version of the script to cover future growth.
Any help would be very much appreciated. Thanks! Script below...
$SRC1 = "\\Server\c$\Users\XXXX1\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$SRC2 = "\\Server\c$\Users\XXXX2\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$SRC3 = "\\Server\c$\Users\XXXX3\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$BKU = "\\storage\IT\EMSLM\Backup"
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC2).LastWriteTime ) {Copy-Item $SRC1 $SRC2}
else {Copy-Item $SRC2 $SRC1}
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC3).LastWriteTime ) {Copy-Item $SRC1 $SRC3}
else {Copy-Item $SRC3 $SRC1}
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC2).LastWriteTime ) {Copy-Item $SRC1 $SRC2}
Remove-Item $BKU\sitelist_old_2.xml
Rename-Item $BKU\sitelist_old_1.xml $BKU\sitelist_old_2.xml
Rename-Item $BKU\sitelist.xml $BKU\sitelist_old_1.xml
Copy-Item $SRC1 $BKU
& 'C:\Program Files (x86)\Enterprise Mode Site List Manager\EMIESiteListManager.exe'
Exit
This isn't everything, but it should be a good place to start:
# all user profile folders on the server
$users = dir "\\Server\c$\Users" -Directory | select -ExpandProperty fullname
# the most recently modified sitelist.xml across all profiles
$newest = dir "\\Server\c$\Users\*\AppData\Roaming\EMIESiteListManager\sitelist.xml" | sort lastwritetime -Descending | select -First 1 -ExpandProperty fullname
# strip each profile path from the newest file's full path; the shortest remainder
# is the profile-relative part (AppData\Roaming\EMIESiteListManager\sitelist.xml)
$files = @()
$users | % {
$files += $newest -replace [regex]::Escape($_)
}
$newestEnd = $files | sort {$_.length} | select -f 1
# copy the newest file into every profile at that relative path
$users | % {
$dest = Join-Path $_ $newestEnd
copy $newest $dest -force
}
Working off of Anthony Stringer's response, I was able to build a script that meets my exact needs. Anthony's script would have worked, but it was missing a couple of things that I wanted:
1.) Identify all profiles with an existing sitelist.xml file and place in an array or hash table.
2.) Copy only to those user profiles where the sitelist.xml file existed (my fault, I never requested this in my original question)
Thank you Anthony for the starting point. Updated script below:
$Users = dir "\\server\c$\Users" -Directory -Exclude Public, Default, Administrator* | select -ExpandProperty fullname
$FilePath = "AppData\Roaming\EMIESiteListManager\sitelist.xml"
$UserPath = Join-Path -path $Users $filePath
$NewestFile = dir "\\server\c$\Users\*\AppData\Roaming\EMIESiteListManager\sitelist.xml" | sort lastwritetime -Descending | select -First 1 -ExpandProperty fullname
$BackUp = "\\storage\ctxvol01\appdata\IT\EMSLM\Backup"
$BackUpFile = "\\storage\ctxvol01\appdata\IT\EMSLM\Backup\sitelist.xml"
$EMSLM_Users = @()
$UserPath | ForEach {
If ((Test-Path -path $_) -eq $true)
{$EMSLM_Users += $_}
}
$EMSLM_Users | ForEach-Object {
Copy-Item $NewestFile $_ -force -erroraction silentlycontinue
}
If ((Get-Item $NewestFile).LastWriteTime -gt (Get-Item $BackUpFile).LastWriteTime)
{
Remove-Item $BackUp\sitelist_old_2.xml
Rename-Item $BackUp\sitelist_old_1.xml $BackUp\sitelist_old_2.xml
Rename-Item $BackUp\sitelist.xml $BackUp\sitelist_old_1.xml
Copy-Item $NewestFile $BackUp
}
& 'C:\Program Files (x86)\Enterprise Mode Site List Manager\EMIESiteListManager.exe'
Exit