I'm trying to add NTFS access to shortcuts.
I have a CSV file that contains:
Name,AD
Steps Recorder,Group_312312
Snipping Tool,Group_545345
$FolderPath = "C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Accessories\"
$file = "C:\Users\adm\Desktop\Group.csv"
$groups = Get-Content $file | ConvertFrom-Csv
foreach ($group in $groups){
    Add-NTFSAccess -Path (Join-Path -Path $FolderPath -ChildPath ($group.Name+".lnk")) `
                   -Account $group.AD `
                   -AccessRights ReadAndExecute
}
I have a lot of subfolders with *.lnk files in $FolderPath, but this way the script only looks in $FolderPath itself, not in its subfolders. How can I change the script to find all *.lnk files, including those in subfolders?
For example:
C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Accessories\My_programs\OneDrive.lnk
For this, I think you need a different approach: get a collection of *.lnk files recursively and filter it down to those whose BaseName can be found in the CSV.
Next, use Group-Object to group these FileInfo objects (make sub-collections) based on their BaseName.
According to the docs, the Path parameter of the Add-NTFSAccess cmdlet can take an array of paths (FullName properties) and accepts pipeline input, so we can send each sub-collection through all at once:
$FolderPath = "C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Accessories"
$file = "C:\Users\adm\Desktop\Group.csv"
$groups = Import-Csv -Path $file
# get a list of *.lnk FileIfo objects where the file's BaseName can be found in the
# CSV column 'Name'. Group these files on their BaseName properties
$linkfiles = Get-ChildItem -Path $FolderPath -Filter '*.lnk' -File -Recurse -Force |
Where-Object { $groups.Name -contains $_.BaseName } |
Group-Object BaseName
# iterate through the grouped *.lnk files
$linkfiles | ForEach-Object {
$baseName = $_.Name # the name of the Group is the BaseName of the files in it
$adGroup = ($groups | Where-Object { $_.Name -eq $baseName }).AD
# pipe all items in the group through to the Add-NTFSAccess cmdlet
# see parameter Path at https://ntfssecurity.readthedocs.io/en/latest/Cmdlets/Add-NTFSAccess/
$_.Group | Add-NTFSAccess -Account $adGroup -AccessRights ReadAndExecute
}
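To spot-check the result, the NTFSSecurity module also provides a Get-NTFSAccess cmdlet. A minimal sketch (the shortcut name is the first example from your CSV; adjust the path if the file sits in a subfolder):
# hedged check: list the access entries of one processed shortcut and look for the AD group
Get-NTFSAccess -Path (Join-Path -Path $FolderPath -ChildPath 'Steps Recorder.lnk')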
UPDATE
# this is where the output 'log' csv file goes
$outputFile = "C:\Users\adm\Desktop\GroupReport.csv"
# get a list of *.lnk FileInfo objects where the file's BaseName can be found in the
# CSV column 'Name'. Group these files on their BaseName properties
$linkfiles = Get-ChildItem -Path $FolderPath -Filter '*.lnk' -File -Recurse -Force |
             Where-Object { $groups.Name -contains $_.BaseName } |
             Group-Object BaseName
# iterate through the grouped *.lnk files and add the group permission.
# the results are streamed straight to Export-Csv to create the log file
$linkfiles | ForEach-Object {
    $baseName = $_.Name   # the name of the Group is the BaseName of the files in it
    $adGroup  = ($groups | Where-Object { $_.Name -eq $baseName }).AD
    # create a new access rule
    # see: https://learn.microsoft.com/en-us/dotnet/api/system.security.accesscontrol.filesystemaccessrule
    $rule = [System.Security.AccessControl.FileSystemAccessRule]::new($adGroup, "ReadAndExecute", "Allow")
    $_.Group | ForEach-Object {
        # get the current ACL of the file
        $acl = Get-Acl -Path $_.FullName
        # add the new rule to the ACL
        $acl.SetAccessRule($rule)
        $acl | Set-Acl -Path $_.FullName
        # output an object for the logging csv
        [PsCustomObject]@{
            'Group' = $adGroup
            'File'  = $_.FullName
        }
    }
} | Export-Csv -Path $outputFile -NoTypeInformation
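To verify what the update wrote, the ACL can be read back with Get-Acl. A minimal check, assuming the example file and group from the CSV in the question (adjust the path to wherever that shortcut actually lives):
# hedged verification: show the access entries for the AD group on one processed shortcut
(Get-Acl -Path (Join-Path -Path $FolderPath -ChildPath 'Snipping Tool.lnk')).Access |
    Where-Object { $_.IdentityReference -like '*Group_545345*' } |
    Select-Object IdentityReference, FileSystemRights, AccessControlType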
Related
I am using the following script to read a list of file names which are then deleted. Is there a way to get an output of the date and time each file is deleted?
$targetFolder = "D:\" $fileList = "C:\DeleteList.txt" Get-ChildItem
-Path "$targetFolder\*" -Recurse -Include #(Get-Content $fileList) | Remove-Item -Verbose
Thanks for any help.
You could keep track of the files that are deleted and the time of deletion by outputting an object with the file's FullName and the current date.
This output can then be saved as a structured CSV file:
$targetFolder = "D:\"
$fileList = Get-Content -Path "C:\DeleteList.txt"
$deleted = Get-ChildItem -Path $targetFolder -Recurse -Include $fileList | ForEach-Object {
    # output an object with the current date and the file FullName
    $_ | Select-Object @{Name = 'DeletedOn'; Expression = {(Get-Date)}}, FullName
    $_ | Remove-Item -WhatIf
}
# output on screen
$deleted | Format-Table -AutoSize
# output to csv file
$deleted | Export-Csv -Path 'C:\RemovedFiles.csv' -NoTypeInformation
Remove the -WhatIf safety-switch if you are satisfied with the results shown on screen.
Would this work?
$targetFolder = "D:"
$fileList = "C:\DeleteList.txt"
$Files = Get-ChildItem -Path "$targetFolder" -Recurse -Include #(Get-Content $fileList)
# Once you have the desires files stored in the $Files variable, then run a Foreach loop.
$Obj = #() # create an array called $Obj
Foreach ($File in $Files)
{
# store info in hash table
$hash = #{
DateTime = (get-date)
fileName = $File.name
fullpath = $File.fullname
}
Write-Host "deleting file $($file.name)" -for cyan
Remove-Item $File.fullname # *** BE VERY CAREFUL!!!***
# record information in an array called $Obj
$Obj += New-Object psobject -Property $hash
}
$Obj | select fileName, DateTime | Export-csv C:\...
I'm trying to get:
a) a list of all empty folders and subfolders if the folder is named "Archiv"
b) I'd like to delete all those empty folders. My current approach doesn't check the subfolders.
It would also be great if the results were exported to a .csv =)
$TopDir = 'C:\Users\User\Test'
$DirToFind = 'Archiv'
$EmptyDirList = @(
    Get-ChildItem -LiteralPath $TopDir -Directory -Recurse |
        Where-Object {
            #[System.IO.Directory]::GetFileSystemEntries($_.FullName).Count -eq 0
            $_.GetFileSystemInfos().Count -eq 0 -and
            $_.Name -match $DirToFind
        }
).FullName
$EmptyDirList
Any ideas how to adjust the code? Thanks in advance
You need to reverse the order in which Get-ChildItem lists the items, so you remove the deepest nested empty folders first.
$LogFile = 'C:\Users\User\RemovedEmptyFolders.log'
$TopDir = 'C:\Users\User\Test'
# first get a list of all folders below the $TopDir directory that are named 'Archiv' (FullNames only)
$archiveDirs = (Get-ChildItem -LiteralPath $TopDir -Filter 'Archiv' -Recurse -Directory -Force).FullName |
               # sort on the FullName.Length property in Descending order to get 'deepest-nesting-first'
               Sort-Object -Property Length -Descending
# next, remove all empty subfolders in each of the $archiveDirs
$removed = foreach ($dir in $archiveDirs) {
    (Get-ChildItem -LiteralPath $dir -Directory -Force) |
        # sort on the FullName.Length property in Descending order to get 'deepest-nesting-first'
        Sort-Object @{Expression = {$_.FullName.Length}} -Descending |
        ForEach-Object {
            # if this folder is empty, remove it and output its FullName for the log
            if (@($_.GetFileSystemInfos()).Count -eq 0) {
                $_.FullName
                Remove-Item -LiteralPath $_.FullName -Force
            }
        }
    # next remove the 'Archiv' folder that is now possibly empty too
    if (@(Get-ChildItem -LiteralPath $dir -Force).Count -eq 0) {
        # output this folder's FullName and delete it
        $dir
        Remove-Item -LiteralPath $dir -Force
    }
}
$removed | Set-Content -Path $LogFile -PassThru # write your log file. -PassThru also writes the output on screen
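Since the question also mentions CSV output, the $removed list of folder paths could be wrapped in objects and exported instead of (or next to) the plain-text log. A minimal sketch; the property name and file path below are just examples:
# hedged alternative: export the removed folder paths as a CSV file
$removed | ForEach-Object { [PsCustomObject]@{ RemovedFolder = $_ } } |
    Export-Csv -Path 'C:\Users\User\RemovedEmptyFolders.csv' -NoTypeInformation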
Not sure a CSV is needed; I think a simple text file will suffice as it's just a list.
Anyway, here's a solution (although not the most elegant) which will also delete nested empty directories: if a directory only contains empty directories, it will also get deleted.
$TopDir = "C:\Test" #Top level directory to scan
$EmptyDirListReport = "C:\EmptyDirList.txt" #Text file location to store a file with the list of deleted directorues
if (Test-Path -Path $EmptyDirListReport -PathType Leaf)
{
Remove-Item -Path $EmptyDirListReport -Force
}
$EmptyDirList = ""
Do
{
$EmptyDirList = Get-ChildItem -Path $TopDir -Recurse | Where-Object -FilterScript { $_.PSIsContainer } | Where-Object -FilterScript { ((Get-ChildItem -Path $_.FullName).Count -eq 0) } | Select-Object -ExpandProperty FullName
if ($EmptyDirList)
{
$EmptyDirList | Out-File -FilePath $EmptyDirListReport -Append
$EmptyDirList | Remove-Item -Force
}
} while ($EmptyDirList)
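If only folders named 'Archiv' (and empty folders nested below one) should be touched, as the question asks, the Get-ChildItem line inside the loop could be narrowed. A hedged sketch of that filter, reusing $TopDir from above:
# hedged sketch: only consider empty folders named 'Archiv' or located below an 'Archiv' folder
$EmptyDirList = Get-ChildItem -Path $TopDir -Recurse |
    Where-Object { $_.PSIsContainer } |
    Where-Object { $_.Name -eq 'Archiv' -or $_.FullName -match '\\Archiv\\' } |
    Where-Object { (Get-ChildItem -Path $_.FullName).Count -eq 0 } |
    Select-Object -ExpandProperty FullName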
This should do the trick; it should work with nested folders too.
$result = (Get-ChildItem -Filter "Archiv" -Recurse -Directory $topdir |
    Sort-Object @{Expression = {$_.FullName.Length}} -Descending |
    ForEach-Object {
        if ((Get-ChildItem -Attributes d,h,a $_.FullName).Count -eq 0) {
            $_
            rmdir $_.FullName
        }
    })
$result | Select-Object FullName | ConvertTo-Csv | Out-File $Logfile
You can do this with a one-liner:
Get-ChildItem -Recurse dir -Filter Archiv |
    Where-Object { ($_ | Get-ChildItem).Count -eq 0 } |
    Remove-Item
Although, if you have nested Archiv folders like Archiv/Archiv, you need to run the line several times: removing the inner folder only empties its parent after Get-ChildItem has already enumerated it, so the parent isn't removed until a later pass.
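If you'd rather not re-run it by hand, the one-liner can be wrapped in a loop that repeats until a pass finds nothing to remove. A hedged sketch, assuming the $TopDir variable from the question as the search root:
# hedged sketch: keep removing empty 'Archiv' folders until a pass finds none
do {
    $empty = Get-ChildItem -Path $TopDir -Recurse -Directory -Filter Archiv |
        Where-Object { ($_ | Get-ChildItem -Force).Count -eq 0 }
    $empty | Remove-Item
} while ($empty)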
Okay, so we are setting up a card access system that looks at the Active Directory users' thumbnailPhoto attribute. I am creating an audit system that exports the users and compares them with the JPG images. If an image exists but there isn't a correlating user, it should move the image into an archive to be reviewed. The goal is to move old employee photos into a folder in case of a later rehire. I can't get the image to move into another folder if it matches a name in the CSV. Here is the entire code:
<# Write Users to a CSV File #>
$adUsers = Get-ADUser -Filter * -Properties DisplayName | Select-Object DisplayName | Export-Csv -Path PATHWAY.CSV -NoTypeInformation -Encoding Unicode
$keepImages = @()
$removeImages = @()
[System.Collections.ArrayList]$arrA = (Get-ChildItem -Filter * -Path PATHWAY).BaseName
[System.Collections.ArrayList]$arrB = Get-Content PATHWAY.CSV
foreach ($itemA in $arrA) {
    if ($arrB -ne $itemA) {
        $arrB.Remove($itemA)
        $removeImages += $itemA
    }
}
$removeImages | Out-File -FilePath PATH.csv
<# PUT THE FILES INTO AN ARCHIVE #>
--Can't get it to move here; note I am brand new to PowerShell, it's not like Python at all--
You can try this. I have added inline comments to hopefully explain how it works:
$ImagesFolder = 'D:\UserImages'
$OldUserImages = 'D:\UserImages\OldUsers'
# test if the path to move old images exists and if not create it
if (!(Test-Path -Path $OldUserImages -PathType Container)) {
    $null = New-Item -Path $OldUserImages -ItemType Directory
}
# get a list of ADUser display names
$adUsers = Get-ADUser -Filter * -Properties DisplayName | Select-Object -ExpandProperty DisplayName
# get an array of FileInfo objects of the user images currently in the $ImagesFolder.
# filter out only those that do not have a basename that correlates to any of the users DisplayName
# and move these to the $OldUserImages folder.
# Tip: if for instance all are of type JPG, add -Filter '*.jpg' to the Get-ChildItem cmdlet.
Get-ChildItem -Path $ImagesFolder -File |
    Where-Object { $adUsers -notcontains $_.BaseName } |
    Move-Item -Destination $OldUserImages -Force
If you want to keep track of the images you have moved, you can extend the above like this:
$moved = Get-ChildItem -Path $ImagesFolder -File |
    Where-Object { $adUsers -notcontains $_.BaseName } |
    ForEach-Object {
        $file = $_.FullName
        $_ | Move-Item -Destination $OldUserImages -Force
        [PsCustomObject]@{
            'File'    = $file
            'MovedTo' = $OldUserImages
        }
    }
# show result on screen
$moved | Format-Table -AutoSize
# write to CSV file
$out = '{0:yyyy-MM-dd}_MovedImages.csv' -f (Get-Date)
$moved | Export-Csv -Path (Join-Path -Path $ImagesFolder -ChildPath $out) -NoTypeInformation
I have the following code that prints the file system rights of each account enabled on the folders matching "C:\Temp\CSM\*" and "C:\Temp\CSM\*\*". How do I write the output to a comma-separated CSV file? As this is for PowerShell 2.0, the Export-Csv -Append parameter cannot be used.
$FolderPath = dir -Directory -Path "C:\Temp\CSM\*", "C:\Temp\CSM\*\*" -Force
foreach ($Folder in $FolderPath) {
    $Acl = Get-Acl -Path $Folder.FullName
    foreach ($Access in $Acl.Access) {
        Write-Host $Folder.FullName "," $Access.IdentityReference "," $Access.FileSystemRights
    }
}
Prior to PowerShell v3, if you wanted to append to an existing CSV you needed something like this:
... | ConvertTo-Csv -NoType | Select-Object -Skip 1 | Add-Content
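Spelled out, that pattern converts the new objects to CSV text, drops the header row, and appends the remaining lines to the existing file. A hedged, self-contained example; the file name and sample objects are placeholders:
# hedged example of the pre-v3 append pattern; 'C:\output.csv' and the sample data are illustrative only
$newRows = Get-Process | Select-Object Name, Id
# ConvertTo-Csv emits the header line first; -Skip 1 drops it so it isn't duplicated in the file
$newRows | ConvertTo-Csv -NoTypeInformation | Select-Object -Skip 1 | Add-Content -Path 'C:\output.csv'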
However, in your scenario that probably isn't necessary. If you replace your foreach loops with a pipeline you can write the CSV in one go without having to append to it in a loop (which isn't recommended anyway).
$folders = "C:\Temp\CSM\*", "C:\Temp\CSM\*\*"
Get-ChildItem -Path $folders -Directory -Force | ForEach-Object {
    $path = $_.FullName
    Get-Acl -Path $path |
        Select-Object -Expand Access |
        Select-Object @{n='Path';e={$path}}, IdentityReference, FileSystemRights
} | Export-Csv 'C:\output.csv' -NoType
Every day we receive a zipfile from a number of clients. The filename consists of the following:
data_clientname_timestamp.zip
Where "data" is always the same text, "clientname" could be anything and "timestamp" is the file creation date.
The files are always in the same directory. The clientnames are always known in advance, so I know what files should be received.
The script should do the following:
List all files received (created) today
If a file from one or more clients is missing, write "file from client.. missing" to a file
I would like to list the clients in a variable, so those can easily be changed.
What I have so far:
$folder='C:\data'
Get-ChildItem $folder -recurse -include @("*.zip") |
    Where-Object {($_.CreationTime -gt (Get-Date).Date )} | select name | out-file $folder\result.txt
But how do I check for missing files?
Edit:
Testdata:
$Timestamp = (Get-Date).ToString("yyyyMMddhhmmss")
New-Item c:\Data -type Directory
New-Item c:\Data\Data_client1_$Timestamp.zip -type file
New-Item c:\Data\Data_client2_$Timestamp.zip -type file
New-Item c:\Data\Data_client3_$Timestamp.zip -type file
New-Item c:\Data\Data_client5_$Timestamp.zip -type file
New-Item c:\Data\Data_client6_$Timestamp.zip -type file
New-Item c:\Data\Data_client7_$Timestamp.zip -type file
exit
Script:
$folder='C:\Data'
$clients = #("client1", "client2", "client3", "client4", "client5", "client6", "client7")
$files = Get-ChildItem $folder -recurse -include #("*.zip") |
Where-Object {($_.CreationTime -gt (Get-Date).Date )}
$files | Select-Object Name | Out-File $folder\result.txt
$files | Where-Object { ($_.Name -replace '.+?_([^_]+).*', '$1') -notin $clients} | Out-File $folder\result2.txt
Start with defining a list of clients you would expect like:
$clients = #("client1", "client2")
Then retrieve all zip files and save it to a variable:
$files = Get-ChildItem $folder -recurse -include @("*.zip") |
    Where-Object {($_.CreationTime -gt (Get-Date).Date )}
Export the existing files to your result.txt:
$files | Select-Object Name | Out-File $folder\result.txt
Now you can determine which clients are missing by extracting the client name from each file name with a regex and comparing that list against $clients:
$fileClients = $files | ForEach-Object { ($_.Name -replace '.+?_([^_]+).*', '$1') }
Compare-Object $clients $fileClients | select -ExpandProperty InputObject | Out-File $folder\result2.txt
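If you want the literal "file from client .. missing" wording from the question in the output file, a hedged variation of that last step, reusing $clients and $fileClients from above:
# hedged sketch: write one explicit message per client that has no file today
$clients | Where-Object { $fileClients -notcontains $_ } |
    ForEach-Object { "file from client $_ missing" } |
    Out-File $folder\result2.txt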
You need to have a list of your clients somewhere (such as in a CSV file named clients.csv) then you could loop through that list to check if a file is found for each client. For example:
$folder = 'C:\data'
$Clients = Import-Csv Clients.csv
$Files = Get-ChildItem $folder -Recurse -Include @("*.zip") | Where-Object { $_.CreationTime -gt (Get-Date).Date } | Select-Object Name
$Clients | ForEach-Object {
    $Client = $_
    $ClientCheck = $Files | Where-Object { $_ -like $Client }
    If (-not $ClientCheck) {
        Write-Warning "$Client is missing!"
    } Else {
        Write-Output $ClientCheck
    }
} | out-file $folder\result.txt
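One thing to be aware of: Write-Warning writes to the warning stream, so only the Write-Output branch reaches result.txt through Out-File. If the "missing" messages should land in the file as well, a hedged tweak is to emit them as plain strings instead:
# hedged tweak: send the 'missing' message down the pipeline so Out-File captures it too
$Clients | ForEach-Object {
    $Client = $_
    $ClientCheck = $Files | Where-Object { $_ -like $Client }
    if (-not $ClientCheck) {
        "file from client $Client missing"
    } else {
        $ClientCheck
    }
} | Out-File $folder\result.txt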