Get Folder NTFS ACL on long path name - PowerShell

I have a PS script that returns NTFS ACLs where an individual user is assigned. It works well until I hit a path exceeding 260 characters. I've found a lot of information on the path-too-long problem and some workarounds, but I'm struggling to integrate a solution into my script. Any suggestions?
Thanks!
$DateStart = Get-Date
$Path = "E:\"
$PermittedOU1 = "OU=Groups,dc=chiba,dc=local"
$PermittedOU3 = "OU=System Accounts,OU=Accounts,dc=chiba,dc=local"
$PermittedACL1 = Get-ADGroup -Filter * -SearchBase $PermittedOU1
$PermittedACL3 = Get-ADUser -Filter * -SearchBase $PermittedOU3
$ObjectPathItem = Get-ChildItem -Path $Path -Recurse | Where-Object {$_.PsIsContainer} | ForEach-Object -Process { $_.FullName }
$howmany = 0
$Logfilename = "C:\Users\administrator\Documents\$(Get-Date -f yyyy-MM-dd-hh-mm).csv"
Add-Content $Logfilename "$DateStart`n"
$totalfolders = 0
$i = 0
ForEach ($Folder in $ObjectPathItem)
{
    $totalfolders++
}
ForEach ($Folder in $ObjectPathItem)
{
    $ObjectACL = Get-ACL -Path $Folder
    $i++
    $howmany = 0
    Write-Progress -Id 1 -Activity "Folder Recursion" -Status "Folders Traversed: " -PercentComplete (($i / $totalfolders) * 100)
    ForEach ($ACL in $ObjectACL.Access)
    {
        $ACLstring = $ACL.IdentityReference.Value
        $ACLstring = $ACLstring.Replace("CHIBA\","")
        if (($ACLstring -notin $PermittedACL1.Name) `
            -and ($ACLstring -notin $PermittedACL3.SamAccountName) `
            -and ($ACLstring -notin "NT AUTHORITY\SYSTEM") `
            -and ($ACLstring -notin "BUILTIN\Administrators") `
            -and ($ACLstring -notin "CREATOR OWNER"))
        {
            $newline = "`"$Folder`"" + "," + "$ACLstring"
            Add-Content $Logfilename "$newline"
            $howmany += 1
        }
        else {
            $howmany += 1
        }
    }
}
$DateEnd = Get-Date
Add-Content $Logfilename "`n`n$DateEnd"

One option that usually works is to create a mapped drive using New-PSDrive when the direct path fails. Something like:
Try {
    # -ErrorAction Stop turns the long-path failure into a terminating error the Catch block can see.
    # This also assumes $Folder is a DirectoryInfo object, not a plain string (i.e. drop the
    # ForEach-Object { $_.FullName } from the collection step).
    $ObjectACL = Get-ACL -Path $Folder -ErrorAction Stop
}
Catch {
    $SubPathLength = $Folder.FullName.Substring(0,200).LastIndexOf('\')
    $NewTempPath = $Folder.FullName.Substring(0,$SubPathLength)
    New-PSDrive -Name Temp4ACL -Provider FileSystem -Root $NewTempPath
    $ObjectACL = Get-ACL "Temp4ACL:$($Folder.FullName.Substring($SubPathLength,$Folder.FullName.Length-$SubPathLength))"
    # Remove the temporary drive so the next long path can recreate it
    Remove-PSDrive -Name Temp4ACL
}
That will find the last \ before the 200th character in the path, grab a substring of the full path up to the end of that folder's name, create a temp drive rooted there, then get the ACL from the temp drive plus the remaining path. So this path:
C:\Temp\Subfolder\Really Long Folder Name\Another Subfolder\ABCDEFGHIJKLMNOPQRSTUVWXYZ\We Are Really Pushing It Now\Im Running Out Of Folder Name Ideas\Hello My Name Is Inigo Montoya\You Killed My Father Prepare To Die\ReadMe.txt
Gets cut at the second to last backslash. I would end up getting the ACL from:
Temp4ACL:\You Killed My Father Prepare To Die\ReadMe.txt

An easy way is to use the "\\?\" prefix, which supports paths up to 32,767 characters.
$folder = "C:\MyFolder"
icacls "\\?\$folder"
https://msdn.microsoft.com/en-us/library/windows/desktop/aa364963(v=vs.85).aspx
In the ANSI version of this function, the name is limited to MAX_PATH characters. To extend this limit to 32,767 wide characters, call the Unicode version of the function (GetFullPathNameW), and prepend "\\?\" to the path.
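As a rough sketch of how this could be slotted into the question's folder loop (untested; icacls emits plain text, so its output is logged as-is rather than parsed into ACL objects):
foreach ($Folder in $ObjectPathItem) {
    # Prepend \\?\ so the Unicode file APIs accept paths longer than MAX_PATH
    $aclText = icacls "\\?\$Folder"
    Add-Content $Logfilename ($aclText -join "`n")
}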

Okay, this question is quite old, but for those coming here today like myself, I provide this information that I found through Google:
Microsoft TechNet Script Center lists a "File System Security PowerShell Module" which claims that, since version 3.0, it "leverages the AlphaFS (http://alphafs.codeplex.com) to work around the MAX_PATH limitation of 260 characters". At the time of this writing the module is at version 4.2.3.
The general idea of this module is described as "PowerShell only offers Get-Acl and Set-Acl but everything in between getting and setting the ACL is missing. This module closes the gap."
So without having tried this myself, I suppose it should help in solving the OP's problem.
The module is also featured in a post on the "Hey, Scripting Guy!" blog.
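If the module (distributed as NTFSSecurity) is installed, usage might look like the sketch below; this is an assumption based on the module's documented cmdlet names, not something I have run:
Import-Module NTFSSecurity
# Get-ChildItem2 and Get-NTFSAccess come from the module and are documented
# to work past the MAX_PATH limit, unlike the built-in equivalents
Get-ChildItem2 "E:\" -Recurse -Directory | Get-NTFSAccess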

Related

How can you replace multiple UNC paths of mapped drives via registry using PowerShell?

Our servers team has implemented DFS, but users across the company still have drives mapped using the server name(s) at various sites. I'd like to push out a PS script that updates a SINGLE registry value (per drive).
My goal is to look through each drive-letter key; if the key exists and the remote path starts with the old server name, replace that prefix with the DFS name \\domain.com\SITE\ plus the remainder of the path. This way users keep the same drive letters without having to "remap" their drives.
Using Denver office as an example...
$OldServer = "\\denvernas01\"
$NewServer = "\\domain.com\DEN\"
$DriveLetterArray = "A","B","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z"
foreach ($DriveLetter in $DriveLetterArray)
{
    $Drives = Get-ItemPropertyValue HKCU:\Network\$DriveLetter -Name RemotePath -ErrorAction SilentlyContinue
    $RemainingPath = $Drives.Replace($OldServer,"")
    foreach ($Drive in $Drives)
    {
        if ($Drive -like "*$OldServer*")
        {
            Set-ItemProperty HKCU:\Network\$DriveLetter -Name RemotePath -Value "$NewServer$RemainingPath"
        }
    }
}
EDIT
^^^This currently works, but only if the server name in RemotePath is all lower case, i.e. the String.Replace() call is case-sensitive. Any thoughts on how to define the $OldServer and $NewServer variables so it will work with case variations? e.g. Denvernas01, DENVERNAS01 (or anything in between)
I've come across a few threads discussing New-PSDrive, Get-WMIObject, etc, but I'd really like to just replace this one registry value. This would be a good "patch" that would take some stress off of our desktop support team. Trust me - I'll be advocating for GPO to push out common mapped drives once this is all over.
Any feedback is greatly appreciated. Thank you!
If anyone out there is interested, this is what I ended up with, and it worked like a charm. Thought I'd share...
$OldServer = '\\denvernas01'
$NewServer = '\\Domain.com\DEN'
$DriveLetterArray = "A","B","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z"
foreach ($DriveLetter in $DriveLetterArray) {
    $DrivePath = $null; $ConvertedPath = $null
    $DrivePath = Get-ItemPropertyValue -Path "HKCU:\Network\$DriveLetter" -Name "RemotePath" -ErrorAction SilentlyContinue
    if ($DrivePath -eq $null) { continue }
    # Replace old drive path
    if ($DrivePath -like "*${OldServer}*") {
        $ConvertedPath = $DrivePath -ireplace [regex]::Escape("$OldServer"), $NewServer
        $ConvertedPath
        Set-ItemProperty -Path "HKCU:\Network\$DriveLetter" -Name "RemotePath" -Value "${ConvertedPath}"
    } else {
        #Write-Host "no match"
        continue
    }
    #Write-Host ""
}
#Remove previous mountpoints
Get-ChildItem HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2\* | Where-Object Name -Match "##denvernas" | Remove-Item
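The key detail: .NET's String.Replace() does an ordinal, case-sensitive comparison, while PowerShell's -replace/-ireplace operators are regex-based and case-insensitive, which is also why [regex]::Escape is needed to neutralize the backslashes. A minimal illustration with a hypothetical path:
'\\DENVERNAS01\share\docs' -ireplace [regex]::Escape('\\denvernas01'), '\\Domain.com\DEN'
# -> \\Domain.com\DEN\share\docs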

PowerShell - Get desktop path for all users [OneDrive sync on some of them]

I am trying to search all users' desktops for a particular shortcut, but I'm having difficulty enumerating the desktop paths for the different users on the computer: some of them have OneDrive sync, so the standard path c:\Users\%user%\Desktop does not exist for them.
I have tried getting the path with GetFolderPath, which only returns the path for the current user:
[System.Environment]::GetFolderPath("Desktop")
So briefly the path scenarios are:
C:\users\username\Desktop
C:\users\username\One Drive - Company\Desktop
I would be glad if somebody has a hint how to find all paths in this mixed environment.
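One approach is to probe both candidate locations under each profile. A sketch, untested against OneDrive; the 'OneDrive*' folder pattern is an assumption based on the scenarios above and may need adjusting to your company's naming:
$desktops = foreach ($userProfile in Get-ChildItem 'C:\Users' -Directory) {
    # Candidate 1: the classic local desktop
    $candidates = @(Join-Path $userProfile.FullName 'Desktop')
    # Candidate 2: a desktop redirected into any OneDrive folder (name pattern is an assumption)
    $candidates += @(Get-ChildItem $userProfile.FullName -Directory -Filter 'OneDrive*' -ErrorAction SilentlyContinue |
        ForEach-Object { Join-Path $_.FullName 'Desktop' })
    $candidates | Where-Object { Test-Path $_ }
}
$desktops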
Here's my older training script. I don't know how (or whether) it works with OneDrive sync, as I have OneDrive disabled (or even uninstalled) because I found it extremely irritating…
Remove-Variable path -ErrorAction SilentlyContinue
Write-Verbose "--- Special Folders ---" -Verbose
$SpecialFolders = @{}
$names = [Environment+SpecialFolder]::GetNames( [Environment+SpecialFolder] )
ForEach ($name in $names) {
    # assign and then check
    if ( $path = [Environment]::GetFolderPath($name) ) {
        $SpecialFolders[$name] = $path
    } else {
        Write-Warning $name
        $SpecialFolders[$name] = ''
    }
}
$SpecialFolders.GetEnumerator() |
    Sort-Object -Property name #| Format-Table -AutoSize
"---"
###Pause
$ShellFolders = @{}
Write-Verbose "--- Shell Folders ---" -Verbose
[System.Enum]::GetValues([System.Environment+SpecialFolder]) |
    ForEach-Object {
        $ShellFolders[$_.ToString()] =
            ($_.value__, [System.Environment]::GetFolderPath($_))
    }
$ShellFolders.GetEnumerator() |
    Sort-Object -Property name # | Format-Table -AutoSize

How to use an array in a zip function using PowerShell?

I am still pretty new to scripting and "programming" in general. If any information is missing here, let me know.
This is my working zip function:
$folder = "C:\zipthis\"
$destinationFilePath = "C:\_archive\zipped"
function create-7zip{
param([string] $folder,
[String] $destinationFilePath)
write-host $folder $destinationFilePath
[string]$pathToZipExe = "C:\Program Files (x86)\7-Zip\7zG.exe";
[Array]$arguments = "a", "-tzip", "$destinationFilePath", "$folder";
& $pathToZipExe $arguments;
}
Get-ChildItem $folder | ? { $_.PSIsContainer} | % {
write-host $_.BaseName $_.Name;
$dest= [System.String]::Concat($destPath,$_.Name,".zip");
(create-7zip $_.FullName $dest)
}
create-7zip $folder $destinationFilePath
Now I want it to zip specific folders which I have already sorted out:
get-childitem "C:\zipme\" | where-Object {$_.name -eq "www" -or $_.name -eq "sql" -or $_.name -eq "services"}
This small snippet finds the three folders I need, called www, sql and services. But I didn't manage to plug this into my zip function so that exactly these folders are zipped and placed into C:\_archive\zipped.
Because a string was used instead of an array, it always looked for a single folder called wwwsqlservice, which doesn't exist. I tried to build an array using @(www,sql,services) but had no success, so what's the right way, if there is one?
It should be compatible with PowerShell 2.0; no PS 3.0 cmdlets or functions, please.
thanks in advance!
Here's a really simple example of what you want to do, removed from the context of your function. It assumes that your destination folders already exist (you can use Test-Path and New-Item to create them if they don't), and that you're using 7z.exe.
$directories = @("www","sql","services")
$archiveType = "-tzip"
foreach ($dir in $directories)
{
    # Use $dir to update the destination each loop to prevent overwrites!
    $sourceFilePath = "mySourcePath\$dir"
    $destinationFilePath = "myTargetPath\$dir"
    cmd /c "$pathToZipExe a $archiveType $destinationFilePath $sourceFilePath"
}
Overall it looks like you got pretty close to a solution, with some minor changes needed to support the foreach loop. If you're confident that create-7zip works fine for a single folder, you can substitute that for the cmd /c line above. Here's a listing of some handy example usages for 7zip on the command line.
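Alternatively, keeping the question's create-7zip function and staying PowerShell 2.0-compatible, the filtered folder list can be piped straight into it. A sketch under those assumptions (-contains is used because -in requires PowerShell 3.0):
$wanted = @("www","sql","services")
Get-ChildItem "C:\zipme\" |
    Where-Object { $_.PSIsContainer -and ($wanted -contains $_.Name) } |
    ForEach-Object { create-7zip $_.FullName (Join-Path "C:\_archive\zipped" ($_.Name + ".zip")) }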

How to retrieve a recursive directory and file list from PowerShell excluding some files and folders?

I want to write a PowerShell script that will recursively search a directory, but exclude specified files (for example, *.log, and myFile.txt), and also exclude specified directories, and their contents (for example, myDir and all files and folders below myDir).
I have been working with the Get-ChildItem CmdLet, and the Where-Object CmdLet, but I cannot seem to get this exact behavior.
I like Keith Hill's answer except it has a bug that prevents it from recursing past two levels. These commands manifest the bug:
New-Item level1/level2/level3/level4/foobar.txt -Force -ItemType file
cd level1
GetFiles . xyz | % { $_.fullname }
With Hill's original code you get this:
...\level1\level2
...\level1\level2\level3
Here is a corrected, and slightly refactored, version:
function GetFiles($path = $pwd, [string[]]$exclude)
{
    foreach ($item in Get-ChildItem $path)
    {
        if ($exclude | Where {$item -like $_}) { continue }
        $item
        if (Test-Path $item.FullName -PathType Container)
        {
            GetFiles $item.FullName $exclude
        }
    }
}
With that bug fix in place you get this corrected output:
...\level1\level2
...\level1\level2\level3
...\level1\level2\level3\level4
...\level1\level2\level3\level4\foobar.txt
I also like ajk's answer for conciseness though, as he points out, it is less efficient. The reason it is less efficient, by the way, is because Hill's algorithm stops traversing a subtree when it finds a prune target while ajk's continues. But ajk's answer also suffers from a flaw, one I call the ancestor trap. Consider a path such as this that includes the same path component (i.e. subdir2) twice:
\usr\testdir\subdir2\child\grandchild\subdir2\doc
Set your location somewhere in between, e.g. cd \usr\testdir\subdir2\child, then run ajk's algorithm to filter out the lower subdir2 and you will get no output at all, i.e. it filters out everything because of the presence of subdir2 higher in the path. This is a corner case, though, and not likely to be hit often, so I would not rule out ajk's solution due to this one issue.
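To see the ancestor trap concretely (hypothetical paths), every item under the current location contains the ancestor subdir2 in its FullName, so the regex filters out everything:
cd \usr\testdir\subdir2\child
Get-ChildItem -Recurse | Where-Object { $_.FullName -notmatch '\\subdir2($|\\)' }
# returns nothing: each FullName includes \usr\testdir\subdir2\ higher up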
Nonetheless, I offer here a third alternative, one that does not have either of the above two bugs. Here is the basic algorithm, complete with a convenience definition for the path or paths to prune--you need only modify $excludeList to your own set of targets to use it:
$excludeList = @("stuff","bin","obj*")
Get-ChildItem -Recurse | % {
    $pathParts = $_.FullName.Substring($pwd.Path.Length + 1).Split("\");
    if ( ! ($excludeList | where { $pathParts -like $_ } ) ) { $_ }
}
My algorithm is reasonably concise but, like ajk's, it is less efficient than Hill's (for the same reason: it does not stop traversing subtrees at prune targets). However, my code has an important advantage over Hill's--it can pipeline! It is therefore amenable to fit into a filter chain to make a custom version of Get-ChildItem while Hill's recursive algorithm, through no fault of its own, cannot. ajk's algorithm can be adapted to pipeline use as well, but specifying the item or items to exclude is not as clean, being embedded in a regular expression rather than a simple list of items that I have used.
I have packaged my tree pruning code into an enhanced version of Get-ChildItem. Aside from my rather unimaginative name--Get-EnhancedChildItem--I am excited about it and have included it in my open source Powershell library. It includes several other new capabilities besides tree pruning. Furthermore, the code is designed to be extensible: if you want to add a new filtering capability, it is straightforward to do. Essentially, Get-ChildItem is called first, and pipelined into each successive filter that you activate via command parameters. Thus something like this...
Get-EnhancedChildItem –Recurse –Force –Svn
–Exclude *.txt –ExcludeTree doc*,man -FullName -Verbose
... is converted internally into this:
Get-ChildItem | FilterExcludeTree | FilterSvn | FilterFullName
Each filter must conform to certain rules: accepting FileInfo and DirectoryInfo objects as inputs, generating the same as outputs, and using stdin and stdout so it may be inserted in a pipeline. Here is the same code refactored to fit these rules:
filter FilterExcludeTree()
{
    $target = $_
    Coalesce-Args $Path "." | % {
        $canonicalPath = (Get-Item $_).FullName
        if ($target.FullName.StartsWith($canonicalPath)) {
            $pathParts = $target.FullName.Substring($canonicalPath.Length + 1).Split("\");
            if ( ! ($excludeList | where { $pathParts -like $_ } ) ) { $target }
        }
    }
}
The only additional piece here is the Coalesce-Args function (found in this post by Keith Dahlby), which merely sends the current directory down the pipe in the event that the invocation did not specify any paths.
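For readers who do not want to chase the link, a minimal stand-in with the semantics described there (return the first non-empty argument); this is a sketch, not Dahlby's exact code:
function Coalesce-Args {
    # Emit the first argument that is neither $null nor empty
    foreach ($arg in $args) { if ($arg) { return $arg } }
}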
Because this answer is getting somewhat lengthy, rather than go into further detail about this filter, I refer the interested reader to my recently published article on Simple-Talk.com entitled Practical PowerShell: Pruning File Trees and Extending Cmdlets where I discuss Get-EnhancedChildItem at even greater length. One last thing I will mention, though, is another function in my open source library, New-FileTree, that lets you generate a dummy file tree for testing purposes so you can exercise any of the above algorithms. And when you are experimenting with any of these, I recommend piping to % { $_.fullname } as I did in the very first code fragment for more useful output to examine.
The Get-ChildItem cmdlet has an -Exclude parameter that is tempting to use but it doesn't work for filtering out entire directories from what I can tell. Try something like this:
function GetFiles($path = $pwd, [string[]]$exclude)
{
    foreach ($item in Get-ChildItem $path)
    {
        if ($exclude | Where {$item -like $_}) { continue }
        if (Test-Path $item.FullName -PathType Container)
        {
            $item
            GetFiles $item.FullName $exclude
        }
        else
        {
            $item
        }
    }
}
Here's another option, which is less efficient but more concise. It's how I generally handle this sort of problem:
Get-ChildItem -Recurse .\targetdir -Exclude *.log |
Where-Object { $_.FullName -notmatch '\\excludedir($|\\)' }
The '\\excludedir($|\\)' expression allows you to exclude the directory and its contents at the same time.
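A quick demonstration of what that pattern does and does not match, using made-up paths:
'C:\root\excludedir', 'C:\root\excludedir\sub\a.txt', 'C:\root\excludedir2\b.txt' |
    Where-Object { $_ -notmatch '\\excludedir($|\\)' }
# -> C:\root\excludedir2\b.txt   (similarly named directories like excludedir2 survive)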
Update: Please check the excellent answer from msorens for an edge case flaw with this approach, and a much more fleshed out solution overall.
Recently, I explored how to parameterize both the folder to scan and the location where the results of the recursive scan are stored. At the end, I also summarize the number of folders scanned and the number of files inside. Sharing it with the community in case it helps other developers.
##Script Starts
#read the folder to scan and the location where the report will be placed
$whichFolder = Read-Host -Prompt 'Which folder to scan?'
$whereToPlaceReport = Read-Host -Prompt 'Where to place the report'
$totalFolders = 1
$totalFiles = 0
Write-Host "Process started..."
#IMPORTANT: '?' is used as the field separator because a Windows file name cannot contain that character
#Get folder names into a variable for the ForEach loop
$DFSFolders = Get-ChildItem -Path $whichFolder | Where-Object {$_.PsIsContainer} | Select-Object Name, FullName
#Below logic for the main folder (was hard-coded to C:\Users\User\Desktop; now uses $whichFolder)
$mainFiles = Get-ChildItem -Path $whichFolder -File
("Folder Path" + "?" + "Folder Name" + "?" + "File Name" + "?" + "File Length") | Out-File "$whereToPlaceReport\Report.csv" -Append
#Loop through files directly inside the main directory
foreach ($file in $mainFiles)
{
    $totalFiles = $totalFiles + 1
    ($whichFolder + "?" + "Main Folder" + "?" + $file.Name + "?" + $file.Length) | Out-File "$whereToPlaceReport\Report.csv" -Append
}
foreach ($DFSfolder in $DFSfolders)
{
    #write the folder name at the beginning
    $totalFolders = $totalFolders + 1
    Write-Host " Reading folder $whichFolder\$($DFSfolder.Name)"
    #For each folder, recurse and record every file's name and length
    $files = Get-ChildItem -Path "$whichFolder\$($DFSfolder.Name)" -Recurse
    foreach ($file in $files)
    {
        $totalFiles = $totalFiles + 1
        ($DFSfolder.FullName + "?" + $DFSfolder.Name + "?" + $file.Name + "?" + $file.Length) | Out-File "$whereToPlaceReport\Report.csv" -Append
    }
}
# If running in the console, wait for input before closing.
if ($Host.Name -eq "ConsoleHost")
{
    Write-Host ""
    Write-Host ""
    Write-Host ""
    Write-Host " **Summary**" -ForegroundColor Red
    Write-Host " ------------" -ForegroundColor Red
    Write-Host " Total Folders Scanned = $totalFolders " -ForegroundColor Green
    Write-Host " Total Files Scanned = $totalFiles " -ForegroundColor Green
    Write-Host ""
    Write-Host ""
    Write-Host "I have done my job. Press any key to exit" -ForegroundColor White
    $Host.UI.RawUI.FlushInputBuffer() # Make sure buffered input doesn't "press a key" and skip the ReadKey().
    $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyUp") > $null
}
##Bat code to run the above PowerShell script
@ECHO OFF
SET ThisScriptsDirectory=%~dp0
SET PowerShellScriptPath=%ThisScriptsDirectory%MyPowerShellScript.ps1
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Bypass -File ""%PowerShellScriptPath%""' -Verb RunAs}"
A bit late, but try this one.
function Set-Files($Path) {
if(Test-Path $Path -PathType Leaf) {
# Do any logic on file
Write-Host $Path
return
}
if(Test-Path $path -PathType Container) {
# Do any logic on folder use exclude on get-childitem
# cycle again
Get-ChildItem -Path $path | foreach { Set-Files -Path $_.FullName }
}
}
# call
Set-Files -Path 'D:\myFolder'
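To wire in the exclusions from the question, the -Exclude parameter can be applied on the Get-ChildItem call inside the folder branch, so it takes effect at every level of the recursion. A sketch; the excluded names mirror the question's examples:
function Set-Files($Path) {
    if (Test-Path $Path -PathType Leaf) {
        # Do any logic on file
        Write-Host $Path
        return
    }
    if (Test-Path $Path -PathType Container) {
        # -Exclude prunes myDir (and everything below it) plus *.log at each level
        Get-ChildItem -Path $Path -Exclude 'myDir','*.log' |
            ForEach-Object { Set-Files -Path $_.FullName }
    }
}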
Commenting here as this seems to be the most popular answer on the subject of searching for files while excluding certain directories in PowerShell.
To avoid issues with post-filtering of results (i.e. avoiding permission issues etc.), I only needed to filter out top-level directories, and that is all this example does. It doesn't filter child directory names, but it could easily be made recursive to support that, if you were so inclined.
Quick breakdown of how the snippet works
$folders << Uses Get-Childitem to query the file system and perform folder exclusion
$file << The pattern of the file I am looking for
foreach << Iterates the $folders variable performing a recursive search using the Get-Childitem command
$folders = Get-ChildItem -Path C:\ -Directory -Name -Exclude Folder1,"Folder 2"
$file = "*filenametosearchfor*.extension"
foreach ($folder in $folders) {
Get-Childitem -Path "C:/$folder" -Recurse -Filter $file | ForEach-Object { Write-Output $_.FullName }
}

Using PowerShell to move a matching set of files with same name but different extensions to two different destinations

I would like to use PowerShell to move a matching set of files (one job file and one trigger file, both having the same name, just different extensions) from one directory to another. See the example below.
The source directory contains job1.zip, job1.trg, job2.zip, and job2.trg. I would like to take the matching pair job1.zip and job1.trg and move it to dest1folder, but only if that folder is empty; if not, move it to dest2folder. Then loop back and apply the same logic to job2.zip and job2.trg. One thing I also have to take into consideration is that the source directory may contain only job1.zip, waiting for job1.trg to finish transferring. I am a newbie to PowerShell and have blown hours trying to get this working with no success. Is it possible?
This is what I have so far. I get the files to move to each destination folder using IF logic, but it moves all of the files in the source directory.
$doirun = (Get-ChildItem "d:\ftproot\pstest\").Count
$filecount = (Get-ChildItem "d:\ftproot\ps2\").Count
if ($doirun -le 1) { exit }
$dir = Get-ChildItem "d:\ftproot\pstest\" | Where-Object {($_.extension -eq ".zip") -or ($_.extension -eq ".trg")}
foreach ($file in $dir)
{
    if ($filecount -le 2) { Move-Item "d:\ftproot\pstest\$file" "d:\ftproot\ps2\" }
    else { Move-Item "d:\ftproot\pstest\$file" "d:\ftproot\ps3\" }
}
Not tested extensively, but I believe this should work:
$jobs = gci d:\ftproot\pstest\* -include *.zip,*.trg |
    select -expand BaseName | sort -unique
$jobs | foreach-object {
    # Only act when both halves of the pair have arrived
    if ((test-path "d:\ftproot\pstest\$_.zip") -and (test-path "d:\ftproot\pstest\$_.trg")) {
        if (test-path d:\ftproot\pstest\ps2\*) {
            # ps2 is not empty, so send the pair to ps3
            move-item "d:\ftproot\pstest\$_.zip" d:\ftproot\pstest\ps3
            move-item "d:\ftproot\pstest\$_.trg" d:\ftproot\pstest\ps3
        }
        else {
            move-item "d:\ftproot\pstest\$_.zip" d:\ftproot\pstest\ps2
            move-item "d:\ftproot\pstest\$_.trg" d:\ftproot\pstest\ps2
        }
    }
}