I apologize in advance for the long post. I have spent a significant amount of time trying to find an answer or piece together a solution to this problem.
It's a simple request: add a user/group with 2 sets of permissions to your entire DFS environment, applying one set to folders and sub-folders and the other to files only.
Seems easy enough; however, in the environment I'm trying to manage we have thousands of folder paths greater than 260 characters deep, and any use of dir -Recurse or Get-ChildItem will hit the "PathTooLong" error. Every example solution for this problem has used a variation of the above or relies on Get-ChildItem. This fails in most real-world situations, as I believe many of us IT admins are faced with long paths due to the nature of DFS use.
The current attempt:
Currently I'm using a custom module, "NTFSSecurity", which is fantastic for applying NTFS permissions. It works great!
It can be found here: https://ntfssecurity.codeplex.com/
A tutorial from here: https://blogs.technet.microsoft.com/fieldcoding/2014/12/05/ntfssecurity-tutorial-1-getting-adding-and-removing-permissions/
The problem with the above tutorial, and every other example I've been able to find, is that they reference commands such as:
dir -Recurse | Get-NTFSAccess -Account $account
This will fail in the real world of super long file paths.
The "PathTooLong" error workaround:
My workaround currently consists of using Robocopy to export the folder paths to a text file. I found this as a recommendation from someone dealing with a similar problem. Robocopy does not error on "PathTooLong" issues and is perfect for this exercise. I then run commands against the text file containing all of the paths I need to modify.
The command for the Robocopy is this:
robocopy '<insert source path here>' NULL /NJH /E /COPYALL /XF *.* | Out-File -FilePath '<path to fileout.txt>'
This will create a text file while copying only folder structure and permissions. Excellent!
You will then have to clean up the text file to remove the extra characters, which I do with:
$filepath = '<path>\DFS_Folder_Structure.txt'
$structure = Get-Content $filepath
$structure -replace ' New Dir 0 ' | Out-File -FilePath \\<path_you_want_file>\DFS_Folder_Structure2.txt
I also reversed the contents of the text file so it lists the furthest child objects (folders) first and works back toward the parents. I thought this might make it easier to identify a parent folder or apply some other recursive logic which I haven't been able to figure out.
To reverse text from bottom to top use this command here:
$x = Get-Content -Path 'C:\temp_dfs\DFS_Folder_Structure2.txt'; Set-Content -Path 'C:\temp_dfs\Reversed_data.txt' -Value ($x[($x.Length-1)..0])
This script currently only applies to paths with inheritance turned off, or to child objects with inheritance turned off. This is based on the NTFSSecurity module command Get-NTFSInheritance, which returns results for AccessInheritance and AuditInheritance. Access indicates whether the folder is inheriting from a parent above; Audit indicates whether the folder is passing inheritance down to child objects.
There are 4 possibilities:
AccessInheritance True AuditInheritance True
AccessInheritance True AuditInheritance False
AccessInheritance False AuditInheritance True
AccessInheritance False AuditInheritance False
(*Special note: I have seen all 4 show up in the DFS structure I'm dealing with.)
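A quick way to check which of these four cases a given folder falls into (the UNC path below is just a placeholder) is something like the following, which relies on the same properties the script further down uses:
Get-NTFSInheritance -Path '\\domain\dfs\share\SomeFolder' |
    Select-Object AccessInheritanceEnabled, AuditInheritanceEnabled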
Script to Set Permissions based on file path contained in text file:
#Get File content to evaluate
$path = Get-Content 'C:\temp_dfs\Reversed_data.txt'
$ADaccount = '<insert fully qualified domain\user or group etc.>'
Foreach ($line in $path)
{
    #Get-NTFSAccess -Path $line -Account $ADaccount | Remove-NTFSAccess
    #This command will find the access of an account and then remove it.
    #It has been omitted but included in case needed later.
    $result = Get-NTFSInheritance -Path $line

    If ($result.AccessInheritanceEnabled -Match "False" -and $result.AuditInheritanceEnabled -Match "False")
    {
        Add-NTFSAccess -Path $line -Account $ADaccount -AccessRights Traverse,ExecuteFile,ListDirectory,ReadData,ReadAttributes,ReadExtendedAttributes,ReadPermissions -AppliesTo ThisFolderAndSubfolders
        Add-NTFSAccess -Path $line -Account $ADaccount -AccessRights ReadAttributes,ReadExtendedAttributes,ReadPermissions -AppliesTo FilesOnly
    }
    If ($result.AccessInheritanceEnabled -Match "False" -and $result.AuditInheritanceEnabled -Match "True")
    {
        Add-NTFSAccess -Path $line -Account $ADaccount -AccessRights Traverse,ExecuteFile,ListDirectory,ReadData,ReadAttributes,ReadExtendedAttributes,ReadPermissions -AppliesTo ThisFolderAndSubfolders
        Add-NTFSAccess -Path $line -Account $ADaccount -AccessRights ReadAttributes,ReadExtendedAttributes,ReadPermissions -AppliesTo FilesOnly
    }
    If ($result.AccessInheritanceEnabled -Match "True" -and $result.AuditInheritanceEnabled -Match "False")
    {
        continue
    }
    If ($result.AccessInheritanceEnabled -Match "True" -and $result.AuditInheritanceEnabled -Match "True")
    {
        continue
    }
}
This script will apply permissions for the specified User/Group account and set permissions for Folder and Sub-folders and then add another set of permissions to Files only.
Now this current fix works great except it only touches folders with inheritance turned off. This means you'd need to run this script and then set permissions on the "main parent folder". This is completely doable, may be the best method to avoid duplicate permission entries, and is the current state of my solution.
If you add criteria to the bottom sections where AccessInheritanceEnabled = True and AuditInheritanceEnabled = True, you will get double entries, because you're applying permissions both to the parent, which pushes its permissions down to its child objects, and explicitly to the child objects themselves. This is because the text file contains both parent and child paths, and I haven't figured out a way to address that. This isn't horrible, but my OCD doesn't like having duplicate permissions added if it can be avoided.
The real question:
What I'd really like to do is somehow identify parent folders, compare them to parents further up the tree, see whether they are inheriting permissions, and only apply the permission set to the highest parent in a specific chain. My mind wants to explode thinking about how you would compare parents and find the "highest parent".
Again, the problem is that any time you -Recurse the folder structure it will fail due to "PathTooLong" issues, so the logic needs to be applied to the paths in the text file. I've seen a bit mentioned about Split-Path, but I don't really understand how it's applied or how you could compare one path to another until you've identified a parent path.
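As a rough illustration of how Split-Path could be applied here, below is a minimal, untested sketch. It assumes Reversed_data.txt holds one cleaned folder path per line and that $sourceRoot (the '\\domain\dfs\share' value is purely a placeholder) is the root that was fed to Robocopy; it drops any path whose ancestor already appears in the list, leaving only the highest parent in each chain:
$sourceRoot = '\\domain\dfs\share'   # placeholder: the root you fed to Robocopy
$paths  = Get-Content 'C:\temp_dfs\Reversed_data.txt'
$lookup = [System.Collections.Generic.HashSet[string]]::new([string[]]$paths, [System.StringComparer]::OrdinalIgnoreCase)

$topLevel = foreach ($p in $paths) {
    $isChild = $false
    $parent  = Split-Path -Path $p -Parent
    # walk up toward the Robocopy root, checking whether any ancestor is also in the list
    while ($parent -and $parent.Length -gt $sourceRoot.Length) {
        if ($lookup.Contains($parent)) { $isChild = $true; break }
        $parent = Split-Path -Path $parent -Parent
    }
    if (-not $isChild) { $p }   # keep only paths with no ancestor in the list
}
$topLevel | Set-Content 'C:\temp_dfs\TopLevel_Paths.txt'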
Thank-you for taking the time to read this long post and question. If you're still awake now, I'm open to any suggestions. lol.
The NTFSSecurity module is indeed fantastic.
I used it to make a script that can export the NTFS security of a UNC path and its subfolders to a readable Excel file.
It can be found on:
https://github.com/tgoetheyn/Export-NtfsSecurity
I use it frequently and haven't had any problems with long filenames.
Hope you like it.
PS: If you add permissions with NTFSSecurity, don't forget to include the "Synchronize" permission. If it's not included, strange things can happen.
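For example, a minimal sketch of what that looks like with Add-NTFSAccess (the path and group are placeholders, and the rights list simply mirrors the read-style rights used earlier in this thread plus Synchronize):
Add-NTFSAccess -Path '\\domain\dfs\share' -Account 'DOMAIN\SomeGroup' `
    -AccessRights ReadAttributes,ReadExtendedAttributes,ReadPermissions,Synchronize `
    -AppliesTo ThisFolderAndSubfolders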
Related
I am a newbie to PowerShell scripting and I went through a lot of articles here but was not able to get much help. Can someone help me out on how to put files in specific folders?
file name example:
2008_11_chan_3748_NB001052_031_SIGNED.pdf
put it in folders:
2008/11/NB001052-031/....
if it ends with "draft", e.g.
2008_11_chan_3748_NB001052_031_Draft.pdf
put it in a separate draft folder as below
2008/11/NB001052-031/Draft
Any help is really appreciated.
Thanks in advance.
Push-Location c:\2009
Get-ChildItem -File -Filter *_*_*_*_*_*.pdf |
    Move-Item -Destination {
        $tokens = $_.Name -split '_'
        $subdirNames = $tokens[0,1] + ($tokens[4,5] -join '-')
        if ($tokens[-1] -like 'Draft.*') { $subdirNames += 'Draft' }
        (New-Item -Force -Type Directory ($subdirNames -join '\')).FullName
    } -WhatIf
Pop-Location
Note: The -WhatIf common parameter in the command above previews the move operation, but the target subfolders are created right away in this case.
Remove -WhatIf once you're sure the operation will do what you want.
(Partial) explanation:
The Get-ChildItem call gets those files in the current directory whose name matches wildcard pattern *_*_*_*_*_*.pdf.
Move-Item -Destination uses a delay-bind script block ({ ... }) to dynamically determine the target path based on the current input object, $_, which is of type System.IO.FileInfo.
The code inside the script block uses the -split operator to split the file name into tokens by _, extracts the tokens of interest (appending Draft when appropriate), joins them with \ to form the target directory path, and then creates / returns that subdirectory; note that -Force creates the directory on demand and quietly returns an existing directory.
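For illustration, here is how the sample filename from the question tokenizes with that logic (purely a console demonstration):
$tokens = '2008_11_chan_3748_NB001052_031_SIGNED.pdf' -split '_'
$tokens[0,1] + ($tokens[4,5] -join '-')   # -> 2008, 11, NB001052-031
# joined with '\' this becomes the target subfolder 2008\11\NB001052-031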
Looking for some Powershell help with a copying challenge.
I need to copy all MS Office files from a fairly large NAS (over 4 million of them, a little over 5 TB) to another drive, retaining the existing folder structure where a file is copied.
I have a text file of all the common Office file types (about 40 of them) - extns.txt
At this stage, being a good StackExchanger, I'd post the script I've got so far, but I've spent the best part of a day on this and, not only is what I've got embarrassingly awful, I suspect that even the basic algorithm is wrong.
I started to gci the entire tree on the old NAS, once for each file type
Then I thought it would be better to traverse once and compare every file to the list of valid types.
Then I got into a complete mess about rebuilding the folder structure. I started by splitting on '\' and iterating through the path then wasted an hour of searching because I thought I remembered reading about a simple way to duplicate a path if it doesn't exist.
Another alternative is that I dump out a 4-million-line text file of all the files (with full paths) I want to copy (this is easy, as I imported the entire structure into SQL Server to analyse what was there) and use that as a list of sources.
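A minimal sketch of that list-driven variant, assuming the text file holds full source paths and using placeholder roots and a placeholder list file name, might look something like this:
$sourceRoot = 'D:\Source'    # placeholder for the NAS root
$destRoot   = 'E:\Dest'      # placeholder for the target drive

Get-Content 'D:\files_to_copy.txt' | ForEach-Object {
    # rebuild the file's relative path under the destination root
    $relative = $_.Substring($sourceRoot.Length).TrimStart('\')
    $target   = Join-Path $destRoot $relative

    # -Force quietly creates (or returns) the missing directory chain
    New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force | Out-Null
    Copy-Item -LiteralPath $_ -Destination $target
}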
I'm not expecting a 'please write the codez for me' answer but some pointers/thoughts on the best way to approach this would be appreciated.
I'm not sure if this is the best approach, but the below script is, at the least, a passable solution.
$sourceRootPath = "D:\Source"
$DestFolderPath = "E:\Dest"
$extensions = Get-Content "D:\extns.txt"
# Prefix "*." to items in $extensions if it doesn't already have it
$extensions = $extensions -replace "^\*.|^","*."
$copiedItemsList = New-Object System.Collections.ArrayList
foreach ( $ext in $extensions ) {
$copiedItems = Copy-Item -Path $sourceRootPath -Filter $ext -Destination $DestFolderPath -Container -Recurse -PassThru
$copiedItems | % { $copiedItemsList.Add($_) | Out-Null }
}
$copiedItemsList = $copiedItemsList | select -Unique
# Remove Empty 'Deletable' folders that get created while maintaining the folder structure with Copy-Item cmdlet's Container switch
While ( $DeletableFolders = $copiedItemsList | ? { ((Test-Path $_) -and $_.PSIsContainer -eq $true -and ((gci $_ | select -first 1).Count -eq 0)) } ) {
$DeletableFolders | Remove-Item -Confirm:$false
}
The Copy-Item's -Container switch is going to preserve the folder structure for us. However, we may encounter empty folders with this approach.
So, I'm using an arraylist named $copiedItemsList to add the copied objects into, which I will later use to determine empty 'Deletable' folders which are then removed at the end of the script.
Hope this helps!
So I have been tasked with writing a script that will move files from one folder to another folder, which is easy enough. The problem I am having is that the files are for accounts, so there will be a file called DEA05292020.pdf and another file called TENSJ05292020, and each file needs to go to a specific folder (e.g. the DEA05292020.pdf file needs to be moved to a folder called DEA and the TENSJ05292020 file will move to the TENSJ folder). There are over a hundred different accounts that have their own specific folder. The files all start off in our Recon folder and need to be moved at the end of each month to their respective account's folder. So my question is how I could go about creating a PowerShell script to make that happen. I am very new to PowerShell and have been studying "Learn PowerShell in a Month of Lunches" and have a basic grasp of it. So what I have so far is very simple, where I can copy the file over to the new folder:
copy-item -path "\\Sageshare\share\Reconciliation\PDF Recon Center\DEA RECON 05292020" -destination "\\Sageshare\share\Account Rec. Sheets\Seperate Accounts\DEA"
This works, but I need a lot more automation with regard to separating all the different account names in the PDF Recon Center folder. How do I make a script that can filter on the account name (i.e. DEA) and also pull the month and year from the name of the file (i.e. 052020 pulled out of the 05292020 part of the filename)?
Thanks!
If #Lee_Dailey wants to write the code and post it here, I'll delete my answer. He solved the problem, I just code monkeyed it.
Please don't test on everything at once; run it in batches so you can monitor its behavior and not mess up your environment. It moves files in ways you may not want, i.e. if there is a folder named "a" it'll move everything that matches that folder into it. If you want to prevent this, you can add a pre-scan that checks whether a more closely matching folder exists before it actually creates the folder itself. Pretty sure it does everything you want, however, in the simplest way to understand. :)
$names = $(gci -af).Name |
    ForEach-Object {
        if (-not ($_.Contains(".git"))) {
            $_
        }
    }

if ( $null -eq $names ) {
    Write-Host "No files to move!"
    Start-Sleep 5
    Exit
}

$removedNames = $names |
    ForEach-Object {
        $_ = $_.Substring(0, $_.IndexOf('.')) # Remove extension
        $_ -replace '[^a-zA-Z-]',''           # Regex removes numbers
    }

$removedNames = $removedNames |
    Get-Unique # Get unique folder names

$names |
    ForEach-Object {
        $name = $_
        $removedNames |
            ForEach-Object {
                if ($name.Contains($_)) # If it matches a name
                {
                    if (-not (Test-Path ".\$_")) { # If it doesn't see the folder
                        New-Item -Path ".\" `
                                 -Name "$_" `
                                 -ItemType "directory"
                    }
                    Move-Item -Path ".\$name" `
                              -Destination ".\$_" # Move file to folder
                }
            }
    }
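As a complement, here is a hedged sketch of pulling the account prefix and the month/year out of a name like DEA05292020.pdf with a regex; it assumes the date always appears as MMDDYYYY immediately after the letters:
$fileName = 'DEA05292020.pdf'   # example name from the question
if ($fileName -match '^(?<account>[A-Za-z]+)(?<month>\d{2})(?<day>\d{2})(?<year>\d{4})') {
    $account   = $Matches['account']                   # DEA
    $monthYear = $Matches['month'] + $Matches['year']  # 052020
}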
I need to search for a string and then replace it with another in multiple files. Sounds easy, but the hard part is that it's multiple files in multiple network locations. I've tried connecting to all of the locations at once with VS Code and then using the built-in search and replace function. This almost works, except on big searches it seems to hang.
I'm now looking for another, more stable way to do this. Anybody got any ideas? I thought PowerShell could be a good candidate, but unfortunately I'm not that used to working with PowerShell.
I found this guide and it's a bit like what I want, except I need to do it on multiple files at multiple locations at once.
https://mcpmag.com/articles/2018/08/08/replace-text-with-powershell.aspx
I would settle for running one script for each location since there are only < 20 locations to scan. But it needs to include subfolders.
Any tips are appreciated, thanks! :)
Edit 1:
The folder structure differs from location to location so it's hard to say exactly how it looks, but no location has a folder structure deeper than 15 levels. The text I'm replacing consists of certificate thumbprints stored in .config files. The files are between 100 and 1000 characters long and the thumbprints I'm replacing look something like this: d2e8c58e5b34021671f2121483572f03f54ab9ae
This assumes that the different network locations are in trusted domains or are at least part of the WinRM TrustedHosts list. PowerShell Remoting will also need to be enabled on all computers involved; run Enable-PSRemoting -Force (in an elevated PowerShell session) to enable it.
$command = { Get-ChildItem -Path C:\Test\ -Include *.config -Recurse | ForEach-Object { $configContent = Get-Content -Path $_.FullName -Raw; $configContent.Replace("Old Value", "New Value") | Out-File -FilePath $_.FullName -Force } }
Invoke-Command -ComputerName "TestServer1", "TestServer2", "etc..." -ScriptBlock $command
If you are not part of the domain but have a domain/server login, you will need to use the -Credential parameter on the Invoke-Command cmdlet. This will basically find all files that have the .config extension in any subfolder of the path, get the current content of each .config file, replace your value, and finally overwrite the existing config file. WATCH OUT THOUGH: this will touch EVERY .config file in that path. If a file doesn't contain the string, it will simply be rewritten unchanged.
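For example, a minimal sketch of the credential variant (the server names are placeholders):
$cred = Get-Credential    # prompts for the domain/server login
Invoke-Command -ComputerName "TestServer1", "TestServer2" -Credential $cred -ScriptBlock $command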
Without seeing an example of the folder structures and files, it's quite hard to give a thorough answer. However, I would probably build a series of ForEach segments. For example:
ForEach ($Server in $Servers)
{
    ForEach ($File in $Files)
    {
        Select-String -Path $File -Pattern "$ExampleString"
    }
}
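Note that Select-String on its own only reports matches; for the actual replacement you would still rewrite each file, roughly along these lines (the UNC paths and the new thumbprint are placeholders):
$locations = '\\server1\share\configs', '\\server2\share\configs'   # placeholder locations
$oldThumb  = 'd2e8c58e5b34021671f2121483572f03f54ab9ae'             # thumbprint from the question
$newThumb  = '<new thumbprint here>'

foreach ($location in $locations) {
    Get-ChildItem -Path $location -Filter *.config -Recurse -File | ForEach-Object {
        (Get-Content -Path $_.FullName -Raw) -replace $oldThumb, $newThumb |
            Set-Content -Path $_.FullName
    }
}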
I have two directories:
C:\G\admin\less
C:\G\user\less
Inside of those directories I have multiple less and css files. I know all the names of the files, so I would like to hardcode a list of them into the script. Perhaps this could be in an array or something, but my knowledge of PowerShell is not even enough to know if there are arrays in the scripting language.
C:\G\admin\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
C:\G\user\less
html-light.less
html-light.css
html-dark.less
html-dairk.css
do-not-track-me.less
Is there a way I can use PowerShell to check each of these files (that I want to hardcode in my program) one by one and copy the last modified file from its directory to the other directory so that both directories will contain the same latest versions of these files?
Note that I would need to evaluate the predefined list of files one by one. Comparing modified date in one directory with the other and copying over as needed.
Again, assume that this isn't the best solution or approach.
This solution assumes the following:
- when the LastWriteTime on one side is newer than the other, it copies the file to the other folder.
- I'm not doing path validation out of laziness, but if you want it with path validation, just ask.
- I'm assuming that all the files in those folders must be tracked; otherwise, read the comments in the code.
- I suggest you back up your folders before you run the script.
#if there are files you don't want to track in those folders (for example, you don't want to track txt files)
#just write $userFiles=dir C:\G\user\less\ -Exclude "*.txt"
#if you don't want to track txt files but one of them should still be tracked along with the other file formats:
#$userFiles=dir C:\G\user\less\ -Exclude "*.txt" -Include "C:\G\user\less\Txtadditionaltotrack.txt"
$userFiles=dir C:\G\user\less\
$adminfiles=dir C:\G\admin\less\
foreach($userfile in $userFiles)
{
$exactadminfile= $adminfiles | ? {$_.Name -eq $userfile.Name} |Select -First 1
#my suggestion is to validate that it actually got the file.
#For now, out of laziness, I will not call Test-Path to validate that it got the file.
#I'm assuming both directories are exact copies of each other, so it will find the file.
if($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
{
Write-Verbose "Copying $exactadminfile.FullName to $userfile.FullName "
Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
}
else
{
Write-Verbose "Copying $userfile.FullName to $exactadminfile.FullName "
Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
}
}
You can improve it: the way this code is written, it always copies a file from one directory to the other. Inside the else you can add a check so that when the LastWriteTime is equal on both sides, nothing is copied (see the sketch just below).
You can improve it in many other ways as well. I hope you get the idea.
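A minimal sketch of that tweak, reusing the variables from the loop above (only the copy branches change; equal timestamps now fall through without copying anything):
if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
{
    Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
}
elseif ($exactadminfile.LastWriteTime -lt $userfile.LastWriteTime)
{
    Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
}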
Below is the modification made to the code so that it can achieve this requirement.
PLEASE READ THE COMMENTS IN THE CODE.
NOTE THAT I'M NOT FOLLOWING BEST PRACTICES (avoiding unexpected errors, naming all variables correctly, ...).
#to make it more dynamic, you can save all the file names,
#including extensions, in one file on separate lines.
#For example on path C:\FilesToWatch\watcher.txt
#$filestowatch=get-content C:\FilesToWatch\watcher.txt
$filestowatch="felicio.txt","marcos.txt"
$userFiles=dir C:\G\user\less\
$adminfiles=dir C:\G\admin\less\
#Optionally, instead of using the if approach below, you can load the lists like this:
#$userFiles=dir C:\G\user\less\ |? {$filestowatch -contains $_.Name}
#$adminfiles=dir C:\G\admin\less\|? {$filestowatch -contains $_.Name}
#Loaded in the above manner, the first if statement in the code below can be removed because
#we make sure that $userFiles and $adminfiles only contain the correct files to monitor
foreach($userfile in $userFiles)
{
if($filestowatch -contains $userfile.Name)
{
$exactadminfile= $adminfiles | ? {$_.Name -eq $userfile.Name} |Select -First 1
#my suggestion is to validate that it actually got the file.
#For now, out of laziness, I will not call Test-Path to validate that it got the file.
#I'm assuming both directories are exact copies of each other, so it will find the file.
if($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
{
Write-Verbose "Copying $exactadminfile.FullName to $userfile.FullName "
Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
}
else
{
Write-Verbose "Copying $userfile.FullName to $exactadminfile.FullName "
Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
}
}
}
I think what you need is symbolic links, aka symlinks.
With symlinks, you can define files and folders that will always be in sync; the target file is updated automatically when the original is modified.
To create a symbolic link, enter the following in the console/command prompt:
mklink /[command] [link path] [file or folder path]
Mklink can create several types of links, according to these commands:
/D – creates a soft symbolic link, which is similar to a standard folder or file shortcut in Windows. This is the default option, and mklink will use it if you do not enter a command.
/H – creates a hard link to a file.
/J – creates a directory junction (often described as a hard link to a folder).
The syntax is simple. Choose your option, define the path you want for the symlink, and finally the path of the original file/folder.
For example, imagine I'm developing a new project and I want to share it with my client via a Dropbox shared folder. I don't want to move my whole workspace to Dropbox, I just want to share that specific folder with them:
mklink /J C:\Dropbox\clients_shared_folders\project_x C:\my_dev_rootfolder\project_x
Note that the first path is the symbolic folder I want to create, while the second path is the existing directory.
In your case, I'll assume you're working in the admin folder and want to generate a synced copy in the user folder:
mklink /J C:\G\user\less C:\G\admin\less
Here's a nice article for more info:
http://www.howtogeek.com/howto/16226/complete-guide-to-symbolic-links-symlinks-on-windows-or-linux/