We're trying to compare NTFS permissions for files or folders using the SDDL attribute. The only thing we're interested in is whether the ACLs are equal or not, by using the SDDL and not other methods like AccessToString or just comparing two plain ACL objects, because we had issues in the past with the standard way of doing this.
We've now run into an issue where File1 and File2 have exactly the same permissions when checking the Advanced Permissions tab in Windows. However, the SDDL says they're not equal, even though we strip the Owner O: part from the SDDL string as indicated here, since the owner doesn't interest us.
The code:
Function Test-ACLequal {
Param (
$Source,
$Target
)
$CompParams = @{
ReferenceObject = Get-Acl -LiteralPath $Source
PassThru = $True
}
$CompParams.DifferenceObject = Get-Acl -LiteralPath $Target
$AccessParams = @{
ReferenceObject = ($CompParams.ReferenceObject.sddl -split 'G:', 2 | Select -Last 1)
DifferenceObject = ($CompParams.DifferenceObject.sddl -split 'G:', 2 | Select -Last 1)
PassThru = $True
}
if (Compare-Object @AccessParams) {
Write-Verbose 'Test-ACLequalHC: Not equal'
$false
}
else {
Write-Verbose 'Test-ACLequalHC: Equal'
$True
}
}
Test-ACLequal -Source $File1 -Target $File2
You can clearly see there is a difference between both files:
$AccessParams.ReferenceObject
DUD:(A;ID;FA;;;BA)(A;ID;0x1200a9;;;S-1-5-21-1078081533-261478967-839522115-243052)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-280880)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696733)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696745)
$AccessParams.DifferenceObject
DUD:AI(A;ID;FA;;;BA)(A;ID;0x1200a9;;;S-1-5-21-1078081533-261478967-839522115-243052)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-280880)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696733)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696745)
Is there a way to compare files by using the SDDL without running into this issue?
Does using .Equals work for you here?
$sourceAcl = Get-Acl $source
$targetAcl = Get-Acl $target
if ($sourceAcl.sddl.Equals($targetAcl.sddl)) {
# Do something
....
}
This includes the owner, however. In your example where you're removing it, you're also converting the object to a string, so using Compare-Object isn't really necessary. I'm also not sure how safe the split you're using is. You could also do:
$sourceAcl = Get-Acl $source
$targetAcl = Get-Acl $target
$s = $sourceAcl.sddl -replace "^O:[^:]+:",""
$t = $targetAcl.sddl -replace "^O:[^:]+:",""
if ($s -eq $t) {
# Do something
....
}
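If you want to avoid the string surgery altogether, the security descriptor object returned by Get-Acl can also emit an SDDL string that only covers the DACL. A rough sketch (this keeps owner and group out of the string, but note it will not mask a difference in DACL control flags such as the AI in your output, which appears to be what actually differs between your two files):
$sourceAcl = Get-Acl $source
$targetAcl = Get-Acl $target
# 'Access' restricts the SDDL to the DACL, so the O: and G: parts never appear
$s = $sourceAcl.GetSecurityDescriptorSddlForm('Access')
$t = $targetAcl.GetSecurityDescriptorSddlForm('Access')
if ($s -eq $t) {
    # Do something
}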
Related
I am trying to create a reusable log function for a project I am working on in PowerShell. I am a novice with PowerShell, so I am having problems figuring out why my function is not producing the expected results. I am attempting to create a function that sends NTFS/ACL details to a log file. This function will be incorporated into a larger script that will change some NTFS/ACL/ACE folder rights. I have excluded some of the other code (changing rights) for simplification.
Below is my current stripped-down code. When it runs, it creates the log file but the file is empty. If I move the line of code that creates the log inside the log function, it creates the log file with data, but it is not formatted correctly - it writes the heading (object attribute names) on one line, then the data, then a new line with the heading, then the data again. I want it to write the heading once, then a line of data, line of data, and so on. Before I created a function for this, it worked as expected. I am a novice at PowerShell, so I may not understand how to pass info in and out of the function. My Code:
#variables
$rootDir = "\\server1\share1"
$logDir = "c:\temp"
$date = (Get-Date -Format "MMddyyyy-HHmm")
$logData = @()
#My Logging Function
Function CreateLog {
#create an object to store attributes
$logObj = New-Object -TypeName psobject -Property @{
FolderPath = $Folder.Fullname
IdentityReference = $ACL.IdentityReference.ToString()
folder = $Folder[0]
FileSystemRights = $ACL.FileSystemRights.ToString()
InheritanceFlags = $ACL.InheritanceFlags.ToString()
}
$Folders=(Get-ChildItem -Path $rootDir -Directory -Recurse)
foreach ($Folder in $Folders){
$ACLs = get-acl $Folder.fullname | ForEach-Object {$_.Access}
Foreach ($ACL in $ACLs){
CreateLog
if($ACL.FileSystemRights -eq "FullControl"){
Write-Host "DO SOMETHING"
}
}
$logData | select folder,IdentityReference,FileSystemRights,InheritanceFlags,FolderPath | Format-Table -Wrap | Out-File $logDir\testlog-$date.log -Append
I assume you're looking for something like this:
#variables
$rootDir = "\\server1\share1"
$logDir = "c:\temp"
$date = Get-Date -Format "MMddyyyy-HHmm"
#My Logging Function
Function CreateLog {
param(
[Parameter(Mandatory)]
[System.Security.AccessControl.FileSystemAccessRule] $AccessRule,
[Parameter(Mandatory)]
[string] $Folder
)
[pscustomobject]@{
FolderPath = $Folder
IdentityReference = $AccessRule.IdentityReference
FileSystemRights = $AccessRule.FileSystemRights
InheritanceFlags = $AccessRule.InheritanceFlags
}
}
Get-ChildItem -Path $rootDir -Directory -Recurse | ForEach-Object {
foreach($rule in ($_ | Get-Acl).Access) {
CreateLog -AccessRule $rule -Folder $_.FullName
if($rule.FileSystemRights.HasFlag([Security.AccessControl.FileSystemRights]::FullControl)) {
# Do something here
}
}
} | Export-Csv $logDir\testlog-$date.log -NoTypeInformation
Your current code has a few syntax errors (missing }), and your function seems to assign the objects to $logObj, but $logObj is never output from it, hence producing no output.
Your $logData array is defined but never used, and, that aside, it is not needed at all in this case. Your function should, ideally, take the Access Rules and Path as arguments; see Functions with Parameters to learn more.
Format-Table, as well as the other Format-* cmdlets, should be used primarily for console display and should not be used for exporting data. In this case you should use Export-Csv; this way your data preserves its structure and can easily be imported back, filtered, and sorted.
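Because the export is a proper CSV, it can also be read back and queried later. A small sketch reusing the $logDir and $date variables from above (the FullControl filter is just an example):
# read the exported log back in, filter and sort it
Import-Csv "$logDir\testlog-$date.log" |
    Where-Object FileSystemRights -Match 'FullControl' |
    Sort-Object FolderPath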
I wrote a script with a function.
Here is the script with the function:
function GenerateHashesForProjects(){
[array]$result = @()
Write-Output "Generate Hash Values"
$dependencyFolder = Join-Path -Path $PSScriptRoot -ChildPath "..\..\Sources\_Dependencies"
#get all folders in a list below the dependency folder except the "Modules" folder
$dependencyContent = Get-ChildItem -Path $dependencyFolder | where {$_.PSIsContainer -and ($_.Name -notlike "*Modules*")}
#Fill the result array with the project file name and the depending hash value of this file
foreach ($item in $dependencyContent) {
$dependencyProjects = Get-ChildItem -Path $item.Fullname | where { ($_ -like "*.csproj") }
$hashValue = (Get-FileHash $dependencyProjects.FullName -Algorithm MD5).Hash
$name = $dependencyProjects.Name
Write-Output "name: $name `nvalue: $hashValue"
$result += @($dependencyProjects.Name, $hashValue)
}
return $result
}
That script works fine.
Now I want to use this function in another script as well. So I import the script and assign the function's result to a variable. Here is the issue: if I call the function without the variable it works fine, but with the variable assignment it does not. Why?
Here is the second script with the import:
. Join-Path -Path $PSScriptroot -ChildPath "..\..\Build\Tools\GenerateHashesForProjects.ps1"
[array]$dependencyFileValues = GenerateHashesForProjects
This test works fine:
. Join-Path -Path $PSScriptroot -ChildPath "..\..\Build\Tools\GenerateHashesForProjects.ps1"
GenerateHashesForProjects
Since you didn't post any responses to questions [grin], here is one way to rewrite your code.
what it does ...
creates an advanced function
uses the recommended name format for such
does not supply the "otta be there" Comment Based Help [grin]
defines the parameters
only the $Path is required.
defines but does not use a begin {} block
defines a process {} block
grabs a list of the dirs that branch from the source path
filters out the dirs that are in the $ExcludeDirList
gets the files in those dirs that match the $FileFilter
iterates thru that list
builds a [PSCustomObject] for each file with the desired details
you can add or remove them as needed.
sends that PSCO out to the calling code
the line that calls the function stores the entire set of results into the $Result variable and then shows that on screen.
a few notes ...
I had to change a lot of your details since I have no csproj files
there are no "what is happening" lines
if you need that, you can easily add such (a small sketch follows these notes). I would NOT use Write-Output, though, since that will pollute your output data.
there is no error detection OR error handling
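if you do want progress output, here is a hedged sketch of the kind of line that could sit inside the foreach loop of the function below. It writes to the verbose stream, so it never mixes with the objects the function returns, and only shows when the caller adds -Verbose (which [CmdletBinding()] enables):
# hypothetical progress line; $FL_Item is the loop variable used in the function below
Write-Verbose ('hashing {0}' -f $FL_Item.FullName)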
here's the code ...
function Get-ProjectFileHash
{
<#
CommentBasedHelp goes here
#>
[CmdletBinding ()]
Param
(
[Parameter (
Mandatory,
Position = 0
)]
[string]
$Path,
[Parameter (
Position = 1
)]
[ValidateSet (
'MD5',
'MACTripleDES',
'RIPEMD160',
'SHA1',
'SHA256',
'SHA384',
'SHA512'
)]
[string]
$Algorithm = 'MD5',
[Parameter (
Position = 2
)]
[string[]]
$ExcludeDirList,
[Parameter (
Position = 3
)]
[string]
$FileFilter
)
begin {}
process
{
$ProjDirList = Get-ChildItem -LiteralPath $Path -Directory |
Where-Object {
# the "-Exclude" parameter of G-CI is wildly unreliable
# this avoids that problem [*grin*]
# build a regex OR listing to exclude
$_.Name -notmatch ($ExcludeDirList -join '|')
}
$FileList = Get-ChildItem -LiteralPath $ProjDirList.FullName -File -Filter $FileFilter
foreach ($FL_Item in $FileList)
{
[PSCustomObject]#{
FileName = $FL_Item.Name
DirName = $FL_Item.Directory
Algorithm = $Algorithm
Hash = (Get-FileHash -LiteralPath $FL_Item.FullName -Algorithm $Algorithm).Hash
}
}
}
end {}
} # end >>> function Get-ProjectFileHash
$Source = 'C:\ProgramData\chocolatey\lib'
$NotWanted = 'choco', '7zip', 'kb', 'bad', 'bkp'
$Filter = '*.nupkg'
$Result = Get-ProjectFileHash -Path $Source -Algorithm MD5 -ExcludeDirList $NotWanted -FileFilter $Filter
$Result
truncated output ...
FileName DirName Algorithm Hash
-------- ------- --------- ----
autohotkey.nupkg C:\ProgramData\chocolatey\lib\autohotkey MD5 35A1B894AEA7D3473F3BBCBF5788D2D6
autohotkey.install.nupkg C:\ProgramData\chocolatey\lib\autohotkey.install MD5 EFE8AD812CBF647CFA116513AAD4CC15
autohotkey.portable.nupkg C:\ProgramData\chocolatey\lib\autohotkey.portable MD5 D31FA1B5496AAE266E4B0545835E9B19
[*...snip...*]
vcredist2015.nupkg C:\ProgramData\chocolatey\lib\vcredist2015 MD5 56321731BC0AEFCA3EE5E547A7A25D5E
vlc.nupkg C:\ProgramData\chocolatey\lib\vlc MD5 8177E24675461BDFF33639BF1D89784B
wiztree.nupkg
I have a script that copies the files from one server to another.
I basically have 3 different server locations that I want to copy from, and on another server I want to create a folder for each source location, with the contents inside it.
I made that, but I declared a variable for each source and each folder/destination.
I want to create one variable, and from that it should automatically get each source location and copy everything to the correct location.
Would defining $server = "\path1, \path2, \path3" do it, so it would go into a foreach loop, go through each part of the path, and copy and paste?
If so, how can I define the destination if I have 1 folder with 3 subfolders, each corresponding to one source?
For example, \path1 should always put items in path1destination, \path2 should always put items in path2destination, and so on. Basically I want to correlate each source path with a specific destination path, using as few variables as possible.
Can anyone provide ideas on how to tackle this? My code works, but I had to define $path1, $path2, $path3 and so on, and then loop over each, which is fine, but I need to make it cleaner and with fewer lines of code.
$server1 = "C:\Users\nicolae.calimanu\Documents\B\"
$server2 = "C:\Users\nicolae.calimanu\Documents\A\" # UNC Path.
$datetime = Get-Date -Format "MMddyyyy-HHmmss"
$server3 = "C:\Users\nicolae.calimanu\Documents\C\" # UNC Path.
foreach ($server1 in gci $server1 -recurse)
{
Copy-Item -Path $server1.FullName -Destination $server2
}
ForEach ( $server2 in $server2 ) {
$curDateTime = Get-Date -Format yyyyMMdd-HHmmss
Get-ChildItem $server2 -Recurse |
Rename-Item -NewName {$_.Basename + '_' + $curDateTime + $_.Extension }
}
foreach ($server2 in gci $server2 -Recurse)
{
Move-Item -path $server2 -destination "C:\Users\nicolae.calimanu\Documents\C"
}
Use a hashtable to create a key-value store for each source and destination. Like so,
# Create entries for each source and destination
$ht = @{}
$o = new-object PSObject -property @{
from = "\\serverA\source"
to = "\\serverB\destination" }
$ht.Add($o.from, $o)
$o = new-object PSObject -property @{
from = "\\serverC\source"
to = "\\serverB\destination2" }
$ht.Add($o.from, $o)
$o = new-object PSObject -property @{
from = "\\servera\source2"
to = "\\serverC\destination" }
$ht.Add($o.from, $o)
# Iterate the collection. For demo, print the copy commands
foreach($server in $ht.keys) { $cmd = $("copy-item {0} {1}" -f $ht.Item($server).from, $ht.Item($server).to); $cmd }
# Sample output
copy-item \\serverA\source \\serverB\destination
copy-item \\servera\source2 \\serverC\destination
copy-item \\serverC\source \\serverB\destination2
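The demo above only prints the commands. A rough sketch of the actual copy pass (this assumes the destination folders already exist and that you want subfolders copied as well):
foreach ($server in $ht.Keys) {
    Copy-Item -Path $ht[$server].from -Destination $ht[$server].to -Recurse
}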
I'm trying to create a powershell script that will grab all Active Directory accounts that are enabled, and inactive for 90 days. The script will prompt the user to choose between querying computer or user accounts.
Depending on the choice, it will pass it over to the main command as a variable.
The commands work correctly if I don't pass a variable.
I'm not sure if what I'm trying to do is possible.
Sorry for any bad code formatting. Just starting out.
Clear-Host
write-host "`nProgram searches for Enabled AD users account that have not logged in for more than 90 days. `nIt searches the entire domain and saves the results to a CSV file on users desktop." "`n"
$choice = Read-host -Prompt " What do you want to search for Computer or Users Accounts`nType 1 for users`nType 2 for Computers`n`nChoice"
$account
if ($choice -eq 1) {
$account = UsersOnly
}
Elseif ($choice -eq 2) {
$account = ComputersOnly
}
Else {
write-host "This is not an option `n exiting program"
exit
}
$FileName = Read-Host -Prompt "What do you want to name the CSV file"
$folderPath = "$env:USERPROFILE\Desktop\$FileName.csv"
Search-ADAccount -AccountInactive -TimeSpan 90 -$account | Where-Object { $_.Enabled -eq $true } | select Name, UserPrincipalName, DistinguishedName | Export-Csv -Path $folderPath
Splatting is the way to achieve this. It's so named because you reference a variable with @ instead of $, and @ kind of looks like a "splat".
It works by creating a hashtable, which is a type of dictionary (key/value pairs). In PowerShell we create hashtable literals with @{}.
To use splatting you just make a hashtable where each key/value pair is a parameter name and value, respectively.
So for example if you wanted to call Get-ChildItem -LiteralPath $env:windir -Filter *.exe you could also do it this way:
$params = @{
LiteralPath = $env:windir
Filter = '*.exe'
}
Get-ChildItem @params
You can also mix and match direct parameters with splatting:
$params = @{
LiteralPath = $env:windir
Filter = '*.exe'
}
Get-ChildItem @params -Verbose
This is most useful when you need to conditionally omit a parameter, so you can turn this:
if ($executablesOnly) {
Get-ChildItem -LiteralPath $env:windir -Filter *.exe
} else {
Get-ChildItem -LiteralPath $env:windir
}
Into this:
$params = @{
LiteralPath = $env:windir
}
if ($executablesOnly) {
$params.Filter = '*.exe'
}
Get-ChildItem @params
or this:
$params = @{}
if ($executablesOnly) {
$params.Filter = '*.exe'
}
Get-ChildItem -LiteralPath $env:windir @params
With only 2 possible choices, the if/else doesn't look that bad, but as your choices multiply and become more complicated, it gets to be a nightmare.
Your situation: there's one thing I want to note first. The parameters you're trying to alternate against are switch parameters. That means when you supply them you usually only supply the name of the parameter. In truth, these take boolean values that default to true when the name is supplied. You can in fact override them, so you could do Search-ADAccount -UsersOnly:$false but that's atypical.
Anyway, the point of mentioning that is that it may have been confusing how you would set its value in a hashtable for splatting purposes, but the simple answer is to just give them a boolean value (and usually it's $true).
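If it helps to see that in isolation, here's a tiny sketch with a made-up function (Test-Switch is hypothetical, purely for illustration):
function Test-Switch { param([switch]$Force) $Force.IsPresent }
Test-Switch                 # False
Test-Switch -Force          # True
Test-Switch -Force:$false   # False
$splat = @{ Force = $true }
Test-Switch @splat          # True - same as -Force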
So just changing your code simply:
$account = if ($choice -eq 1) {
@{ UsersOnly = $true }
} elseif ($choice -eq 2) {
@{ ComputersOnly = $true }
}
# skipping some stuff
Search-ADAccount -AccountInactive -TimeSpan 90 @account
I also put the $account assignment on the left side of the if instead of inside, but that's your choice.
My aim is to compare two directories exactly - including the structure of the directories and sub-directories.
I need this because I want to monitor whether something in the folder E:\path2 was changed. For this purpose a copy of the full folder is kept in C:\path1. If someone changes something, it has to be done in both directories.
It is important for us, because if something is changed in the directory (accidentally or not) it could break down other functions in our infrastructure.
This is the script I've already written:
# Compare files for "copy default folder"
# This Script compares the files and folders which are synced to every client.
# Source: https://mcpmag.com/articles/2016/04/14/contents-of-two-folders-with-powershell.aspx
# 1. Compare content and Name of every file recursively
$SourceDocsHash = Get-ChildItem -recurse -Path C:\path1 | foreach {Get-FileHash -Path $_.FullName}
$DestDocsHash = Get-ChildItem -recurse -Path E:\path2 | foreach {Get-FileHash -Path $_.FullName}
$ResultDocsHash = (Compare-Object -ReferenceObject $SourceDocsHash -DifferenceObject $DestDocsHash -Property hash -PassThru).Path
# 2. Compare name of every folder recursively
$SourceFolders = Get-ChildItem -recurse -Path C:\path1 #| where {!$_.PSIsContainer}
$DestFolders = Get-ChildItem -recurse -Path E:\path2 #| where {!$_.PSIsContainer}
$CompareFolders = Compare-Object -ReferenceObject $SourceFolders -DifferenceObject $DestFolders -PassThru -Property Name
$ResultFolders = $CompareFolders | Select-Object FullName
# 3. Check if UNC-Path is reachable
# Source: https://stackoverflow.com/questions/8095638/how-do-i-negate-a-condition-in-powershell
# Printout, if UNC-Path is not available.
if(-Not (Test-Path \\bb-srv-025.ftscu.be\DIP$\Settings\ftsCube\default-folder-on-client\00_ftsCube)){
$UNCpathReachable = "UNC-Path not reachable and maybe"
}
# 4. Count files for statistics
# Source: https://stackoverflow.com/questions/14714284/count-items-in-a-folder-with-powershell
$count = (Get-ChildItem -recurse -Path E:\path2 | Measure-Object ).Count;
# FINAL: Print out result for check_mk
if($ResultDocsHash -Or $ResultFolders -Or $UNCpathReachable){
echo "2 copy-default-folders-C-00_ftsCube files-and-folders-count=$count CRITIAL - $UNCpathReachable the following files or folders has been changed: $ResultDocs $ResultFolders (none if empty after ':')"
}
else{
echo "0 copy-default-folders-C-00_ftsCube files-and-folders-count=$count OK - no files has changed"
}
I know the output is not perfect formatted, but it's OK. :-)
This script spots the following changes successfully:
create new folder or new file
rename a folder or file -> it is shown as an error, but the output is empty. I can live with that. But maybe someone sees the reason. :-)
delete folder or file
change file content
This script does NOT spot the following changes:
move a folder or file to another sub-folder. The script still says "everything OK"
I've been trying a lot of things, but could not solve this.
Can anyone help me extend the script so it spots a moved folder or file?
I think your best bet is to use the .NET FileSystemWatcher class. It's not trivial to implement an advanced function that uses it, but I think it will simplify things for you.
I used the article Tracking Changes to a Folder Using PowerShell when I was learning this class. The author's code is below. I cleaned it up as little as I could stand. (That publishing platform's code formatting hurts my eyes.)
I think you want to run it like this.
New-FileSystemWatcher -Path E:\path2 -Recurse
I could be wrong.
Function New-FileSystemWatcher {
[cmdletbinding()]
Param (
[parameter()]
[string]$Path,
[parameter()]
[ValidateSet('Changed', 'Created', 'Deleted', 'Renamed')]
[string[]]$EventName,
[parameter()]
[string]$Filter,
[parameter()]
[System.IO.NotifyFilters]$NotifyFilter,
[parameter()]
[switch]$Recurse,
[parameter()]
[scriptblock]$Action
)
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher
If (-NOT $PSBoundParameters.ContainsKey('Path')){
$Path = $PWD
}
$FileSystemWatcher.Path = $Path
If ($PSBoundParameters.ContainsKey('Filter')) {
$FileSystemWatcher.Filter = $Filter
}
If ($PSBoundParameters.ContainsKey('NotifyFilter')) {
$FileSystemWatcher.NotifyFilter = $NotifyFilter
}
If ($PSBoundParameters.ContainsKey('Recurse')) {
$FileSystemWatcher.IncludeSubdirectories = $True
}
If (-NOT $PSBoundParameters.ContainsKey('EventName')){
$EventName = 'Changed','Created','Deleted','Renamed'
}
If (-NOT $PSBoundParameters.ContainsKey('Action')){
$Action = {
Switch ($Event.SourceEventArgs.ChangeType) {
'Renamed' {
$Object = "{0} was {1} to {2} at {3}" -f $Event.SourceArgs[-1].OldFullPath,
$Event.SourceEventArgs.ChangeType,
$Event.SourceArgs[-1].FullPath,
$Event.TimeGenerated
}
Default {
$Object = "{0} was {1} at {2}" -f $Event.SourceEventArgs.FullPath,
$Event.SourceEventArgs.ChangeType,
$Event.TimeGenerated
}
}
$WriteHostParams = @{
ForegroundColor = 'Green'
BackgroundColor = 'Black'
Object = $Object
}
Write-Host @WriteHostParams
}
}
$ObjectEventParams = @{
InputObject = $FileSystemWatcher
Action = $Action
}
ForEach ($Item in $EventName) {
$ObjectEventParams.EventName = $Item
$ObjectEventParams.SourceIdentifier = "File.$($Item)"
Write-Verbose "Starting watcher for Event: $($Item)"
$Null = Register-ObjectEvent @ObjectEventParams
}
}
I don't think any example I've found online tells you how to stop watching the filesystem. The simplest way is to just close your PowerShell window. But I always seem to have 15 tabs open in each of five PowerShell windows, and closing one of them is a nuisance.
Instead, you can use Get-Job to get the Id of registered events. Then use Unregister-Event -SubscriptionId n to, well, unregister the event, where 'n' represents the number(s) you find in the Id property of Get-Job.
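If you'd rather not dig the Ids out of Get-Job, something like this should also work, relying on the "File.<EventName>" source identifiers the function registers (a sketch; it removes all of them at once):
# find the event subscriptions created above and unregister them
Get-EventSubscriber |
    Where-Object SourceIdentifier -Like 'File.*' |
    Unregister-Event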
So basically you want to synchronize the two folders and note all the changes made to them:
I would suggest you use the Sync-Folder script or FreeFileSync.