I'm trying to write a script to compare the permissions of two folders at two different locations. The folder name at both locations would be the ADID of a user, and some users may have two or more AD accounts.
$OutPath = ".\out1.csv"
$sourcepath1 = Get-Content ".\mwd_src.txt"
foreach ($path1 in $sourcepath1) {
$name1 = (Get-Item $Path1).name
$FolderAcl1 = (Get-Acl $Path1).Access | Select-Object IdentityReference
$Source = $FolderAcl1 | Where-Object { $_.IdentityReference -like "*$name1*"} |
Select-Object @{ Label = "Path"; Expression = { echo $Path1 } }, @{ Label = "Access"; Expression = { $_.IdentityReference } }
}
$sourcepath2 = Get-Content ".\mwd_dest.txt"
foreach ($path2 in $sourcepath2) {
$name2 = (Get-Item $Path2).Name
$FolderAcl2 = (Get-Acl $Path2).Access | Select-Object IdentityReference
$Dest = $FolderAcl2 | Where-Object { $_.IdentityReference -like "*$name2*" } |
Select-Object @{ Label = "Path"; Expression = { echo $Path2 } }, @{ Label = "Access"; Expression = { $_.IdentityReference } }
}
$Source1 = $Source | Select-Object -Unique
$Dest1 = $Dest | Select-Object -Unique
$Out = Compare-Object -ReferenceObject $Source1 -DifferenceObject $Dest1
$Out1 = $Out | Where-Object { $_.SideIndicator -match "=>" }
foreach($OutItem in $Out1) {
$Outitem.InputObject | Add-Content $OutPath
}
But I am getting the following error:
Compare-Object : Cannot bind argument to parameter 'DifferenceObject' because it is null.
Please assist.
Since I don't have your paths, I cannot reproduce your code, but I tried to compare the ACLs of two local files with Compare-Object. I created two files which initially had the same access entries. If there is no difference, nothing is returned. I then removed my account from the permissions of the second file and got the output below. (I shortened the output to avoid a horizontal scroll bar.) The SideIndicator arrow shows which object the difference belongs to.
InputObject SideIndicator
----------- -------------
{System.Security.AccessControl.FileSystemAccessRule, System.Security... =>
{System.Security.AccessControl.FileSystemAccessRule, System.Security... <=
My test code:
$ref = Get-Acl -Path .\Desktop\test1.txt
$diff = Get-Acl -Path .\Desktop\test2.txt
Compare-Object -ReferenceObject $ref.Access -DifferenceObject $diff.Access
I selected the Access property for comparison.
Regarding your error, ensure that the DifferenceObject is populated with the information you expect.
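One likely cause in the script above is that $Source and $Dest are reassigned on every pass of their loops, so by the time Compare-Object runs they only hold the last iteration's result (or nothing at all if that iteration's filter matched no entries). A minimal sketch of one way to collect the results across all paths instead, assuming the same input files as in the question:
$Source = foreach ($path1 in (Get-Content '.\mwd_src.txt')) {
    $name1 = (Get-Item $path1).Name
    # keep only the ACEs that mention the folder's own name (the ADID)
    (Get-Acl $path1).Access |
        Where-Object { $_.IdentityReference -like "*$name1*" } |
        Select-Object @{ Label = 'Path'; Expression = { $path1 } },
                      @{ Label = 'Access'; Expression = { [string]$_.IdentityReference } }
}
$Dest = foreach ($path2 in (Get-Content '.\mwd_dest.txt')) {
    $name2 = (Get-Item $path2).Name
    (Get-Acl $path2).Access |
        Where-Object { $_.IdentityReference -like "*$name2*" } |
        Select-Object @{ Label = 'Path'; Expression = { $path2 } },
                      @{ Label = 'Access'; Expression = { [string]$_.IdentityReference } }
}
# guard against empty collections before comparing;
# compare on Access only, since the two locations have different paths by design
if ($Source -and $Dest) {
    Compare-Object -ReferenceObject $Source -DifferenceObject $Dest -Property Access
}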
I have looked all over and I can only find examples of filtering an array by a single value, not an object by an array.
Here is what I have. It works, but using a foreach loop seems kludgy; is there a way to filter my object by my array of "bad users"?
$permissions = dir -Recurse $path | where { $_.PsIsContainer } | ForEach-Object { $path1 = $_.fullname; Get-Acl $_.Fullname | ForEach-Object { $_.access }}
$arrFilterDefaulsOut = @('NT AUTHORITY\SYSTEM','BUILTIN\Administrators','BUILTIN\Users','CREATOR OWNER')
foreach ($filter in $arrFilterDefaulsOut) {
$permissions = $permissions | Select-Object * | Where-Object -property IdentityReference -ne $filter
}
$permissions | Select-Object *| Export-Csv $finalReport
I have tried using -notcontains, but that seems to do nothing.
If I understand you correctly you want to filter out the defined identities, so you can do:
$permissionFiltered = $permissions | where-object {$arrFilterDefaulsOut -notcontains $_.IdentityReference}
I think I understand what you were trying to do before with Add-Member. This is how I would streamline the process and keep it efficient: an anonymous function that keeps only the FileSystemAccessRule entries whose IdentityReference is not in the $arrFilterDefaulsOut array and creates new objects from those rules, adding the absolute path of the folder.
Get-ChildItem -Recurse $path -Directory -PipelineVariable folder | & {
begin {
$arrFilterDefaulsOut = @(
'NT AUTHORITY\SYSTEM'
'BUILTIN\Administrators'
'BUILTIN\Users'
'CREATOR OWNER'
)
}
process {
foreach($rule in (Get-Acl $_.FullName).Access) {
if($rule.IdentityReference -notin $arrFilterDefaulsOut) {
$rule | Select-Object @{ N='Path'; E={ $folder.FullName }}, *
}
}
}
} | Export-Csv path\to\report.csv -NoTypeInformation
I have a script which will compare two very similar directories to see which has newer, updated files. The two paths have the same files, but the path names are slightly different. The two folders have about the same number of files; one may have two or three fewer than the other.
$path1 = "E:\docs\training\files"
$path2 = "D:\docs\training - Copy\files"
$outdatedFiles = @()
foreach($file in $Folder1)
{
foreach($file2 in $Folder2)
{
if($file2.BaseName -match $file.BaseName)
{
if($file.LastWriteTime -gt $file2.LastWriteTime)
{
$Result = "" | Select OutDatedFile,LastWriteTime
$Result.OutDatedFile = $file2.FullName
$Result.LastWriteTime = $file2.LastWriteTime
$outdatedFiles += $Result
}
}
}
}
In the $outdatedFiles array, I get files that are not newer than their counterparts in the other directory. I think it might be due to the comparison in my if statement; I tried -match, -contains, and -ccontains to see if any of these would give me what I wanted. None worked. It might also be that the foreach doesn't work because of the slightly different number of files in each folder. Any suggestions?
EDIT
I tried building a hash but this did not find all the updated files:
$outdatedFiles = @()
foreach($file in $Folder1)
{
foreach($file2 in $Folder2)
{
if($file2.Name -like $file.Name)
{
#compare hash here
$Hash2 = Get-FileHash $file2.FullName -Algorithm SHA256
$Hash1 = Get-FileHash $file.FullName -Algorithm SHA256
if($Hash2.Hash -ne $Hash1.Hash)
{
$Result = "" | Select OutDatedFile,LastWriteTime
$Result.OutDatedFile = $file2.FullName
$Result.LastWriteTime = $file2.LastWriteTime
$outdatedFiles += $Result
}
}
}
}
EDIT
This was my solution
In the end, the solution was more complex than I wanted, or maybe I just made it that way.
The problem was that I had multiple files with the same name and similar paths (the subfolders were named the same). I had to trim the paths to get a better comparison:
$Differences1 = @()
foreach($file in $Folder1)
{
foreach($file2 in $Folder2)
{
<#Trim path then compare#>
#File1
$file1part1 = ($file.FullName).Split("\")[-2]
$file1part2 = ($file.FullName).Split("\")[-1]
$newPath1 = $file1part1 + "\" + $file1part2
#File2
$file2part1 = ($file2.FullName).Split("\")[-2]
$file2part2 = ($file2.FullName).Split("\")[-1]
$newPath2 = $file2part1 + "\" + $file2part2
if($newPath1 -like $newPath2)
{
$Differences1 += Compare-Object (gci $file2.FullName) -DifferenceObject (gci $file.FullName) -Property LastWriteTime -PassThru | Select Name,FullName,LastWriteTime | Sort-Object -Property Name
}
}
}
if($Differences1 -ne $null)
{
$Differences1 | Out-File $textFile -Append
}
else
{
"No folders have different modified dates" | Out-File $textFile -Append
}
I have been working on a little project to extract some information from a file server. For that project I created a script that outputs all the information to a .csv file. The problem is that PowerShell eats up all of my computer's RAM during the process, because there are hundreds of GB of data to parse.
Below is my script.
$folder = Get-ChildItem -Recurse 'Complete_Path' | select FullName, @{Name="Owner";Expression={(Get-Acl $_.FullName).Owner}}, CreationTime, LastWriteTime, LastAccessTime, PSIsContainer | sort FullName
$output = @()
$folder | foreach {
$type =
if ($_.PSIsContainer -eq "True") {
Write-Output "Folder"
}
else {
Write-Output "File"
}
$size =
if ($_.PSIsContainer -eq "True") {
Get-ChildItem -Recurse $_.FullName | measure -Property Length -Sum -ErrorAction SilentlyContinue | select -ExpandProperty Sum
}
else {
Get-Item $_.FullName | measure -Property Length -Sum -ErrorAction SilentlyContinue | select -ExpandProperty Sum
}
$hash = @{
FullName = $_.FullName
Owner = $_.Owner
CreationTime = $_.CreationTime
LastWriteTime = $_.LastWriteTime
LastAccessTime = $_.LastAccessTime
Type = $type
'Size in MB' = [math]::Round($($size/1Mb),2)
}
$output += New-Object PSObject -Property $hash
}
$output | select FullName, Owner, CreationTime, LastWriteTime, LastAccessTime, Type, 'Size in MB' | Export-Csv C:\myDOCS.csv -Delimiter ";" -NoTypeInformation -Encoding UTF8
Do you have any idea how I can get the job done faster and with less RAM consumption? The extraction can take days.
Thank you in advance.
Replace your PowerShell array $output = @() with a .NET list, $output = [System.Collections.Generic.List[psobject]]::new(), and use the .Add method of that object to add your items.
For small lists you won't notice, but using a PowerShell array with the += operator is a big performance sink. Each time you do +=, the array is recreated entirely with one more item.
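As a rough illustration (a hypothetical micro-benchmark, not part of the original script; timings vary by machine):
# array with += : a new array is allocated and copied on every iteration
Measure-Command {
    $a = @()
    foreach ($i in 1..20000) { $a += $i }
}
# generic list with .Add : the list grows in place
Measure-Command {
    $l = [System.Collections.Generic.List[int]]::new()
    foreach ($i in 1..20000) { $l.Add($i) }
}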
Include Length in your initial Get-ChildItem statement so that later on you can measure the sum without going through Get-ChildItem again every time.
The pipeline plays nice with memory but is slower overall. I tend to prefer not using the pipeline when performance becomes an issue.
Something like this should already be significantly faster:
$folder = Get-ChildItem -Recurse "$($env:USERPROFILE)\Downloads" | select FullName, @{Name = "Owner"; Expression = { (Get-Acl $_.FullName).Owner } }, CreationTime, LastWriteTime, LastAccessTime, PSIsContainer, Length | sort FullName
$output = [System.Collections.Generic.List[psobject]]::new()
foreach ($Item in $folder) {
if ($Item.PSIsContainer) {
$Type = 'Folder'
$size = $folder.Where({ $_.FullName -like "$($Item.FullName)\*" }) | measure -Property Length -Sum -ErrorAction SilentlyContinue | select -ExpandProperty Sum
}
else {
$Type = 'File'
$size = $Item.Length
}
$size = [math]::Round($($size / 1Mb), 2)
$hash = @{
FullName = $Item.FullName
Owner = $Item.Owner
CreationTime = $Item.CreationTime
LastWriteTime = $Item.LastWriteTime
LastAccessTime = $Item.LastAccessTime
Type = $Type
'Size in MB' = $size
}
[void]($output.Add((New-Object PSObject -Property $hash)))
}
$output | select FullName, Owner, CreationTime, LastWriteTime, LastAccessTime, Type, 'Size in MB' | Export-Csv C:\myDOCS.csv -Delimiter ";" -NoTypeInformation -Encoding UTF8
You could still improve the size calculation so that the deepest folders' sizes are calculated first; each parent folder could then grab those values and sum up its child folders instead of recalculating the files.
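A possible sketch of that idea (my own illustration, not the original poster's code): list the files once, then roll each file's length up into every ancestor folder, so no folder is ever re-enumerated.
$rootFull   = (Get-Item 'Complete_Path').FullName   # same root as in the question
$files      = Get-ChildItem -Path $rootFull -Recurse -File
$folderSize = @{}
foreach ($f in $files) {
    # walk from the file's folder up to the root, adding the length at each level
    $dir = $f.DirectoryName
    while ($dir -and $dir.StartsWith($rootFull)) {
        $folderSize[$dir] += $f.Length
        $dir = Split-Path $dir -Parent
    }
}
# $folderSize['<folder full path>'] now holds that folder's recursive size in bytes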
Another thought would be to not do the Get-Acl immediately (I suspect this call is the slow part): get your items, do the rest, then parallelize the Get-Acl calls so you can resolve the values on a number of parallel threads and add each value to your list as you get it.
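With PowerShell 7+ that could look roughly like the following (a hedged sketch; ForEach-Object -Parallel is not available in Windows PowerShell 5.1):
# enumerate once, then resolve the owners on several threads
$items  = Get-ChildItem -Recurse 'Complete_Path'
$owners = $items | ForEach-Object -Parallel {
    [pscustomobject]@{
        FullName = $_.FullName
        Owner    = (Get-Acl -LiteralPath $_.FullName).Owner
    }
} -ThrottleLimit 8
# join $owners back to the rest of the metadata by FullName afterwards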
Think about testing your code on smaller batches and use Measure-Command to determine where the slowest operations in your code are.
I recommend taking a look at some more advanced topics on the subject.
Here's a good article to get you started: Slow Code: Top 5 ways to make your Powershell scripts run faster
Is this better with the whole thing in one pipeline?
Get-ChildItem -Recurse |
select FullName, @{Name="Owner";Expression={(Get-Acl $_.FullName).Owner}},
CreationTime, LastWriteTime, LastAccessTime, PSIsContainer | sort FullName |
foreach {
$type =
if ($_.PSIsContainer -eq "True") {
Write-Output "Folder"
}
else {
Write-Output "File"
}
$size =
if ($_.PSIsContainer -eq "True") {
Get-ChildItem -Recurse $_.FullName |
measure -Property Length -Sum -ErrorAction SilentlyContinue |
select -ExpandProperty Sum
}
else {
Get-Item $_.FullName |
measure -Property Length -Sum -ErrorAction SilentlyContinue |
select -ExpandProperty Sum
}
$hash = @{
FullName = $_.FullName
Owner = $_.Owner
CreationTime = $_.CreationTime
LastWriteTime = $_.LastWriteTime
LastAccessTime = $_.LastAccessTime
Type = $type
'Size in MB' = [math]::Round($($size/1Mb),2)
}
New-Object PSObject -Property $hash
} | select FullName, Owner, CreationTime, LastWriteTime, LastAccessTime,
Type, 'Size in MB' |
Export-Csv myDOCS.csv -Delimiter ";" -NoTypeInformation -Encoding UTF8
The purpose of this code is to get a list of all used executables from a specific folder. After a month we will delete any executables not on this list.
I currently get the correct results using this:
while ($true) {
foreach ($process in Get-Process | where {$_.Path -imatch 'ksv'} | select -Unique) {
$dir = $process | Get-ChildItem;
New-Object -TypeName PSObject -Property @{
'Path' = $process.Path;
} | Out-String | Add-Content -LiteralPath Z:\processList.txt
}
Get-Content Z:\processList.txt | sort | Get-Unique > Z:\uniqueprocesslist.txt
}
I'm going to get rid of the while loop as this will be eventually running as a service.
The problem with this is that it creates a huge list in processlist.txt that I would like to eliminate to save space.
I tried to come up with a better solution that scans the text file to see if the path has already been written before adding the new process path. I am not sure what I am doing wrong, but nothing is ever written to the text file:
while ($true) {
foreach ($process in Get-Process | where {$_.Path -imatch 'ksv'} | select -Unique) {
$dir = $process | Get-ChildItem;
$progPath = New-Object -TypeName PSObject -Property @{
'Path' = $process.Path
}
$file = Get-Content "Z:\processList.txt"
$containsLine = $file | %{$_ -match $progPath}
if ($containsLine -contains $false) {
Add-Content -LiteralPath Z:\processList.txt
}
}
}
If I understand your question correctly, you want to build a "recently used" list of executables in a specific directory, keep it in a file, and update that (unique) list with each run of your script.
Something like this should do that:
$listfile = 'Z:\processlist.txt'
# Build a dictionary from known paths, so that we can check for already known
# paths with an index lookup instead of a linear search over an array.
$list = @{}
if (Test-Path -LiteralPath $listfile) {
Get-Content $listfile | ForEach-Object {
$list[$_] = $true
}
}
# List processes, expand their path, then check if the path contains the
# string "ksv" and isn't already known. Append the results to the list file.
Get-Process |
Select-Object -Expand Path |
Sort-Object -Unique |
Where-Object {$_ -like '*ksv*' -and -not $list.ContainsKey($_)} |
Add-Content $listfile
Hashtable lookup and wildcard match are used for performance reasons, because they're significantly faster than linear searches in arrays and regular expression matches.
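To see the difference yourself, a quick (hypothetical) comparison of the two lookup styles could look like this:
# build a large list of fake paths, as an array and as a hashtable
$paths  = 1..50000 | ForEach-Object { "C:\ksv\app$_.exe" }
$lookup = @{}
foreach ($p in $paths) { $lookup[$p] = $true }
# linear search over the array vs. keyed lookup in the hashtable
Measure-Command { 1..1000 | ForEach-Object { $paths -contains 'C:\ksv\app49999.exe' } }
Measure-Command { 1..1000 | ForEach-Object { $lookup.ContainsKey('C:\ksv\app49999.exe') } }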
while ($true) {
$file = Get-Content "Z:\processList.txt"
$KSVPaths = Get-Process |
Where-Object {$_.Path -imatch 'ksv'} |
Select-Object -ExpandProperty Path |
Select-Object -Unique
ForEach ($KSVPath in $KSVPaths) {
if ($KSVPath -notin $file) {
Add-Content -Path "Z:\processList.txt" -Value $KSVPath
}
}
}
I have a PowerShell script which compares two files in two different folders. If a file with the proper number exists in the first folder then it runs it.
If the file doesn't exist in the first folder then it copies it from the second folder to the first folder and runs it from the first folder.
function Invoke-InstallationOfANewBuild()
{
param (
$ptud = "$($env:USERPROFILE)\Desktop\",
$ptbf = "\\r\P\Al\O\D B\R 017\x64"
)
begin {
$output1 = Get-ChildItem $ptbf -Filter *.exe | Where Name -NotMatch '.*NoDB\.exe$' | % {
New-Object psobject -Property @{
No = [int]([regex]::Match($_.Name, '(?<=CL)\d+').Value)
Name = $_.FullName
}
} | Sort No -Descending | Select -ExpandProperty Name -First 1
$output2 = Get-ChildItem $ptbf -Filter *.exe | Where Name -NotMatch '.*NoDB\.exe$' | % {
New-Object psobject -Property @{
No = [int]([regex]::Match($_.Name, '(?<=CL)\d+').Value)
Name = $_.FullName
} | Sort No -Descending | Select -ExpandProperty Name -First 1
}
Compare-Object -ReferenceObject $output1 -DifferenceObject $output2
}
process {
if ($LASTEXITCODE = 0)
{
Get-ChildItem $ptud -Filter *.exe | Where Name -NotMatch '.*NoDB\.exe$' | % {
New-Object psobject -Property @{
No = [int]([regex]::Match($_.Name, '(?<=CL)\d+').Value)
Name = $_.FullName
}
} | Sort No -Descending | Select -ExpandProperty Name -First 1 | Foreach { & $_ -s2 -sp"-SilentInstallation=standalone -UpdateMaterials=yestoall -UpgradeDBIfRequired=yes" }
}
else
{
Get-ChildItem $ptbf -Filter *.exe | Where Name -NotMatch '.*NoDB\.exe$' | % {
New-Object psobject -Property @{
No = [int]([regex]::Match($_.Name, '(?<=CL)\d+').Value)
Name = $_.FullName
}
} | Sort No -Descending | Select -ExpandProperty Name -First 1 | Copy-Item -Destination $ptud | Foreach { & $_ -s2 -sp"-SilentInstallation=standalone -UpdateMaterials=yestoall -UpgradeDBIfRequired=yes" }
}
}
end { return $LASTEXITCODE }
}
I have a problem in the else block: the file copies from the second folder to the first folder, but the file execution never starts.
I am also looking for a better solution for the if block. I want to say: if the Compare-Object operation returns true, then run everything in the if block; if the operation returns false (for example, a file with such a number doesn't exist in the first folder), then run everything in the else block.
For your compare try this:
$compare = Compare-Object -ReferenceObject $A -DifferenceObject $B |
Where-Object { $_.SideIndicator -eq '=>' } |
Measure-Object -Property inputObject
$compare.count -gt 0 # for your if condition
For your copy problem, try this; Tee-Object will duplicate the pipeline into a variable:
Get-ChildItem $ptbf -Filter *.exe | Where Name -NotMatch '.*NoDB\.exe$' | % {
New-Object psobject -Property @{
No = [int]([regex]::Match($_.Name, '(?<=CL)\d+').Value)
Name = $_.FullName
}
} | Sort No -Descending | Select -ExpandProperty Name -First 1 | Tee-Object -variable Duplicate | Copy-Item -Destination $ptud
$duplicate | Foreach { & $_ -s2 -sp"-SilentInstallation=standalone -UpdateMaterials=yestoall -UpgradeDBIfRequired=yes" }