I have following powershell script:
$reportsFolder = $PSScriptRoot;
$reportServerDbInstance = "localhost";
$dbName = "dbname";
function Execute-Query{
Param(
[parameter(position=0)]
$query,
[parameter(position=1)]
$execDb
)
Invoke-Sqlcmd -Query $query -ServerInstance $reportServerDbInstance -Database $execDb -ErrorAction 'Stop'
}
function Test-ReportQuery($query, $execDb) {
Execute-Query -query $query -execDb $execDb
}
function Test-ReportQueries($reportName, $queries) {
foreach($q in $queries.GetEnumerator()) {
try {
Test-ReportQuery -query $q.Value -execDb $dbName
}
catch {
$errorMsg = "Dataset `"$($q.Name)`" for report `"$reportName`" failed. Check the query";
Write-Error $errorMsg
Write-Error $q.Value
Write-Error $_;
}
}
}
function Prepare-QueriesFromReportForTest ($folder, $name)
{
[OutputType([System.Collections.Hashtable])]
$tmpReportString = gc $(Join-Path $folder "$name.rdl");
[xml]$report = $tmpReportString;
foreach($ds in $report.Report.DataSources.DataSource) {
if ($ds.DataSourceReference -ne "rdb_custom") {
Write-Host "Report $name skipped as datasource is not rdb_custom";
return;
}
}
$paramsFromReport = $report.Report.ReportParameters;
$params = @{}
$tmpParams = @{}
foreach ($p in $paramsFromReport.ReportParameter ) {
$value = "''";
if ($p.DataType -eq "DateTime" ) {
$value = "GETDATE()";
}
if ($p.DataType -eq "Boolean") {
continue;
}
if ($p.DataType -eq "Float" ) {
$value = "0";
}
if ($p.DataType -eq "Integer" ) {
$value = "0";
}
9009
$params.add($p.Name, $value);
$tmpParams.add($p.Name, $value);
}
$testQueries = $null;
$testQueries = @{};
foreach($q in $report.Report.DataSets.DataSet) {
if ($q.Query -eq $null) {
continue;
}
# get the dataset query parameters whose names differ from the report parameters
foreach($p in $q.Query.QueryParameters.QueryParameter) {
foreach($p1 in $tmpParams.GetEnumerator()) {
if (($p.Value -eq "=Parameters!$($p1.Name).Value".Replace("@","")) -and ($p.Name -ne "@$($p1.Name)") ) {
$params.add($p.Name.Replace("@",""), $p1.Value);
}
}
}
$query = $q.Query.CommandText.Trim().Replace(";","").ToLower();
foreach ($param in $params.GetEnumerator()) {
$query = $query.Replace("@$($param.Name.ToLower())", $param.Value );
}
if ($query.Substring(0,6).ToLower() -eq "select") {
$testQueries.add($q.Name, "SELECT * FROM (SELECT TOP(0) " + $query.Substring(6, $query.Length -6) + ") a");
}
}
$testQueries
#return $testQueries;
}
Get-ChildItem $reportsFolder -Filter Water_Queue_Detail_Report.rdl |
Foreach-Object {
$fileName = $_.BaseName;
Write-Host "Testing $fileName report";
$queriesToTest = Prepare-QueriesFromReportForTest -folder $reportsFolder -name $fileName
if ($queriesToTest -ne $null) {
Test-ReportQueries -reportName $fileName -queries $queriesToTest;
}
}
When I am debugging the Prepare-QueriesFromReportForTest function, just before it finishes, the hash table holds the proper values. However, the variable $queriesToTest receives a weird extra value (the integer 9009) on top of the expected value. I checked the types of both objects, and inside the function the object is a hashtable, but the function returns an array.
What's wrong?
The Prepare-QueriesFromReportForTest function has a stray line containing only the number 9009 inside the foreach loop. PowerShell writes any value that is not assigned or consumed to the function's output stream, so the function returns an array made up of 9009 followed by the hashtable. Remove that line:
if ($p.DataType -eq "Integer" ) {
$value = "0";
}
9009
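# ^ stray value: PowerShell implicitly writes it to the output stream - delete this line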
$params.add($p.Name, $value);
$tmpParams.add($p.Name, $value);
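To see why this corrupts the return value, here is a minimal sketch (my illustration, not part of the original answer): any value PowerShell evaluates without assigning or consuming it becomes part of the function's output, and multiple outputs are collected into an array.
function Get-Demo {
    9009          # unassigned value: implicitly emitted to the output stream
    @{ a = 1 }    # the hashtable is emitted as the second output
}
$result = Get-Demo
$result.GetType().Name   # Object[] - 9009 and the hashtable collected into an array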
Please refer to my earlier question: How to get replace by id (Id value get from XML) and put into CSV, powershell.
I need to use ForEach-Object instead of foreach, because I have 11,000 rows and it takes too much time.
foreach ($csvRow in $csvRows) {
foreach ($columnName in $columnNames) {
$id = $csvRow.$columnName
if (-not $id) { continue }
$newId = @($xmlDoc.enum_types.enum_type.Where( { $_.field_name -eq $columnName }, 'First').
items).ForEach( { $_.item }).Where( { $_.id -eq $id }).value
$csvRow.$columnName = $newId
}
}
$csvRows | Export-Csv -NoTypeInformation -Encoding utf8 $CSVpath
One thing you could do is pre-compute your lookups from the "enums" xml in your other question.
Original code - 7649 milliseconds
# setup
$xml = #"
<enum_types>
<enum_type field_name="Test1">
<items>
<item>
<id>1</id>
<value>A</value>
</item>
</items>
</enum_type>
<enum_type field_name="Test2">
<items>
<item>
<id>1</id>
<value>A</value>
</item>
</items>
</enum_type>
</enum_types>
"#;
$enums = [xml] $xml;
$csv = #("Test1, Test2");
$csv += 1..110000 | % { "1, 1" }
$data = $csv | ConvertFrom-Csv
$columnNames = @( "Test1", "Test2" );
# perf test - 7649 milliseconds
measure-command -expression {
foreach ($csvRow in $data)
{
foreach ($columnName in $columnNames)
{
$id = $csvRow.$columnName
if ( -not $id )
{
continue;
}
$newId = @($enums.enum_types.enum_type.Where( { $_.field_name -eq $columnName }, 'First').
items).ForEach( { $_.item }).Where( { $_.id -eq $id }).value
$csvRow.$columnName = $newId
}
}
}
With pre-computed lookups - 792 milliseconds
# setup
... as above ...
# pre-compute lookups
$lookups = @{};
foreach( $enum_type in $enums.enum_types.enum_type )
{
$lookups[$enum_type.field_name] = @{}
foreach( $item in $enum_type.items.item )
{
$lookups[$enum_type.field_name][$item.id] = $item.value
}
}
write-host ($lookups | convertto-json -depth 99)
# {
# "Test1": {
# "1": "A"
# },
# "Test2": {
# "1": "A"
# }
# }
# perf test - 792 milliseconds
measure-command -expression {
foreach ($csvRow in $data)
{
foreach ($columnName in $columnNames)
{
$id = $csvRow.$columnName
if (-not $id)
{
continue;
}
$csvRow.$columnName = $lookups[$columnName][$id]
}
}
}
There's possibly more you could squeeze out of this approach, but it's already a 10x speedup (in my very limited benchmarks).
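One caveat worth adding (my note, not part of the original answer): $lookups[$columnName][$id] returns $null when the id is missing from the lookup table, which would blank out the cell. A guarded version of the inner loop, assuming the same $lookups structure:
foreach ($csvRow in $data)
{
    foreach ($columnName in $columnNames)
    {
        $id = $csvRow.$columnName
        if (-not $id) { continue }
        # only replace the value when the lookup actually knows this id
        $columnLookup = $lookups[$columnName]
        if ($columnLookup -and $columnLookup.ContainsKey($id))
        {
            $csvRow.$columnName = $columnLookup[$id]
        }
    }
}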
Preload all the new items into hash tables:
$Types = $xmlDoc.enum_types.enum_type
$HashTable = @{}
Import-Csv .\1.Csv |ForEach-Object { $Names = $Null } {
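# two positional script blocks: the first runs once up front (like -Begin), the second runs per row (like -Process)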
if (!$Names) {
$Names = $_.psobject.Properties.Name
foreach ($Name in $Names) {
$HashTable[$Name] = @{}
foreach ($Field in $Types.Where{$_.field_name -eq $Name}) {
foreach ($Item in $Field.Items.Item) {
$HashTable[$Name][$Item.Id] = $Item.Value
}
}
}
}
foreach ($Name in $Names) {
$id = $_.$Name
If ($HashTable[$Name].Contains($Id)) {
$_.$Name = $HashTable[$Name][$Id]
}
}
$_
} |Export-Csv .\1New.Csv -NoTypeInformation -Encoding utf8
I am working on a script logging solution for an install script that has many tasks and long processing times, and I am trying to address the issue of network dropout. I am also moving to a StreamWriter-based approach from my old Add-Content approach, for performance reasons.
The problem I am having is that once the network drops out, the StreamWriter doesn't reconnect. So, first question is, CAN I reconnect, or is this a limitation of StreamWriter? The fact that the StreamWriter has a cache that can be Flushed makes me think I may be doing a bunch of work to recreate functionality that is already there.
And second, I am starting to think a simpler/better solution is simply to write the log to a local folder, so the log is always complete, then simply attempt to copy that to the network location for progress review. I had been thinking about implementing parallel logs, so a loss of the network would still leave the local copy complete. Curious if anyone else looks at this and says "Well, YEAH, dufus, obviously."
function Get-PxLogFile {
return $script:pxLogFile
}
function Set-PxLogFile {
param (
[string]$path
)
if (Test-Path $path) {
Remove-Item $path -force
}
[string]$script:pxLogFile= $path
}
function Get-PxLogWriter {
$logFile = Get-PxLogFile
if (-not $script:pxFileStream) {
$script:pxFileStream = New-Object io.fileStream $logFile, 'Append', 'Write', 'Read'
$script:pxlogWriter = New-Object io.streamWriter $script:pxFileStream
} elseif ($script:pxFileStream.name -ne $logFile) {
$script:pxFileStream = New-Object io.fileStream $logFile, 'Append', 'Write', 'Read'
$script:pxlogWriter = New-Object io.streamWriter $script:pxFileStream
}
return $script:pxlogWriter
}
function Dispose-PxLogFile {
$script:pxlogWriter.Dispose()
$script:pxFileStream.Dispose()
$script:pxFileStream = $null
$script:pxlogWriter = $null
}
# Shared
function Get-PxDeferredLog {
return ,$script:deferredLog # , keeps PS from unrolling a single item array into a string
}
function Set-PxDeferredLog {
param (
[collections.arrayList]$deferredLog
)
[collections.arrayList]$script:deferredLog = $deferredLog
}
function Get-PxDeferredLogTimes {
return ($script:deferredLogTimes -join ', ')
}
function Finalize-PxLogFile {
$logWriter = Get-PxLogWriter
if (($deferredLog = Get-PxDeferredLog).count -gt 0) {
$abandonTime = (Get-Date) + (New-TimeSpan -seconds:3)
$logWriter = Get-PxLogWriter
:lastChanceLogWindow do {
$deferredLog = Get-PxDeferredLog
:deferredItemsWrite do {
try {
$logWriter.WriteLine($deferredLog[0])
if ($deferredLog.count -gt 0) {
$deferredLog.RemoveAt(0)
} else {
break lastChanceLogWindow
}
} catch {
break deferredItemsWrite
}
} while ($deferredLog.count -gt 0)
Set-PxDeferredLog $deferredLog
if ($deferredLog.count -eq 0) {
break lastChanceLogWindow
}
if ((Get-Date) -gt $abandonTime) {
break lastChanceLogWindow
}
} while ((Get-Date) -lt $abandonTime)
if ($deferredLog) {
Write-Host "Failed to write all log entries"
}
}
$script:deferredLog = $script:deferredLogTimes = $null
}
function Add-PxLogFileContent {
param (
[string]$string
)
$logWriter = Get-PxLogWriter
# Nested Functions
function Add-PxDeferredLogItem {
param (
[string]$item
)
if (-not $script:deferredLog) {
[collections.arrayList]$script:deferredLog = New-Object collections.arrayList
}
if ($script:deferredLog.count -eq 0) {
Start-PxDeferredLogTime
}
[void]$script:deferredLog.Add($item)
}
function Start-PxDeferredLogTime {
if (-not $script:deferredLogTimes) {
[collections.arrayList]$script:deferredLogTimes = New-Object collections.arrayList
}
if ((-not $script:deferredLogTimes) -or (-not $script:deferredLogTimes[-1].EndsWith('-'))) {
[void]$script:deferredLogTimes.Add("$((Get-Date).ToString('T'))-")
}
}
function Stop-PxDeferredLogTime {
if ($script:deferredLogTimes[-1].EndsWith('-')) {
$script:deferredLogTimes[-1] = "$($script:deferredLogTimes[-1])$((Get-Date).ToString('T'))"
}
}
$deferredLogProcessed = $false
if ([collections.arrayList]$deferredLog = Get-PxDeferredLog) {
$deferredLogProcessed = $true
:deferredItemsWrite do {
try {
$logWriter.WriteLine($deferredLog[0])
$logWriter.Flush()
$deferredLog.RemoveAt(0)
} catch {
break deferredItemsWrite
}
} while ($deferredLog.count -gt 0)
if ($deferredLog.count -eq 0) {
$deferredLogPending = $false
} else {
$deferredLogPending = $true
}
Set-PxDeferredLog $deferredLog
} else {
$deferredLogPending = $false
}
if (-not $deferredLogPending) {
try {
# StreamWriter.WriteLine returns void, so it can't be used as a condition
$logWriter.WriteLine($string)
$logWriter.Flush()
if ($deferredLogProcessed) {Stop-PxDeferredLogTime}
} catch {
Add-PxDeferredLogItem $string
Write-Host "Failed: $(Get-Date)`n$($_.Exception.Message)"
}
} else {
Add-PxDeferredLogItem $string
}
}
### MAIN
Clear-Host
$script:deferredLog = $script:deferredLogTimes = $null
$logPath = '\\px\Content'
Write-Host 'logTest1.txt'
$startTime = Get-Date
$endTime = $startTime + (New-TimeSpan -minutes:5)
#Set-PxLogFile "$([System.IO.Path]::GetFullPath($env:TEMP))\logTest1.txt"
Set-PxLogFile "$logPath\logTest1.txt"
do {
Add-PxLogFileContent "logged: $(Get-Date)"
Start-SLeep -s:10
} while ((Get-Date) -lt $endTime)
Finalize-PxLogFile
Write-Host 'logTest2.txt'
$startTime = Get-Date
$endTime = $startTime + (New-TimeSpan -minutes:5)
#Set-PxLogFile "$([System.IO.Path]::GetFullPath($env:TEMP))\logTest2.txt"
Set-PxLogFile "$logPath\logTest2.txt"
do {
Add-PxLogFileContent "logged: $(Get-Date)"
Start-SLeep -s:10
} while ((Get-Date) -lt $endTime)
if ([string]$deferredLogTimes = Get-PxDeferredLogTimes) {
Add-PxLogFileContent "Deferred logging time ranges: $deferredLogTimes"
}
Finalize-PxLogFile
Dispose-PxLogFile
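For what it's worth, a minimal sketch of the local-first idea from the question (my illustration; the paths and function name are hypothetical): write every line to a local log that is always complete, then mirror it to the network share on a best-effort basis.
# Hypothetical paths for illustration
$localLog   = Join-Path $env:TEMP 'install.log'
$networkLog = '\\server\share\install.log'
function Write-PxLocalFirstLog {
    param([string]$Message)
    # The local write is the source of truth and should always succeed.
    Add-Content -Path $localLog -Value $Message
    # The network copy is best-effort; a dropout just skips this attempt,
    # and the next write retries with the complete local file.
    try {
        Copy-Item -Path $localLog -Destination $networkLog -Force -ErrorAction Stop
    } catch {
        Write-Host "Network copy failed, will retry on next write: $($_.Exception.Message)"
    }
}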
I created a report; if I run it manually from PowerShell ISE it generates the list of items I am expecting, but when I run it from Reporting Tools it returns no results.
The script scrapes all the items versions and languages, which are around 80,000 items and this takes a while.
Is there a way to add a delay until the list of all items is generated, or any other workaround?
Source code:
$RichTextContentID = "";
$internalLinkPattern = '<a href="~\/link\.aspx\?_id=(?<sitecoreid>[a-zA-Z\d]{32})&_z=z">';
$literatureTemplate = "";
$global:guiltyItems = @();
function Process-RichText
{
param( [Parameter(Mandatory = $true)] [Sitecore.Data.Fields.Field]$field,
[Parameter(Mandatory = $true)] [string]$pattern,
[Parameter(Mandatory = $true)] [Sitecore.Data.Items.Item]$item)
$allMatches = [System.Text.RegularExpressions.Regex]::Matches($field.Value,$pattern);
foreach ($match in $allMatches)
{
$currentItem = Get-Item master -Id ([Sitecore.Data.ID]::Parse($match.Groups["sitecoreid"].Value)).Guid;
if ($currentItem.Template.Id -eq $literatureTemplate)
{
if ($global:guiltyItems -notcontains $item)
{
$global:guiltyItems += $item;
}
}
}
}
$allitems = Get-Item master -Query "/sitecore/content/MyWebsiteTree//*" -Language * -Version *;
foreach ($item in $allItems) {
foreach ($field in $item.Fields)
{
if ($field.Id -eq $RichTextContentID -and ($field.Value -match $internalLinkPattern))
{
Process-RichText $field $internalLinkPattern $item;
}
}
}
if ($global:guiltyItems.Count -eq 0) {
Show-Alert "Did not find any items to match your condition.";
}
else {
$props = @{
Title = ""
InfoDescription = ""
PageSize = 50
};
($global:guiltyItems) |
Show-ListView @props -Property @{ Label="Item name"; Expression={$_.Name}; },
@{ Label="ID"; Expression={$_.ID}; },
@{ Label="Display name"; Expression={$_.DisplayName}; },
@{ Label="Language"; Expression={$_.Language}; },
@{ Label="Version"; Expression={$_.Version}; },
@{ Label="Path"; Expression={$_.ItemPath}; },
@{ Label="Created"; Expression={$_.__Created}; },
@{ Label="Created by"; Expression={$_."__Created by"}; },
@{ Label="Updated"; Expression={$_.__Updated}; },
@{ Label="Updated by"; Expression={$_."__Updated by"}; }
}
Close-Window;
Thanks
LE (later edit): The object $allitems takes a while to be populated, and the Sitecore client does not wait for the backend to read all the items; thus, when I generate the report, $global:guiltyItems is always empty.
I have found the solution: using filters. And it works as it should.
$RichTextContentID = "";
$internalLinkPattern = '<a href="~\/link\.aspx\?_id=(?<sitecoreid>[a-zA-Z\d]{32})&_z=z">';
$literatureTemplateID = "";
$root = Get-Item -Path "master:/sitecore/content/MyWebsite";
filter Where-HasLiterature{
param([Parameter(Mandatory=$TRUE,ValueFromPipeline=$TRUE)][Sitecore.Data.Items.Item]$item)
if($item)
{
foreach ($field in $item.Fields)
{
if ($field.Id -eq $RichTextContentID -and ($field.Value -match $internalLinkPattern))
{
$allMatches = [System.Text.RegularExpressions.Regex]::Matches($field.Value,$internalLinkPattern);
foreach ($match in $allMatches)
{
$guiltyItem = Get-Item "master:" -Id ([Sitecore.Data.ID]::Parse($match.Groups["sitecoreid"].Value)).Guid;
$guiltyItemTemplate = [Sitecore.Data.Managers.TemplateManager]::GetTemplate($guiltyItem);
if ($guiltyItem -ne $null -and $guiltyItemTemplate.DescendsFromOrEquals($literatureTemplateID) )
{
$item;
}
}
}
}
}
}
$items = Get-ChildItem -Path $root.ProviderPath -Recurse | Where-HasLiterature
if ($items.Count -eq 0)
{
Show-Alert "Did not find any items to match your condition.";
}
else
{
$props = @{
Title = ""
InfoDescription = ""
PageSize = 50
}
$items | Show-ListView @props -Property @{ Label="Item name"; Expression={$_.Name}; },
@{ Label="ID"; Expression={$_.ID}; },
@{ Label="Display name"; Expression={$_.DisplayName}; },
@{ Label="Language"; Expression={$_.Language}; },
@{ Label="Version"; Expression={$_.Version}; },
@{ Label="Path"; Expression={$_.ItemPath}; },
@{ Label="Created"; Expression={$_.__Created}; },
@{ Label="Created by"; Expression={$_."__Created by"}; },
@{ Label="Updated"; Expression={$_.__Updated}; },
@{ Label="Updated by"; Expression={$_."__Updated by"}; }
}
I don't know much about PowerShell but have inherited a script from someone who is no longer available for assistance. This script imports AD Group Info and memberships related to Users and Computers. It works fine when run on a machine with PS 2.0 but it crashes if executed on PS 3.0 or newer.
I have not been able to figure out what needs to be modified but it seems the errors start occurring in the "Computer" membership import step and there are hundreds of errors that all say:
Command failed while processing computers: , Exception of type 'System.OutOfMemoryException' was thrown
Then at some point it looks like the script just stops and it never even gets to the 3rd step / function.
Any advice?
[Reflection.Assembly]::LoadWithPartialName("System.DirectoryServices") | Out-Null
$DBServer = "DBSERVER"
$DBName = "DBNAME"
$TableUsers = "[$DBName].[dbo].[AD_GroupToClient]"
$TableComps = "[$DBName].[dbo].[AD_GroupToDevice]"
$TableGroups = "[$DBName].[dbo].[AD_Group_Info]"
$sqldateformat = "yyyy/MM/dd HH:mm:ss:fff"
[system.Data.SqlClient.SqlConnection]$global:SqlConnection = $null
function Get-ScriptPath { $Invocation = (Get-Variable MyInvocation -Scope 1).Value; Split-Path $Invocation.MyCommand.Path }
$ScriptPath = Get-ScriptPath
$Logfile = "$ScriptPath\OutLog.log"
function Write-Logfile {
param($logtext)
[string](Get-Date -format $sqldateformat) + "`t$logtext" | Out-File $Logfile -Encoding ascii -Append
}
function Open-Database {
$global:SqlConnection = New-Object system.Data.SqlClient.SqlConnection
try {
$global:SqlConnection.ConnectionString = "Server=$DBServer;Database=$DBName;Integrated Security=True"
$global:SqlConnection.Open() | Out-Null
Write-Logfile "OK`tDatabase opened"
} catch {
Write-Host "Error Opening SQL Database`t$($_.Exception.Message)"
Write-Logfile "Error`tDatabase open failed, $($_.exception.message)"
exit
}
}
function Close-Database {
$global:SqlConnection.Close()
Write-Logfile "OK`tDatabase closed"
}
function Esc-Quote {
param($str)
if ($str) { $str.Replace("'","''") }
}
function Run-DBCommand {
param($SqlCommands, [switch]$getnumrows)
if ($SqlCommands.Count -ge 1) {
$SqlCommandText = [string]::Join(";", $SqlCommands)
try {
$SqlCmd = New-Object Data.SqlClient.SqlCommand($SqlCommandText, $SqlConnection)
$returnvalue = $SqlCmd.ExecuteNonQuery()
if ($getnumrows) { return $returnvalue }
} catch {
Write-Logfile "Error`tSQL Command failed, $($_.exception.message)"
}
}
}
function Run-GroupMemberExport {
param($exportmode)
switch ($exportmode) {
"users" {
$dom = [ADSI]"LDAP://OU=Clients123,DC=test1,DC=test2,DC=test3"
$query = "(&(objectClass=user)(objectCategory=person)(samaccountname=*))"
$table = $TableUsers
$namecolumn = "AD_Group_Member_Name"
$attribs = #("samaccountname")
}
"computers" {
$dom = [ADSI]"LDAP://DC=test1,DC=test2,DC=test3"
$query = "(&(objectClass=computer)(samaccountname=*))"
$table = $TableComps
$namecolumn = "AD_Group_Member_Device"
$attribs = #("samaccountname", "whencreated")
}
}
$starttime = (Get-Date).ToUniversalTime().ToString($sqldateformat)
$srch = New-Object DirectoryServices.DirectorySearcher($dom, $query, $attribs)
$srch.PageSize = 1000
$srch.Sort = New-Object DirectoryServices.SortOption("sAMAccountName", [DirectoryServices.SortDirection]::Ascending)
$results = $srch.FindAll()
$count = 0
$numaccounts = $results.Count
foreach ($res in $results) {
try {
$objAccount = $res.GetDirectoryEntry()
$samaccountname = $objAccount.properties["samaccountname"][0]
$whencreated = ""
if ($exportmode -eq "computers") { $whencreated = Get-Date ([datetime]$objAccount.properties["whencreated"][0]) -Format $sqldateformat }
$count++
Write-Progress "Querying accounts" $samaccountname -PercentComplete ($count * 100.0 / $numaccounts)
$objAccount.psbase.RefreshCache("tokenGroups")
$SIDs = $objAccount.psbase.Properties.Item("tokenGroups")
$groups = @()
ForEach ($Value In $SIDs) {
$SID = New-Object System.Security.Principal.SecurityIdentifier $Value, 0
try {
$Group = $SID.Translate([System.Security.Principal.NTAccount]).Value
} catch {
$Group = $SID.Translate([System.Security.Principal.SecurityIdentifier]).Value
}
if ($groups -notcontains $Group -and $Group.Split("\")[1] -ne $samaccountname) { $groups += $Group }
}
Run-DBCommand #("DELETE FROM $table WHERE [$namecolumn] = '$(Esc-Quote $samaccountname)'")
$sqlcommands = #()
$currenttime = (Get-Date).ToUniversalTime().ToString($sqldateformat)
if ($groups) {
$groups | sort | foreach {
if ($exportmode -eq "users") {
$sqlcommands += "INSERT INTO $table ([$namecolumn], [AD_Group_Name], [Last_Update]) VALUES ('$(Esc-Quote $samaccountname)', '$(Esc-Quote $_)', '$currenttime')"
} else {
$sqlcommands += "INSERT INTO $table ([$namecolumn], [AD_Group_Name], [Last_Update], [Record_Created]) VALUES ('$(Esc-Quote $samaccountname)', '$(Esc-Quote $_)', '$currenttime', '$whencreated')"
}
if ($sqlcommands.count -ge 50) { Run-DBCommand $sqlcommands; $sqlcommands = @() }
}
} else {
if ($exportmode -eq "users") {
$sqlcommands += "INSERT INTO $table ([$namecolumn], [AD_Group_Name], [Last_Update]) VALUES ('$(Esc-Quote $samaccountname)', 'ERROR: Unable to retrieve groups', '$currenttime')"
} else {
$sqlcommands += "INSERT INTO $table ([$namecolumn], [AD_Group_Name], [Last_Update], [Record_Created]) VALUES ('$(Esc-Quote $samaccountname)', 'ERROR: Unable to retrieve groups', '$currenttime', '$whencreated')"
}
}
Run-DBCommand $sqlcommands
} catch {
Write-Logfile "Error`tCommand failed while processing $exportmode`: $($objAccount.name), $($_.exception.message)"
}
}
Write-Progress " " " " -Completed
if ($count -eq $numaccounts) {
$numdeleted = Run-DBCommand @("DELETE FROM $table WHERE [Last_Update] < '$starttime' OR [Last_Update] IS NULL") -getnumrows
Write-Logfile "OK`tUpdates for $exportmode completed, $numdeleted old records deleted."
}
}
function Run-GroupDescriptionExport {
$dom = [ADSI]"LDAP://DC=test1,DC=test2,DC=test3"
$query = "(&(objectClass=group)(samaccountname=*))"
$table = $TableGroups
$attribs = #("samaccountname", "displayname", "description", "whencreated", "managedby", "grouptype","distinguishedname","whenchanged")
$srch = New-Object DirectoryServices.DirectorySearcher($dom, $query, $attribs)
$srch.PageSize = 1000
$srch.Sort = New-Object DirectoryServices.SortOption("sAMAccountName", [DirectoryServices.SortDirection]::Ascending)
$results = $srch.FindAll()
$count = 0
$numgroups = $results.Count
$sqlcommands = @()
$starttime = [datetime]::Now.ToUniversalTime().ToString($sqldateformat)
foreach ($res in $results) {
$count++
$samaccountname = $res.properties["samaccountname"][0]
Write-Progress "Querying accounts, $count/$numgroups" $samaccountname -PercentComplete ($count * 100.0 / $numgroups)
$displayName = ""; if ($res.properties.contains("displayname")) { $displayName = $res.properties["displayname"][0] }
$description = ""; if ($res.properties.contains("description")) { $description = $res.properties["description"][0] }
$managedby = ""; if ($res.properties.contains("managedby")) { $managedby = $res.properties["managedby"][0] }
$grouptype = ""; if ($res.properties.contains("grouptype")) { $grouptype = $res.properties["grouptype"][0] }
$distinguishedname = ""; if ($res.properties.contains("distinguishedname")) { $distinguishedname = $res.properties["distinguishedname"][0] }
$whencreated = ""; if ($res.properties.contains("whencreated")) { $whencreated = ([datetime]$res.properties["whencreated"][0]).ToString($sqldateformat) }
$whenchanged = ""; if ($res.properties.contains("whenchanged")) { $whenchanged = ([datetime]$res.properties["whenchanged"][0]).ToString($sqldateformat) }
$lastupdated = [datetime]::Now.ToUniversalTime().ToString($sqldateformat)
$sqlcommand = "DELETE FROM $table WHERE [AD_Group_Name] = '$(Esc-Quote $samaccountname)'; "
$sqlcommand += "INSERT INTO $table ([AD_Group_Name], [AD_Group_DisplayName], [AD_Group_Description], [Last_Update], [Managed_By],[Distinguished_Name],[Group_Category],[Created_On], AD_Last_Modified]) VALUES ('$(Esc-Quote $samaccountname)', '$(Esc-Quote $displayName)', '$(Esc-Quote $description)', '$lastupdated', '$(Esc-Quote $managedby)', '$(Esc-Quote $distinguishedname)', '$grouptype', '$whencreated','$whenchanged')"
$sqlcommands += $sqlcommand
if ($sqlcommands.count -ge 100) { Run-DBCommand $sqlcommands; $sqlcommands = @() }
}
Run-DBCommand $sqlcommands
if ($numgroups -eq $count) {
Run-DBCommand #("DELETE FROM $table WHERE [Last_Update] <= '$starttime'")
}
Write-Progress " " " " -Completed
}
Open-Database
Run-GroupMemberExport "users"
Run-GroupMemberExport "computers"
Run-GroupDescriptionExport
Close-Database
This doesn't have anything to do with the PowerShell version. You're just plain running out of memory. You're pulling in a lot of data, so you need to be more conscious of getting rid of that data when you're done with it.
There are a couple things you can do to clean up memory:
First, the documentation for DirectorySearcher.FindAll() says:
Due to implementation restrictions, the SearchResultCollection class cannot release all of its unmanaged resources when it is garbage collected. To prevent a memory leak, you must call the Dispose method when the SearchResultCollection object is no longer needed.
So whenever you do:
$results = $srch.FindAll()
Make sure you call $results.Dispose() when you're done with it (at the end of the function).
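For example, a hedged sketch (my addition) of that Dispose pattern with try/finally, so the collection is released even if processing throws:
$results = $srch.FindAll()
try {
    foreach ($res in $results) {
        # ... process each result ...
    }
} finally {
    # SearchResultCollection holds unmanaged resources; always release it
    $results.Dispose()
}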
Second, when you loop through the results in your Run-GroupMemberExport function, you're calling $res.GetDirectoryEntry(). Usually you can just let the garbage collector clean up DirectoryEntry objects, but when you're creating so many in a loop like that, the GC doesn't have time to run. This has happened to me when I've run a loop over thousands of accounts.
To solve this, you can call Dispose() on the DirectoryEntry objects yourself. Since you already have a try/catch block there, I would suggest adding a finally block to make sure it happens even if an error is thrown:
try {
...
} catch {
Write-Logfile "Error`tCommand failed while processing $exportmode`: $($objAccount.name), $($_.exception.message)"
} finally {
$objAccount.Dispose()
}
Actually, you could probably just not use GetDirectoryEntry() at all. Just ask the DirectorySearcher to return the other attributes you need. But if you want to still use it, then make sure you call RefreshCache for every attribute you need (you can put them all in one call to RefreshCache). If you access the Properties collection and ask for a value that it does not already have in cache, then it will ask AD for every attribute with a value - that's a lot of unnecessary data.
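As an illustration (my sketch) of the single RefreshCache call, using the attribute names the script already loads:
$objAccount = $res.GetDirectoryEntry()
# one call caches both attributes; reading any uncached property later
# would otherwise make AD return every attribute that has a value
$objAccount.psbase.RefreshCache(@('tokenGroups', 'whenCreated'))
$SIDs = $objAccount.psbase.Properties['tokenGroups']
$whenCreated = $objAccount.psbase.Properties['whencreated'][0]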
So currently, I know you can grab an account using an LDAP filter like $adSearchFilter = "(&(objectCategory=User)(samAccountType:1.2.840.113556.1.4.803:=805306368)(SamAccountName=Account1))". Is there a way the LDAP filter will allow you to pass in a list of names? That is, instead of using = can I use something like -contains?
Below is the code, and as you can see, it searches one user at a time for the whole search process in a foreach loop...
Function GetUsersInfoFromDomain
{
Param ([String]$searchPropertyName, [String[]]$searchPropertyValues, [String[]]$DcWithCred,[String]$domainShortName, [String[]]$userProperties)
$queryTable = @()
ForEach ($searchPropertyValue in $searchPropertyValues)
{
$adSearchFilter = "(&(objectCategory=User)(samAccountType:1.2.840.113556.1.4.803:=805306368)($searchPropertyName=$searchPropertyValue))"
Write-Host "Searching domain $domainShortName with $searchPropertyName $searchPropertyValue"
$searchDomainResultsTable = powershell -command {
Param ([String]$adSearchFilter, [String[]]$userProperties,[String[]]$DcWithCred, [String]$domainShortName)
[string]$DC = $DcWithCred[0]
[string]$Username = $DcWithCred[1]
[string]$Password = $DcWithCred[2]
[string]$domain = "LDAP://$DC"
$adDomain = New-Object System.DirectoryServices.DirectoryEntry($domain, $Username, $Password)
$adSearcher = New-Object System.DirectoryServices.DirectorySearcher($adDomain)
$adSearcher.Filter = $adSearchFilter
$adSearcher.PageSize=1000
$adSearcher.PropertiesToLoad.AddRange($userProperties) | out-Null
$userRecords = $adSearcher.FindAll()
$adSearcher.Dispose() | Out-Null
[System.GC]::Collect() | Out-Null
# The AD results are converted to an array of hashtables.
$userPropertiesTable = @()
foreach($record in $userRecords) {
$hashUserProperty = @{}
foreach($userProperty in $userProperties){
if (($userProperty -eq 'objectGUID') -or ($userProperty -eq 'objectSid') -or ($userProperty -eq 'msExchMasterAccountSid')) {
if ($record.Properties[$userProperty]) {
$hashUserProperty.$userProperty = $record.Properties[$userProperty][0]
} else {
$hashUserProperty.$userProperty = $null
}
} Else {
if ($record.Properties[$userProperty]) {
$hashUserProperty.$userProperty = ($record.Properties[$userProperty] -join '; ').trim('; ')
} else {
$hashUserProperty.$userProperty = $null
}
} #end Else
} #end ForEach
$userPropertiesTable += New-Object PSObject -Property $hashUserProperty
} #end ForEach
[System.GC]::Collect() | Out-Null
# Fixes the property values to be a readable format before exporting to csv file
$listOfBadDateValues = '9223372036854775807', '9223372036854770000', '0'
$maxDateValue = '12/31/1600 5:00 PM'
$valuesToFix = @('lastLogonTimestamp', 'AccountExpires', 'LastLogon', 'pwdLastSet', 'objectGUID', 'objectSid', 'msExchMasterAccountSid')
$extraPropertyValues = @('Domain Name')
$valuesToFixCounter = 0
$extraPropertyValuesCounter = 0
$valuesToFixFound = @($false, $false, $false, $false, $false, $false, $false)
$extraPropertyValuesFound = @($false)
ForEach ($valueToFix in $valuesToFix)
{
if ($userProperties -contains $valueToFix)
{
$valuesToFixFound[$valuesToFixCounter] = $true
}
$valuesToFixCounter++
}
ForEach ($extraPropertyValue in $extraPropertyValues)
{
if ($userProperties -contains $extraPropertyValue)
{
$extraPropertyValuesFound[$extraPropertyValuesCounter] = $true
}
$extraPropertyValuesCounter++
}
$tableFixedValues = $userPropertiesTable | % {
if ($valuesToFixFound[0]) {
if ($_.lastLogonTimestamp) {
$_.lastLogonTimestamp = ([datetime]::FromFileTime($_.lastLogonTimestamp)).ToString('g')
}
}; if ($valuesToFixFound[1]) {
if (($_.AccountExpires) -and ($listOfBadDateValues -contains $_.AccountExpires)) {
$_.AccountExpires = ""
} else {
if (([datetime]::FromFileTime($_.AccountExpires)).ToString('g') -eq $maxDateValue) {
$_.AccountExpires = ""
} Else {
$_.AccountExpires = ([datetime]::FromFileTime($_.AccountExpires)).ToString('g')
}
}
}; if ($valuesToFixFound[2]) {
if (($_.LastLogon) -and ($listOfBadDateValues -contains $_.LastLogon)) {
$_.LastLogon = ""
} else {
if (([datetime]::FromFileTime($_.LastLogon)).ToString('g') -eq $maxDateValue) {
$_.LastLogon = ""
} Else {
$_.LastLogon = ([datetime]::FromFileTime($_.LastLogon)).ToString('g')
}
}
}; if ($valuesToFixFound[3]) {
if (($_.pwdLastSet) -and ($listOfBadDateValues -contains $_.pwdLastSet)) {
$_.pwdLastSet = ""
} else {
if (([datetime]::FromFileTime($_.pwdLastSet)).ToString('g') -eq $maxDateValue) {
$_.pwdLastSet = ""
} Else {
$_.pwdLastSet = ([datetime]::FromFileTime($_.pwdLastSet)).ToString('g')
}
}
}; if ($valuesToFixFound[4]) {
if ($_.objectGUID) {
$_.objectGUID = ([guid]$_.objectGUID).Guid
} Else {
$_.objectGUID = ""
}
}; if ($valuesToFixFound[5]) {
if ($_.objectSid) {
$_.objectSid = (New-Object Security.Principal.SecurityIdentifier($_.objectSid, 0)).Value
} Else {
$_.objectSid = ""
}
}; if ($valuesToFixFound[6]) {
if ($_.msExchMasterAccountSid) {
$_.msExchMasterAccountSid = (New-Object Security.Principal.SecurityIdentifier($_.msExchMasterAccountSid, 0)).Value
} Else {
$_.msExchMasterAccountSid = ""
}
}; If ($extraPropertyValuesFound[0]) {
If (!($_.'Domain Name')) {
$_.'Domain Name' = $domainShortName
}
};$_}
[System.GC]::Collect() | Out-Null
$sortedTableColumns = $tableFixedValues | Select-Object $userProperties
[System.GC]::Collect() | Out-Null
return $sortedTableColumns
} -args $adSearchFilter, $userProperties, $DcWithCred, $domainShortName
[System.GC]::Collect() | Out-Null
Write-Host "Search Complete."
Write-Host ""
if ($searchDomainResultsTable)
{
$queryTable += $searchDomainResultsTable
}
} # End ForEach Loop
Write-Host 'Exporting domain search results to table...'
Write-Output $queryTable
}
I thought about doing something like $adSearchFilter += "($searchPropertyName=$searchPropertyValue)". However due to the 10mb limit - What is the LDAP filter string length limit in Active Directory?, I'm not sure if this would be the best method while looking up 200,000++ users.
Does anyone know a way to pass a list instead of 1 string value per search?
LDAP doesn't have a -contains-like statement, but you can use the OR operator (|) to construct a filter expression that matches multiple exact values:
(|(samaccountname=user1)(samaccountname=user2)(samaccountname=user3))
This is how I would build the filter string:
$FilterTemplate = '(&(objectCategory=User)(samAccountType:1.2.840.113556.1.4.803:=805306368){0})'
$ClauseTemplate = "($searchPropertyName={0})"
$AllClauses = $searchPropertyValues |ForEach-Object { $ClauseTemplate -f $_ }
$adSearchFilter = $FilterTemplate -f $($AllClauses -join '')
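For example (my illustration), with two sample values the code above produces:
$searchPropertyName = 'samaccountname'
$searchPropertyValues = 'user1', 'user2'
# $adSearchFilter becomes:
# (&(objectCategory=User)(samAccountType:1.2.840.113556.1.4.803:=805306368)(|(samaccountname=user1)(samaccountname=user2)))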
That being said, why would you pass 200000 specific values to search for in a single search? LDAP supports wildcard matching (eg. (samaccountname=*)).
In any case, you could calculate the final size of your string, by calling Encoding.GetByteCount on the biggest string in $AllClauses, and then use that to partition the array (let's cap it at 9.5 MB to be on the safe side):
$LongestString = $AllClauses |Sort -Property Length |Select -Last 1
$LongestByteCount = [System.Text.Encoding]::Unicode.GetByteCount($LongestString)
if(($LongestByteCount * $AllClauses.Count) -gt 9.5MB)
{
$MaxCount = [int](9.5MB / $LongestByteCount)
for($i = 0; $i -lt $AllClauses.Count; $i += $MaxCount)
{
$ClauseSubset = $AllClauses[$i..$($i + $MaxCount - 1)]
$adSearchFilter = $FilterTemplate -f $($ClauseSubset -join '')
# Do your search
}
}
else
{
# everything fits in a single filter
$adSearchFilter = $FilterTemplate -f $($AllClauses -join '')
# Do your search
}
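To round this out, a hedged sketch (my addition) of what "# Do your search" could look like for each chunk, reusing $adDomain and $userProperties from the question's code:
$allResults = @()
$adSearcher = New-Object System.DirectoryServices.DirectorySearcher($adDomain)
$adSearcher.PageSize = 1000
$adSearcher.PropertiesToLoad.AddRange($userProperties) | Out-Null
$adSearcher.Filter = $adSearchFilter   # the filter built for this chunk
$results = $adSearcher.FindAll()
try {
    foreach ($record in $results) {
        $allResults += $record   # or convert to a hashtable as in the question's code
    }
} finally {
    $results.Dispose()   # SearchResultCollection leaks unmanaged memory otherwise
}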