DSC to Configure an IIS App Pool - powershell

We've been researching desired state configuration, and I was asked to set up a prototype using PowerShell DSC to configure an IIS app pool. I know all the steps to creating a configuration; I'm just unsure of what I might put in my configuration. I plan to use the xWebAdministration module because it has resources like xWebAppPool and xWebsite. Are there any suggestions on what else I might use to set this up?

If you have many sites that you are trying to bring under configuration control, you could use my DSC generator to produce the DSC for IIS Features, Web sites, app pools, and virtual directories.
I then use Octopus Deploy to deliver the DSC to the server and apply it.
https://github.com/kevinsea/dsc-generator
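For completeness, applying the generated DSC on a node boils down to compiling the configuration to a MOF and pushing it. A generic sketch in plain push mode (the configuration name and paths are hypothetical, and this isn't the Octopus step itself):
# Compile the (hypothetical) generated configuration to a MOF, then push it to the node
WebAppConfig -OutputPath 'C:\DSC\WebAppConfig'
Start-DscConfiguration -Path 'C:\DSC\WebAppConfig' -Wait -Verbose -Force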

You would probably use the WindowsFeature resource to install the roles and features needed (Web-Server, etc.), the File resource to create the site directory and perhaps copy the site's files, the Registry resource to enable the Web Management Service so that you can manage IIS remotely, and then the Service resource to start that service.
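To make that concrete, here's a minimal sketch of such a configuration. The node name, paths, and the 'MyApp' site/pool names are placeholders of my own, and it assumes the xWebAdministration module is installed alongside the built-in resources:
Configuration BasicIisSite
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName xWebAdministration

    Node 'localhost'
    {
        # Install IIS and the Web Management Service feature
        WindowsFeature IIS
        {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
        WindowsFeature WebMgmtService
        {
            Ensure    = 'Present'
            Name      = 'Web-Mgmt-Service'
            DependsOn = '[WindowsFeature]IIS'
        }
        # Create the site directory (add SourcePath/Recurse here to copy the site's files)
        File SiteFolder
        {
            Ensure          = 'Present'
            Type            = 'Directory'
            DestinationPath = 'C:\inetpub\wwwroot\MyApp'
        }
        # Flip the registry value that enables remote management, then make sure WMSVC runs
        Registry EnableRemoteMgmt
        {
            Ensure    = 'Present'
            Key       = 'HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\WebManagement\Server'
            ValueName = 'EnableRemoteManagement'
            ValueData = '1'
            ValueType = 'Dword'
            DependsOn = '[WindowsFeature]WebMgmtService'
        }
        Service WMSvc
        {
            Name        = 'WMSVC'
            StartupType = 'Automatic'
            State       = 'Running'
            DependsOn   = '[Registry]EnableRemoteMgmt'
        }
        # App pool and site from xWebAdministration
        xWebAppPool MyAppPool
        {
            Name      = 'MyAppPool'
            Ensure    = 'Present'
            State     = 'Started'
            DependsOn = '[WindowsFeature]IIS'
        }
        xWebsite MyApp
        {
            Name            = 'MyApp'
            Ensure          = 'Present'
            PhysicalPath    = 'C:\inetpub\wwwroot\MyApp'
            ApplicationPool = 'MyAppPool'
            State           = 'Started'
            DependsOn       = '[xWebAppPool]MyAppPool', '[File]SiteFolder'
        }
    }
}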

Some time ago I was tasked with exactly the same challenge, and I created a prototype DSC resource to accomplish it. After initial tests it now runs in production.
The source code is on GitHub (https://github.com/RafPe/cWebAdmin) and all feedback would be more than welcome :)
Maybe this will give you an idea of how to approach this on your end.

*I realize this is quite old, but I just stumbled across it and wanted to offer recommendations for the scenario where your DSC configurations get big/unwieldy.
I'd start off by separating your node definition file from your configuration file, then reference it from the configuration. I keep my files side by side, so I'd reference it like below. I won't go into encryption; there's good MS guidance on that.
$File = 'WebApp\NodeDefinitions.psd1'
$Parent = Split-Path -Parent $PSScriptRoot
$Path = Join-Path $Parent $File
WebApp -RunAs $RunAs -ConfigurationData "$Path" -OutputPath $localpath -verbose
Now, to make things a little more dynamic, you'd create arrays or hashtable collections in your node definition file, like below, for folders, sites, etc. *I did have to alter xCertificateDsc to have it pass an array of accounts, which isn't difficult if you need an assist.
@{
NodeName = "Vm-Web-1"
Role = "DevWeb","Web","WS","SMP"
IUSRS = "Domain\User1$","Domain\User2$"
Folders = @(
("Dir1","F:\inetpub\wwwroot\SomeApp","Present"),
("Dir2","F:\inetpub\wwwroot\SomeApp2","Present"),
("Dir3","F:\inetpub\wwwroot\SomeApp2","Absent") #If something's moved or a mistake
)
Sites = @(
@("MyApp1","Domain\User1$","Present"),
@("MyApp2","Domain\User2$","Present")
)
CertPerms = @{
"somecert@domain.com" = @("Domain\User1$","Domain\User2$")
}
},
Then I introduce non-node data in my configuration data so that the sites are more portable; we'll join it to the node data later in the configuration .ps1 file. I'll also show how I'd iterate folder creation.
@{
Name = "MyApp1"
PoolConfigName = "ApMyApp1"
PoolName = "ApMyApp1"
PoolRtVer = "v4.0"
SiteConfigName = "WaMyApp1"
SitePath = "ApMyApp1"
SiteName = "ApMyApp1"
SiteDepends = "[File]Dir1"
},
@{
Name = "MyApp2"
PoolConfigName = "ApMyApp2"
PoolName = "ApMyApp2"
PoolRtVer = "v4.0"
SiteConfigName = "WaMyApp2"
SitePath = "ApMyApp2"
SiteName = "ApMyApp2"
SiteDepends = "[File]Dir2"
},
To iterate over folders, you just reference the elements of your array in the node definition file:
Foreach($Folder in $Node.Folders){
File $Folder[0]
{
Ensure = $Folder[2]
Type = "Directory"
DestinationPath = $Folder[1]
}
}
The site joining is a little more complex, and I'm not overly happy with referencing array elements instead of easier-to-read names, but it works. After joining the node and non-node data, the next part of the script handles lesser-used parameters. We don't introduce a lot of pools in always-running mode, for example (unless proper page initialization is validated in the web.config of the correlating app). If I'm introducing a site and the correlating service account isn't active on the domain yet, I'll make sure the correlating pool is stopped so it doesn't flood the IIS worker process. Otherwise you should be able to map out what's set in the node section (array references) versus the non-node site data.
Foreach($SiteName in $Node.Sites){
$Site = $ConfigurationData.Sites.Where{$_.Name -eq $SiteName[0]}
if ([string]::IsNullOrWhiteSpace($Site.PoolIdleTO))
{
$PoolIdleTO = 20
}
else
{
$PoolIdleTO = $Site.PoolIdleTO
}
if ([string]::IsNullOrWhiteSpace($Site.PoolStartMode))
{
$PoolStartMode = "OnDemand"
}
else
{
$PoolStartMode = $Site.PoolStartMode
}
if ([string]::IsNullOrWhiteSpace($SiteName[3]))
{
$State = "Started"
}
else
{
$State = $SiteName[3]
}
xWebAppPool $Site.PoolConfigName
{
Name = $Site.PoolName
Ensure = $SiteName[2]
State = $State
autoStart = $true
enable32BitAppOnWin64 = $false
enableConfigurationOverride = $true
managedPipelineMode = "Integrated"
managedRuntimeVersion = $Site.PoolRtVer
startMode = $PoolStartMode
queueLength = 1000
cpuAction = "KillW3wp"
cpuLimit = 95000
cpuResetInterval = (New-TimeSpan -Minutes 1).ToString()
cpuSmpAffinitized = $false
cpuSmpProcessorAffinityMask = 4294967295
cpuSmpProcessorAffinityMask2 = 4294967295
identityType = 'SpecificUser'
Credential = New-Object System.Management.Automation.PSCredential($SiteName[1], (ConvertTo-SecureString $Node.GmsaPwd.ToString() -AsPlainText -Force))
idleTimeout = (New-TimeSpan -Minutes $PoolIdleTO).ToString()
idleTimeoutAction = 'Suspend'
loadUserProfile = $false
logEventOnProcessModel = 'IdleTimeout'
logonType = 'LogonBatch'
manualGroupMembership = $false
maxProcesses = 1
pingingEnabled = $true
pingInterval = (New-TimeSpan -Seconds 30).ToString()
pingResponseTime = (New-TimeSpan -Seconds 90).ToString()
setProfileEnvironment = $false
shutdownTimeLimit = (New-TimeSpan -Seconds 90).ToString()
startupTimeLimit = (New-TimeSpan -Seconds 90).ToString()
orphanActionExe = ''
orphanActionParams = ''
orphanWorkerProcess = $false
loadBalancerCapabilities = 'HttpLevel'
rapidFailProtection = $true
rapidFailProtectionInterval = (New-TimeSpan -Minutes 1).ToString()
rapidFailProtectionMaxCrashes = 5
autoShutdownExe = 'C:\Windows\System32\iisreset.exe'
autoShutdownParams = ''
disallowOverlappingRotation = $false
disallowRotationOnConfigChange = $false
logEventOnRecycle = 'Time,Requests,Schedule,Memory,IsapiUnhealthy,OnDemand,ConfigChange,PrivateMemory'
restartMemoryLimit = 3221225472
restartPrivateMemoryLimit = 8000000
restartRequestsLimit = 20000000
restartTimeLimit = (New-TimeSpan -Minutes 0).ToString()
restartSchedule = "00:00:00"
DependsOn = '[WindowsFeature]IIS'
}
<#!!!Imperative method (runs immediately) to ensure service accounts get IIS metabase access!!#
#Need to move this into the function with a flag, obviously there'll be looping challenges...
Invoke-Command -Session (New-PSSession -ComputerName $Node.NodeName -Credential $RunAs -SessionOption (New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck)) -ScriptBlock {
param ([string] $User)
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe -ga $User
} -ArgumentList $SiteName[1]
#>
cIisAccess "IisMetabaseAccess$($Site.SiteConfigName + $SiteName[1])"
{
Account = $SiteName[1]
Ensure="Present"
}
xWebApplication $Site.SiteConfigName
{
Website = "Default Web Site"
Name = $Site.SiteName
WebAppPool = $Site.PoolName
PhysicalPath = $Node.DfSitePath + $Site.SitePath
Ensure = $SiteName[2]
PreloadEnabled = $true
DependsOn = "[xWebAppPool]$($Site.PoolConfigName)",$($Site.SiteDepends)
}
}
One thing to note is that I use gMSAs, so there's no password and a Kerberos token is used instead. You'd still need a phony password reference due to the PSCredential requirement, so you can just add something to your AllNodes data like below for reference:
GmsaPwd = "none"
There's also lots of guidance on using roles out there, but a simple ref is below.
Node $AllNodes.Where{$_.Role -contains "Web"}.NodeName
{
#Embed site/folder iterators here if preferred
}

Related

Basic question about using PowerShell to modify user assignments in SSRS

I have some experience in PowerShell, but I don't have experience using it to automate SQL Server Reporting Services. Basically, I want to assign a user a role on a particular report object in SSRS. I have found the following code in
SSRS: How to assign multiple users a role to a report quickly?
It seems a good start for creating my script.
function Add-SSRSUserRole
(
[string]$reportServerUrl,[string]$userGroup,[string]$requiredRole,[string]$folder,[bool]$inheritFromParent
)
{
#Ensure we stop on errors
$ErrorActionPreference = "Stop";
#Connect to the SSRS webservice
$ssrs = New-WebServiceProxy -Uri "$reportServerUrl" -UseDefaultCredential;
$namespace = $ssrs.GetType().Namespace;
$changesMade = $false;
#Look for a matching policy
$policies = $ssrs.GetPolicies($folder, [ref]$inheritFromParent);
if ($policies.GroupUserName -contains $userGroup)
{
Write-Host "User/Group already exists. Using existing policy.";
$policy = $policies | where {$_.GroupUserName -eq $userGroup} | Select -First 1 ;
}
else
{
#A policy for the User/Group needs to be created
Write-Host "User/Group was not found. Creating new policy.";
$policy = New-Object -TypeName ($namespace + '.Policy');
$policy.GroupUserName = $userGroup;
$policy.Roles = @();
$policies += $policy;
$changesMade = $true;
}
#Now we have the policy, look for a matching role
$roles = $policy.Roles;
if (($roles.Name -contains $requiredRole) -eq $false)
{
#A role for the policy needs to added
Write-Host "Policy doesn't contain specified role. Adding.";
$role = New-Object -TypeName ($namespace + '.Role');
$role.Name = $requiredRole;
$policy.Roles += $role;
$changesMade = $true;
}
else
{
Write-Host "Policy already contains specified role. No changes required.";
}
#If changes were made...
if ($changesMade)
{
#...save them to SSRS
Write-Host "Saving changes to SSRS.";
$ssrs.SetPolicies($folder, $policies);
}
Write-Host "Complete.";
}
[string]$url = "http://localhost/ReportServer/ReportService2006.asmx?wsdl";
Add-SSRSUserRole $url "Everyone" "Browser" "/MyReportFolder" $true;
Add-SSRSUserRole $url "Domain\User" "Browser" "/MyReportFolder" $true;
Now I have two elementary questions:
Do I need any SSRS modules to be installed in my PowerShell in order to run the above script?
The sample code above assigns permissions to a folder. What changes are required if I want to assign permissions to a report object directly instead?
Thanks for your response in advance,

Get the list of azure web app settings to be swapped using PowerShell

When we perform the swap through the Azure portal, it gives us warning and informative messages about the settings to be swapped, just like in the image below:
My question is, is there any way I can get these messages list (regarding the settings) through PowerShell code?
I tried googling it but couldn't find any way.
The direct solution would be to use Invoke-RestMethod to call the API captured in the Portal. The problem is that this is a non-public API, and we don't know whether it has changed.
Alternatively, you could use PowerShell to get the two objects to be swapped, get their appSettings and ConnectionStrings, and compare them.
The following is a script for reference.
When Source and Destination are different, the script can get:
• Settings to be added to the Destination
• Settings to be removed from the Destination
• Settings to be swapped
$rsgName = "xxxxx"
$appName = "xxxxx"
$slotName = "xxxxxx"
$destination = Get-AzureRmWebApp -ResourceGroupName $rsgName -Name $appName
$destinationAppSettings = $destination.SiteConfig.AppSettings
$destinationConnectionStrings = $destination.SiteConfig.ConnectionStrings
$source = Get-AzureRmWebAppSlot -ResourceGroupName $rsgName -Name $appName -Slot $slotName
$sourceAppSettings = $source.SiteConfig.AppSettings
$sourceConnectionStrings = $source.SiteConfig.ConnectionStrings
#Get slot configurations
$slotConfigure = Get-AzureRmWebAppSlotConfigName -ResourceGroupName $rsgName -Name $appName
$toBeAdded = New-Object System.Collections.ArrayList
$toBeSwapped = New-Object System.Collections.ArrayList
$toBeDeleted = New-Object System.Collections.ArrayList
foreach($appSetting in $sourceAppSettings){
if(-not $slotConfigure.AppSettingNames.Contains($appSetting.Name)){
$flag = $true
foreach($_appSetting in $destinationAppSettings){
if($_appSetting.Name -eq $appSetting.Name){
$flag = $false
[void]$toBeSwapped.Add([pscustomobject]@{Name = $appSetting.Name; Source = $appSetting.Value; Destination = $_appSetting.Value})
}
}
if($flag){
[void]$toBeAdded.Add($appSetting)
}
}
}
foreach($appSetting in $destinationAppSettings){
$flag = $true
foreach($_appSetting in $sourceAppSettings){
if($_appSetting.Name -eq $appSetting.Name){
$flag = $false
}
}
if($flag){
[void]$toBeDeleted.Add($appSetting)
}
}
# AppSettings
# To be added to destination
$toBeAdded
# To be swapped to destination
$toBeSwapped
# To be deleted in destination
$toBeDeleted
$toBeAdded = New-Object System.Collections.ArrayList
$toBeSwapped = New-Object System.Collections.ArrayList
$toBeDeleted = New-Object System.Collections.ArrayList
foreach($connectionString in $sourceConnectionStrings){
if(-not $slotConfigure.ConnectionStringNames.Contains($connectionString.Name)){
$flag = $true
foreach($_connectionString in $destinationConnectionStrings){
if($_connectionString.Name -eq $connectionString.Name){
$flag = $false
[void]$toBeSwapped.Add([pscustomobject]@{Name = $connectionString.Name; Source = $connectionString.Value; Destination = $_connectionString.Value})
}
}
if($flag){
[void]$toBeAdded.Add($connectionString)
}
}
}
foreach($connectionString in $destinationConnectionStrings){
$flag = $true
foreach($_connectionString in $sourceConnectionStrings){
if($_connectionString.Name -eq $connectionString.Name){
$flag = $false
}
}
if($flag){
[void]$toBeDeleted.Add($connectionString)
}
}
# ConnectionStrings
# To be added to destination
$toBeAdded
# To be swapped to destination
$toBeSwapped
# To be deleted in destination
$toBeDeleted
Hope it helps you.

Azure New-AzureRmRecoveryServicesBackupProtectionPolicy fails with error "The specified resource does not exist"

I'm attempting to enable IaaS VM backup in Azure using a Recovery Services Vault, and it fails when attempting to create a new Protection Policy using New-AzureRmRecoveryServicesBackupProtectionPolicy.
The script has worked for a previous subscription and VMs, so I'm unclear why it doesn't work for this subscription. I've run New-AzureRmRecoveryServicesBackupProtectionPolicy -Debug, which returns the additional information below; unfortunately that's not enough to highlight and resolve the problem either:
"error": {
"code": "InvalidRestApiParameter",
"message": "stampId parameter is invalid.\r\nPlease provide a valid stampId",
"target": null,
"details": null,
"innerError": null
}
Here's the code which attempts to create the Protection Policy:
# Create Retention Policy object. Has to be modified from existing 'default' values provided by Azure
$RetPol = Get-AzureRmRecoveryServicesBackupRetentionPolicyObject -WorkloadType "AzureVM"
$BackupTime = (Get-Date).ToUniversalTime().Date.AddHours(23)
$Day = $true
$DayTime = $BackupTime
$DayRet = 7
$Week = $true
$WeekDay = 'Sunday'
$WeekTime = $BackupTime
$WeekRet = 5
$Month = $true
$MonthType = 'Daily'
$MonthTime = $BackupTime
$MonthDay = New-Object -TypeName PSObject -Property @{
Date = 0;
IsLast = $true;
}
$MonthRet = 3
$Year = $false
$RetPol.IsDailyScheduleEnabled = $Day
$RetPol.DailySchedule.DurationCountInDays = $DayRet
$RetPol.DailySchedule.RetentionTimes[0] = $DayTime
$RetPol.IsWeeklyScheduleEnabled = $Week
$RetPol.WeeklySchedule.DaysOfTheWeek = $WeekDay
$RetPol.WeeklySchedule.DurationCountInWeeks = $WeekRet
$RetPol.WeeklySchedule.RetentionTimes[0] = $WeekTime
$RetPol.IsMonthlyScheduleEnabled = $Month
$RetPol.MonthlySchedule.RetentionScheduleFormatType = $MonthType
$RetPol.MonthlySchedule.RetentionScheduleDaily.DaysOfTheMonth = $MonthDay
$RetPol.MonthlySchedule.DurationCountInMonths = $MonthRet
$RetPol.MonthlySchedule.RetentionScheduleWeekly = $null
$RetPol.MonthlySchedule.RetentionTimes[0] = $MonthTime
$RetPol.IsYearlyScheduleEnabled = $Year
$RetPol.YearlySchedule = $null
# Create Schedule Policy object.
$SchPol = Get-AzureRmRecoveryServicesBackupSchedulePolicyObject -WorkloadType "AzureVM"
$SchPol.ScheduleRunFrequency = "Daily"
$SchPol.ScheduleRunDays = $null
$SchPol.ScheduleRunTimes[0] = $BackupTime
#Create the new Backup Policy
$BackupPolicy = New-AzureRmRecoveryServicesBackupProtectionPolicy -WorkloadType AzureVM -Name 'MyPolicy' -RetentionPolicy $RetPol -SchedulePolicy $SchPol
Any help or thoughts greatly appreciated.
TL;DR: Deleted and re-created the Recovery Services Vault via PowerShell
Full Description
It turned out that something had previously gone wrong with the creation of the Recovery Services Vault, which wasn't clear when I'd run New-AzureRmRecoveryServicesVault but had resulted in a broken view of the RSV when opened via the Portal.
The RSV wouldn't delete via the portal, so I had to use Remove-AzureRmRecoveryServicesVault to remove it. I then re-created it, which resolved the error and allowed me to back up the VMs.
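For reference, a rough sketch of that delete/re-create step (the vault, resource group, and location names are placeholders I've made up):
# Hypothetical names; substitute your own vault, resource group, and location
$vault = Get-AzureRmRecoveryServicesVault -ResourceGroupName 'MyResourceGroup' -Name 'MyVault'
# Remove the broken vault (it must no longer contain protected items)
Remove-AzureRmRecoveryServicesVault -Vault $vault
# Re-create it and set the vault context before re-running the policy script
$vault = New-AzureRmRecoveryServicesVault -ResourceGroupName 'MyResourceGroup' -Name 'MyVault' -Location 'West Europe'
Set-AzureRmRecoveryServicesVaultContext -Vault $vault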

EWS Powershell Exchange 2013 FindFolders returns 0 results

I am currently trying to make this script run on Exchange 2013 to convert folder types from IPF.Imap to IPF.Note, as the folders are not showing on mobile devices after being imported from IMAP. The script returns 0 results and multiple "Folder Doesn't Exist" messages. If I output the folder names they come through fine, so I am not sure why FindFolders is not returning any results.
I tried turning on impersonation (commented out here) but get an error saying I do not have permission to impersonate, even though I am logged in as administrator and running PowerShell as admin. I am not sure this is even necessary, as the script works fine and returns the folder names for both $mbxfolder.Name and $SfSearchFilter, but once it hits the FindFolders line the TotalCount is always 0.
Import-Module -Name "C:\Program Files\Microsoft\Exchange\Web Services\1.2\Microsoft.Exchange.WebServices.dll"
$exchService = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService
$exchService.UseDefaultCredentials = $true
$exchService.AutodiscoverUrl('email@domain.com', {$true})
$MBXID = "email@domain.com" #Define mailboxID
foreach ($MailboxIdentity in $MBXID) {
Write-Host "Searching for $MailboxIdentity"
$mailbox = (Get-Mailbox -Identity $MailboxIdentity)
$MailboxName = (Get-Mailbox -Identity $MailboxIdentity).PrimarySmtpAddress.ToString()
$MailboxRootid = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Root,$MailboxName) #MsgFolderRoot selection and creation of new root
$MailboxRoot = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($exchService,$MailboxRootid)
#$exchService.ImpersonatedUserId = New-Object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId -ArgumentList ([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress),$MailboxName #Define impersonation
$folderid = $MailboxRootid
$f1 = $MailboxRoot
$fold = get-mailboxfolderstatistics $MailboxIdentity #Getting complete list of selected mailbox
foreach ($mbxfolder in $fold){
#Define Folder View Really only want to return one object
$fvFolderView = new-object Microsoft.Exchange.WebServices.Data.FolderView(100) #page size for displayed folders
$fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep; #Search traversal selection Deep = recursively
#Define a Search folder that is going to do a search based on the DisplayName of the folder
$SfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo([Microsoft.Exchange.WebServices.Data.FolderSchema]::Displayname,$MBXFolder.name) #for each folder in mailbox define search
$findFolderResults = $MailboxRoot.FindFolders($SfSearchFilter,$fvFolderView) #for each folder in mailbox define folder view (this is online task for store.exe) and perform search
if ($findFolderResults.TotalCount -eq 0){ "Folder Doesn't Exist" } #Info if folder still exist
else {"Folder Exist"
ForEach ($Folder in $findFolderResults.Folders) { #for each folder in folder results perform check of folder class
$folder.folderclass #Info about folder class
if ($Folder.folderclass -eq "IPF.Imap"){ #If folder class is target type, do change and update
$Folder.folderclass = "IPF.Note" #Folder class change in variable
Write-Host "Updating folder $folder.name to correct type IPF.Note. Folder will start to be visible in OWA"
$Folder.update() #Folder class update in mailbox via EWS
}
}
}
}
}
It doesn't really make much sense to enumerate the folders using Get-MailboxFolderStatistics and then search for each folder in EWS. That is going to be really slow and unnecessary (you have the FolderId from Get-MailboxFolderStatistics anyway, so you could just convert that and bind to it). However, I would get rid of Get-MailboxFolderStatistics altogether and just use straight EWS to enumerate the folders in the mailbox and do your fixes, as this will be much quicker, e.g.:
## Get the Mailbox to Access from the 1st commandline argument
$MailboxName = $args[0]
## Load Managed API dll
###CHECK FOR EWS MANAGED API, IF PRESENT IMPORT THE HIGHEST VERSION EWS DLL, ELSE EXIT
$EWSDLL = (($(Get-ItemProperty -ErrorAction SilentlyContinue -Path Registry::$(Get-ChildItem -ErrorAction SilentlyContinue -Path 'Registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Exchange\Web Services'|Sort-Object Name -Descending| Select-Object -First 1 -ExpandProperty Name)).'Install Directory') + "Microsoft.Exchange.WebServices.dll")
if (Test-Path $EWSDLL)
{
Import-Module $EWSDLL
}
else
{
"$(get-date -format yyyyMMddHHmmss):"
"This script requires the EWS Managed API 1.2 or later."
"Please download and install the current version of the EWS Managed API from"
"http://go.microsoft.com/fwlink/?LinkId=255472"
""
"Exiting Script."
exit
}
## Set Exchange Version
$ExchangeVersion = [Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2007_SP1
## Create Exchange Service Object
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService($ExchangeVersion)
## Set Credentials to use. Two options are available: Option 1 uses explicit credentials, Option 2 uses the default (logged-on) credentials
#Credentials Option 1 using UPN for the windows Account
$psCred = Get-Credential
$creds = New-Object System.Net.NetworkCredential($psCred.UserName.ToString(),$psCred.GetNetworkCredential().password.ToString())
$service.Credentials = $creds
#$service.TraceEnabled = $true
#Credentials Option 2
#$service.UseDefaultCredentials = $true
## Choose to ignore any SSL Warning issues caused by Self Signed Certificates
## Code From http://poshcode.org/624
## Create a compilation environment
$Provider=New-Object Microsoft.CSharp.CSharpCodeProvider
$Compiler=$Provider.CreateCompiler()
$Params=New-Object System.CodeDom.Compiler.CompilerParameters
$Params.GenerateExecutable=$False
$Params.GenerateInMemory=$True
$Params.IncludeDebugInformation=$False
$Params.ReferencedAssemblies.Add("System.DLL") | Out-Null
$TASource=@'
namespace Local.ToolkitExtensions.Net.CertificatePolicy{
public class TrustAll : System.Net.ICertificatePolicy {
public TrustAll() {
}
public bool CheckValidationResult(System.Net.ServicePoint sp,
System.Security.Cryptography.X509Certificates.X509Certificate cert,
System.Net.WebRequest req, int problem) {
return true;
}
}
}
'@
$TAResults=$Provider.CompileAssemblyFromSource($Params,$TASource)
$TAAssembly=$TAResults.CompiledAssembly
## We now create an instance of the TrustAll and attach it to the ServicePointManager
$TrustAll=$TAAssembly.CreateInstance("Local.ToolkitExtensions.Net.CertificatePolicy.TrustAll")
[System.Net.ServicePointManager]::CertificatePolicy=$TrustAll
## end code from http://poshcode.org/624
## Set the URL of the CAS (Client Access Server) to use. Two options are available: use Autodiscover to find the CAS URL, or hardcode the CAS to use
#CAS URL Option 1 Autodiscover
$service.AutodiscoverUrl($MailboxName,{$true})
"Using CAS Server : " + $Service.url
#CAS URL Option 2 Hardcoded
#$uri=[system.URI] "https://casservername/ews/exchange.asmx"
#$service.Url = $uri
## Optional section for Exchange Impersonation
#$service.ImpersonatedUserId = new-object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress, $MailboxName)
#Define Function to convert String to FolderPath
function ConvertToString($ipInputString){
$Val1Text = ""
for ($clInt=0;$clInt -lt $ipInputString.length;$clInt++){
$Val1Text = $Val1Text + [Convert]::ToString([Convert]::ToChar([Convert]::ToInt32($ipInputString.Substring($clInt,2),16)))
$clInt++
}
return $Val1Text
}
#Define Extended properties
$PR_FOLDER_TYPE = new-object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(13825,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Integer);
$folderidcnt = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot,$MailboxName)
#Define the FolderView used for Export; it should not be any larger than 1000 folders due to throttling
$fvFolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(1000)
#Deep Traversal will ensure all folders in the search path are returned
$fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep;
$psPropertySet = new-object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
$PR_Folder_Path = new-object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(26293, [Microsoft.Exchange.WebServices.Data.MapiPropertyType]::String);
#Add Properties to the Property Set
$psPropertySet.Add($PR_Folder_Path);
$fvFolderView.PropertySet = $psPropertySet;
#The Search filter will exclude any Search Folders
$sfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo($PR_FOLDER_TYPE,"1")
$fiResult = $null
#The Do loop will handle any paging that is required if there are more than 1000 folders in a mailbox
do {
$fiResult = $Service.FindFolders($folderidcnt,$sfSearchFilter,$fvFolderView)
foreach($ffFolder in $fiResult.Folders){
$foldpathval = $null
#Try to get the FolderPath Value and then convert it to a usable String
if ($ffFolder.TryGetProperty($PR_Folder_Path,[ref] $foldpathval))
{
$binarry = [Text.Encoding]::UTF8.GetBytes($foldpathval)
$hexArr = $binarry | ForEach-Object { $_.ToString("X2") }
$hexString = $hexArr -join ''
$hexString = $hexString.Replace("FEFF", "5C00")
$fpath = ConvertToString($hexString)
}
"FolderPath : " + $fpath
"Folder Class : " + $ffFolder.FolderClass
}
$fvFolderView.Offset += $fiResult.Folders.Count
}while($fiResult.MoreAvailable -eq $true)
Cheers
Glen

Powershell Bulk Find ActiveDirectory Objects

I'm trying to develop a PowerShell script to help with AD group membership management. We have a handful of large groups (30k-60k+ objects) that we want to update with data from another system.
The script loads the objects that should be in the group from a text file. Each object then has to be located in AD using a System.DirectoryServices.DirectorySearcher. After that, each object is added to the group membership.
The script spends some 80% of its time looking up each object. Is there a bulk way to find objects in AD with PowerShell?
Thanks!
This is the fastest way to query AD that I have found in my experience. You need to change the query to find specific objects; in this code you'll find all user/person objects in $objRecordSet.
$Ads_Scope_SubTree = 2
$objConnection = new-Object -com "ADODB.Connection"
$objCommand = new-Object -com "ADODB.Command"
$objConnection.Provider = "ADsDSOObject"
$objConnection.Open( "Active Directory Provider")
$objCommand.ActiveConnection = $objConnection
$objCommand.Properties.Item("Page Size").value = 1000
$objCommand.Properties.item("Searchscope").value = $Ads_Scope_SubTree
$objCommand.CommandText = "Select Name From 'LDAP://DC=int,DC=my,DC=local' Where objectCategory = 'Person'"
$objRecordSet = $objCommand.Execute()
$objRecordSet.RecordCount
More info here
You could perhaps try System.DirectoryServices.Protocols (S.DS.P); it wraps the native (unmanaged) LDAP API and is quite efficient.
Here is a PowerShell starting script:
# ADDP-Connect.PS1
Clear-Host
# Add the needed assemblies
Add-Type -AssemblyName System.DirectoryServices.Protocols
# Connection
$serverName = "WM2008R2ENT"
$ADDPConnect = New-Object System.DirectoryServices.Protocols.LdapConnection $serverName
$userName = "JPB"
$pwd = "PWD"
$domain = "Dom"
$ADDPConnect.Credential = New-Object system.Net.NetworkCredential -ArgumentList $userName,$pwd,$domain
# Create a searcher
$searchTargetOU = "dc=dom,dc=fr"
$searchFilter = "(samAccountName=user1)"
$searchScope = [System.DirectoryServices.Protocols.SearchScope]::Subtree
$searchAttrList = $null
foreach($user in "user1","user2","user3")
{
$searchFilter = "(samAccountName=$user)"
$searchRequest = New-Object System.DirectoryServices.Protocols.SearchRequest -ArgumentList $searchTargetOU,$searchFilter,$searchScope,$searchAttrList
$searchResponse = $ADDPConnect.SendRequest($searchRequest)
foreach($searchEntries in $searchResponse.Entries)
{
$searchEntries.DistinguishedName
}
}
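If the goal is to cut round trips for many accounts, you can also batch several of them into a single LDAP OR filter and send one request. A small sketch building on the same connection and variables as above (the user names are placeholders):
# Build one (|(...)(...)) filter instead of sending one search per account
$users = "user1","user2","user3"
$orFilter = "(|" + (($users | ForEach-Object { "(samAccountName=$_)" }) -join "") + ")"
$searchRequest = New-Object System.DirectoryServices.Protocols.SearchRequest -ArgumentList $searchTargetOU,$orFilter,$searchScope,$searchAttrList
$searchResponse = $ADDPConnect.SendRequest($searchRequest)
foreach($searchEntry in $searchResponse.Entries)
{
$searchEntry.DistinguishedName
}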
If you start seeing timeout issues, set the timeout property appropriately, as shown below:
$ADDPConnect = New-Object System.DirectoryServices.Protocols.LdapConnection $serverName
$ADDPConnect.Timeout = New-TimeSpan -Seconds 1000 # Timeout is a TimeSpan; assuming seconds was the intent (a bare "1000" string would parse as 1000 days)