Active Directory transient errors when bulk-using ActiveDirectory cmdlets - PowerShell

I have a system which synchronizes users from a data source. The data source consists of user information. When a new user is synchronized, a PowerShell task is triggered which creates or updates the user. This all works fine, but when the number of new/updated users becomes too large, some of the tasks fail with interesting errors such as:
"The server has returned the following error: invalid enumeration
context."
or
"A connection to the directory on which to process the request was
unavailable. This is likely a transient condition."
When troubleshooting, it seems obvious that the reason these errors occur is a lack of resources: all the simultaneously triggered tasks are importing the module in their own PS session.
So I tried some different things and measured Import-Module speed, etc. I've concluded that it is faster to run Import-Module and then, for instance, Get-ADUser, than to run just Get-ADUser (which would also import the module implicitly).
Measure-Command {Import-Module ActiveDirectory}
Average time 340 ms
Measure-Command {Get-ADUser -Filter *}
Average time 420 ms
Get-ADUser after the module is imported
Average time 10 ms
But these marginal differences are not going to do anything about the issue, so I had to look further. I found that disabling the default drive might help speed up the import, so I've added the following before importing the module:
$Env:ADPS_LoadDefaultDrive = 0
Measure-Command {Import-Module ActiveDirectory}
Average time 85 ms
4 times faster! But the error still persists when large numbers of users are processed at the same time (e.g. 50 tasks). So I thought about polling availability in the script, or making a do..while retry loop. Or maybe the system which fires the separate tasks needs to be redesigned to have some sort of queue.
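For the do..while idea, I was picturing something like this rough sketch (the retry count and back-off are arbitrary, and $samAccountName stands in for whatever identity the task receives):
# Rough sketch of the retry idea: back off and retry when the cmdlet throws,
# instead of failing the whole task on the first transient error.
$maxAttempts = 5
$attempt = 0
do {
    $attempt++
    try {
        $user = Get-ADUser -Identity $samAccountName -ErrorAction Stop
        $done = $true
    }
    catch {
        $done = $false
        Start-Sleep -Seconds (5 * $attempt)   # simple linear back-off
    }
} while (-not $done -and $attempt -lt $maxAttempts)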
Does anyone recognize this situation? Or have some thoughts they'd like to share on this subject?

It is because all the simultaneously triggered tasks are importing the module on their own PS session.
Then you need to make sure this doesn't happen, or at least not so much that you run out of resources. So you have two options:
Limit the number of tasks that get run at any one time (maybe 5 at a time).
Make it one task that can work on several accounts. This way the module only gets loaded once.
I think option 2 is the better solution. For example, rather than triggering the script right away, your synchronization job could just write the username to a file (or in memory even) and once it's done finding all the users, it triggers the PowerShell script and passes the list (or the script could read the file that was written to). You have options there - whatever works best.
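A rough sketch of option 2, assuming the synchronization job writes one username per line to a queue file (the path and the create/update bodies are placeholders for your own provisioning code):
# One task drains the whole queue, so the ActiveDirectory module loads only once.
$Env:ADPS_LoadDefaultDrive = 0
Import-Module ActiveDirectory

$queueFile = 'C:\Sync\pending-users.txt'   # hypothetical file written by the sync job

Get-Content $queueFile | ForEach-Object {
    $sam = $_.Trim()
    if (-not $sam) { return }   # skip blank lines

    $existing = Get-ADUser -Filter "SamAccountName -eq '$sam'" -ErrorAction SilentlyContinue
    if ($existing) {
        # update logic here, e.g. Set-ADUser
    }
    else {
        # create logic here, e.g. New-ADUser
    }
}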
Update: All of .NET is available in PowerShell, so another option is to change the whole script to use .NET's DirectoryEntry instead of the ActiveDirectory module, which would use much less memory. There are even shortcuts for it in PowerShell. For example, [ADSI]"LDAP://$distinguishedName" would create a DirectoryEntry object for a user. It is a substantial rewrite, but the performance is much better (speed and memory consumption).
Here are some examples of searching with DirectorySearcher (using the [ADSISearcher] shortcut): https://blogs.technet.microsoft.com/heyscriptingguy/2010/08/23/use-the-directorysearcher-net-class-and-powershell-to-search-active-directory/
And here's an example of creating accounts with DirectoryEntry: https://www.petri.com/creating-active-directory-user-accounts-adsi-powershell
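For a sense of what the rewrite looks like, here is a minimal sketch of the search/update/create pattern with those accelerators (the OU, names and attributes are made up):
# Search with DirectorySearcher via the [ADSISearcher] accelerator.
$searcher = [ADSISearcher]'(&(objectCategory=person)(objectClass=user)(sAMAccountName=jsmith))'
$result = $searcher.FindOne()

if ($result) {
    # Bind to the existing account as a DirectoryEntry and update an attribute.
    $user = $result.GetDirectoryEntry()
    $user.Put('title', 'Analyst')
    $user.SetInfo()
}
else {
    # Create a new account under a (made-up) OU.
    $ou  = [ADSI]'LDAP://OU=Staff,DC=example,DC=com'
    $new = $ou.Create('user', 'CN=John Smith')
    $new.Put('sAMAccountName', 'jsmith')
    $new.SetInfo()
}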

Related

Error Get-ADUser: Invalid enumeration context

I currently need to export the groups of all users in my domain, and I get the error with Get-ADUser: "Invalid enumeration context". In the script I indicate an OU that has 1300 users, and it only brings back approximately 900. How could I solve it? I attach the script.
You are hitting the 2-minute time limit. I do not believe there is a way to adjust the timeout, but you can filter:
Get-ADObject -Filter 'CN -like "*a*"'
Repeat for a through z. That will give you everyone. You could probably come up with a more creative filter, but this is the answer.
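A quick sketch of that loop (the OU is made up; a starts-with filter avoids duplicates, though names that don't start with a letter would need an extra pass):
# Query in a-z slices so each enumeration finishes well inside the 2-minute limit.
$results = foreach ($code in ([int][char]'a')..([int][char]'z')) {
    $letter = [char]$code
    Get-ADUser -SearchBase 'OU=MyUsers,DC=example,DC=com' `
               -Filter "CN -like '$letter*'" -Properties memberOf
}
$results.Count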
https://learn.microsoft.com/en-us/previous-versions/windows/server/hh531527(v=ws.10)?redirectedfrom=MSDN
From Microsoft:
Timeout Behavior
The following statements specify timeout conditions within the Active Directory module and describe what can be done about a timeout.
The default Active Directory module timeout for all operations is 2 minutes.
For search operations, the Active Directory module uses paging control with a 2-minute timeout for each page search.
Note: Because a search may involve multiple server page requests, the overall search time may exceed 2 minutes.
A TimeoutException error indicates that a timeout has occurred.
For a search operation, you can choose to use a smaller page size, set with the ResultPageSize parameter, if you are getting a TimeoutException error.
If after trying these changes you are still getting a TimeoutException error, consider optimizing your filter using the guidance in the Optimizing Filters section of this topic.
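Putting that last point into practice is just one extra parameter; a sketch (the page size and output path are arbitrary):
# A smaller page means each server round-trip has less work to finish inside the timeout.
Get-ADUser -Filter * -ResultPageSize 100 -Properties memberOf |
    Select-Object SamAccountName, memberOf |
    Export-Csv -Path 'C:\Temp\user-groups.csv' -NoTypeInformation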

Powershell launch at specific times

I am building a system to restart computers for patch purposes. Most of the skeleton is there and working, I use workflows and some functions to allow me to capture errors and reboot the systems in a number of ways in case of failures.
One thing I am not sure of is how to set up the timing. I am working on a web interface where people can schedule their reboots, either dynamic (one-time) or regularly scheduled (monthly). The server info and times for the boots is stored in a SQL database.
The part that I am missing is how to trigger the reboots when scheduled. All I can think of is allowing for whole-hour increments and running a script hourly that checks whether any servers in the db have a reboot time that "matches" the current time. This will likely work, but it is somewhat inflexible.
Can anyone think of a better way? Some sort of daemon?
For instance, user X has 300 servers assigned to him. He wants 200 rebooted at 10 PM on each Friday, and 50 once a month on Saturday at 11 PM. There will be over a dozen users rebooting 3000-4000 computers, sometimes multiple times monthly.
Ok, let's say you have a script that takes a date and time as an argument that will look up what computers to reboot based on that specific date and time, or schedule, or whatever it is that you're storing in your sql db that specifies how often to reboot things. For the sake of me not really knowing sql that well, we'll pretend that this is functional (would require the PowerShell Community Extensions snapin for Invoke-AdoCommand cmdlet):
[cmdletbinding()]
Param([string]$RebootTime)

# Quote the value, since Schedule is being compared as a string.
$ConStr = 'Data Source=SQLSrv01;Database=RebootTracking;Integrated Security=true;'
$Query  = "Select * from Table1 Where Schedule = '$RebootTime'"

# Invoke-AdoCommand comes from the PowerShell Community Extensions snapin.
$Data = Invoke-AdoCommand -ProviderName SqlClient -ConnectionString $ConStr -CommandText $Query

$Data | ForEach-Object { <# do things to shut down $_.ServerName #> }
You said you already have things set up to reboot the servers, so I didn't even really try there. Then all you have to do is set up a scheduled job for whenever any server is supposed to be rebooted:
Register-ScheduledJob -FilePath '\\Srv01\Scripts\RebootServers.ps1' -Trigger @{Frequency="Weekly"; At="10:00PM"; DaysOfWeek="Saturday"; Interval=4} -ArgumentList "Day=Saturday;Weeks=4;Time=22:00"
That's an example, but you know, you could probably work with it to accomplish your needs. That will run it once every 4 Saturdays (about monthly). That one scheduled task will query the sql server to match up a string against a field, and really you could format that any way you want to make it as specific or general as desired for the match. That way one task could reboot those 200 servers, and another could reboot the other 50 all dependent on the user's desires.
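If the hash-table trigger syntax feels fragile, the same schedule can also be built with New-JobTrigger; a sketch (the job name is made up, the rest mirrors the example above):
# Weekly trigger: every 4th Saturday at 10 PM.
$trigger = New-JobTrigger -Weekly -DaysOfWeek Saturday -WeeksInterval 4 -At '10:00 PM'

Register-ScheduledJob -Name 'RebootBatch-Saturday' `
    -FilePath '\\Srv01\Scripts\RebootServers.ps1' `
    -Trigger $trigger `
    -ArgumentList 'Day=Saturday;Weeks=4;Time=22:00'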

managing instances of powerCLI script

I wrote a powerCLI script that can automatically deploy a new VM with some given parameters.
In few words, the script connects to a given VC and start the deployment from an existing template.
Can I regulate the number of instances of my script that will run on the same computer?
Can I regulate the number of instances of my script that will run on different computers when both instances are connected to the same VC?
To resolve the issue I thought of developing a server-side application that each instance of my script would connect to, and the server would then handle all the instances, but I am not sure if such a thing is possible in PowerCLI/PowerShell.
Virtually anything is poshable, or so they say. What you're describing may be overkill, however, depending on your scenario. Multiple instances of the same script will each run in its own Powershell process. Virtual Center allows hundreds of simultaneous connections. Of course the content or context of your script might dictate that it shouldn't run in simultaneous instances. I haven't experimented, but it seems like there are ways to determine the name of running Powershell scripts. So if you keep the script name consistent on each computer, you could probably build in some checks along the lines of the linked answer.
But depending on your particulars, it might be easier to go a different way. For example, if you don't want the script to run simultaneously because you have hard-coded the name of a New-OSCustomizationSpec, a simple/kludgey solution might be to do a check for that new spec, and disconnect/exit/roll back if it exists. A better solution might be to give the new spec a unique name. But the devil is in the details. Hope that helps a bit.
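If it turns out you do need to enforce a single running instance per computer, one way to sketch that check is a named mutex (the mutex name here is arbitrary):
# Acquire a machine-wide named mutex; exit if another instance already owns it.
$createdNew = $false
$mutex = New-Object System.Threading.Mutex($true, 'Global\DeployVMScript', [ref]$createdNew)
if (-not $createdNew) {
    $mutex.Dispose()
    Write-Warning 'Another instance of this deployment script is already running; exiting.'
    return
}
try {
    # ... connect to the VC and run the deployment here ...
}
finally {
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}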

Discrepancy in Date Created attribute between Powershell script output and Windows Explorer

I wrote a simple powershell script that recursively walks a file tree and returns the paths of each node along with the time of its creation in tab-separated form, so that I can write it out to a text file and use it to do statistical analysis:
echo "PATH CREATEDATE"
get-childitem -recurse | foreach-object {
$filepath = $_.FullName
$datecreated = $_.CreationTime
echo "$filepath $datecreated"
}
Once I had done this, however, I noticed that the CreationDate times that get produced by the script are exactly one hour ahead of what Windows Explorer says when I look at the same attribute of the same files. Based on inspecting the rest of my dataset (which recorded surrounding events in a different format), it's clear that the results I get from explorer are the only ones that fit the overall narrative, which leads me to believe that there's something wrong with the Powershell script that makes it write out the incorrect time. Does anyone have a sense for why that might be?
Problem background:
I'm trying to correct for a problem in the design of some XML log files, which logged when the users started and stopped using an application when it was actually supposed to log how long it took the users to get through different stages of the workflow. I found a possible way to overcome this problem, by pulling date information from some backup files that the users sent along with the XML logs. The backups are generated by our end-user application at the exact moment when a user transitions between stages in the workflow, so I'm trying to bring information from those files' timestamps together with the contents of the original XML log to figure out what I wanted to know about the workflow steps.
Summary of points that have come out in comment discussion:
The files are located on the same machine as the script I'm running (not a network store)
Correcting for daylight savings and time zones improved the data quality, but did not resolve the specific issue posed in the original question (a quick check for DST involvement is sketched below).
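One quick way to check whether DST/time-zone conversion is involved is to compare the local and UTC creation times side by side (a diagnostic sketch, not part of the original script):
# A DST-sized discrepancy shows up as an unexpected offset between the two values.
Get-ChildItem -Recurse -File | Select-Object FullName,
    CreationTime,
    CreationTimeUtc,
    @{Name = 'OffsetHours'; Expression = { ($_.CreationTime - $_.CreationTimeUtc).TotalHours }}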
I never found the ultimate technical reason for the discrepancy between the timestamps from powershell vs. explorer, but I was able to correct for it by just subtracting an hour off all the timestamps I got from the powershell script.
After doing that, however, there was still a large amount of disagreement between the timestamps I got out of my XML log files and the ones I pulled from the filesystem using the powershell script. Reasoning that the end-users probably stayed in the same time zone when they were generating the files, I wrote a little algorithm to estimate the time zone of each user by evaluating the median amount of time between steps 1 and 2 in the workflow and steps 2 and 3. If there was a problem with the user's time zone, one of those two timespans would be negative (since the time of the step 2 event was estimated and the times of the steps 1 and 3 events were known from the XML logs).
I then rounded the positive value down to the nearest hour and applied that number of hours as an offset to that user's step 2 times. Overall, this took the amount of bad data in my dataset from 20% down to 0.01%, so I'm happy with the results.
In case anyone needs it, here's the code I used to make the hour offset in the timestamps (not powershell code, this was in a C# script that handled another part of data processing):
// Parse the backup file's timestamp, then rebuild it with the hour shifted back by one.
DateTime step2time = DateTime.Parse(LastModifyDate);
TimeSpan shenanigansCorrection = new TimeSpan(step2time.Hour - 1, step2time.Minute, step2time.Second);
step2time = step2time.Date + shenanigansCorrection;
The reason for redefining the step2time variable is that DateTimes aren't mutable in .NET.

Remotely running Get-EventLog wipes actual event log

I wrote a script to remotely fetch event logs in PowerShell, but I'm not a fan of the way the script makes its own event log entries.
Specifically this script grabs several types of event IDs including the logon/logoff events. Recently I was asked to run the script to fetch events for another user, and had to have this data fetched in a few hours. Normally I start the script and let it run for the majority of the day (because there is usually a LOT of data here), but this time to speed up the process, I spun up 4 instances of the script to fetch this data faster than usual. Each instance of the script was looking at a different time frame so that all 4 scripts combined were fetching in the time frame I had been asked for.
Over the course of 3 hours or so, I had over a million different login attempts for my user ID on this remote computer. I had so many logins that I ended up overwriting the event log data I was originally asked to fetch.
Lessons learned, I'm now researching how to make this faster, more efficient, and more reliable.
Here's the heart of my code, pretty plain and simple, and works for the most part.
$output = Get-EventLog `
-instanceID 4672,4647,4634,4624,4625,4648,4675,4800,4801,4802,4803 `
-logName Security `
-after (Get-Date $inStartDate) `
-before (Get-Date $inEndDate).addDays(1) `
-message ("*" + $InUserID + "*") `
-computerName $inPCID
I guess there are several questions that I haven't been able to figure out in my research thus far. Why would Get-EventLog need to make so many connections? Is it because the connection kept dropping or something?
What would be the fastest way to fetch this data: using the native Get-EventLog command by specifying a -ComputerName, or should I be using something like Enter-PSSession or Invoke-Command?
Will Enter-PSSession and Invoke-Command both have the same problem I'm having with Get-EventLog?
I'd like to avoid using Enter-PSSession and Invoke-Command for the simple fact that I can't guarantee all machines in the company will have remote-execution enabled.
So, the problem was that Get-EventLog was ultimately wiping the remote event logs I was trying to fetch. Not sure why, but Get-EventLog made over a million "connections" that appeared as logon/logoff events, thus overwriting my logs.
Per @Richard's comment, I did a bit of research and decided to test using Get-WinEvent, the updated and more powerful replacement for Get-EventLog. After testing with this for a week, the worst case scenario I ran into was my script creating 4 logon/logoff events. That is completely acceptable and much nicer than over a million events.
A side question I had related to this topic was performance. Because we're gathering many remote logs, we sometimes need this done as fast as possible. My question is whether or not Get-WinEvent would be fast enough to pull logs, when compared to an Enter-PSSession or an Invoke-Command.
For the scope of this question, Get-WinEvent satisfied both the speed requirements as well as the event requirements and relying on the cmdlet to perform my remoting worked great.
My code is pretty simple, but I'll post it for record reasons:
Get-WinEvent -ComputerName $inPCID -FilterHashtable @{
        LogName   = "Security"
        Id        = 4672,4647,4634,4624,4625,4648,4675,4800,4801,4802,4803
        StartTime = $inStartDate
        EndTime   = $inEndDate
    } |
    Where-Object { $_.Message -like ("*" + $InUserID + "*") } |
    Export-Csv $fullOutputFile
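For anyone who can rely on remoting, the equivalent Invoke-Command form would look roughly like this (a sketch; it assumes WinRM is enabled on the target, which I couldn't guarantee here):
# Filter on the remote machine, then serialize only the matching events back.
Invoke-Command -ComputerName $inPCID -ScriptBlock {
    param($start, $end, $userId)
    Get-WinEvent -FilterHashtable @{
        LogName   = 'Security'
        Id        = 4672,4647,4634,4624,4625,4648,4675,4800,4801,4802,4803
        StartTime = $start
        EndTime   = $end
    } | Where-Object { $_.Message -like "*$userId*" }
} -ArgumentList $inStartDate, $inEndDate, $InUserID |
    Export-Csv $fullOutputFile -NoTypeInformation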