When are PowerShell data sections evaluated?
Specifically, are they only ever evaluated once at the point of runtime definition/loading? Or are they evaluated on every execution of the containing function, even if it has already been defined/loaded?
I'm assuming that the containing context is a function or advanced function that will be called multiple times in a single session after being defined/loaded, rather than a script file that would have to be reloaded on every invocation (as far as I understand, anyway).
Script to test for both questions:
(Get-Date).TimeOfDay.ToString()
Start-Sleep -Milliseconds 100
DATA dat -SupportedCommand Get-Date {
    Get-Date
}
Start-Sleep -Milliseconds 100
(Get-Date).TimeOfDay.ToString()
Start-Sleep -Milliseconds 100
$dat.TimeOfDay.ToString()
Results (note that the time on the second line is the latest):
12:21:23.3191254
12:21:23.5393705
12:21:23.4306211
From this we can conclude that:
The data section is evaluated immediately, not lazily
The data section is evaluated only once, not on every use
Data sections would be much more useful if we had control over these mechanics, for example reading a large text file only when needed, or refreshing a variable on every access.
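You can approximate both of those mechanics today with a scriptblock, which is not evaluated at definition time and is re-evaluated on every invocation. A minimal sketch, with `Get-Date` standing in for the expensive work (e.g. reading a large file):

```powershell
# A scriptblock is lazy: nothing runs at assignment time.
$dat = { Get-Date }              # definition only; not evaluated yet

$first = & $dat                  # evaluated now
Start-Sleep -Milliseconds 50
$second = & $dat                 # evaluated again -- a fresh value on every access

($second - $first).TotalMilliseconds   # positive: the two calls produced different times
```

Unlike a DATA section, a scriptblock is not restricted to data-only syntax, so this trades the safety guarantees of DATA for control over when evaluation happens.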
I am writing an autologin script in PowerShell. Its main purpose is to perform autologon with keystrokes on remote clients in our environment after installation, with the desired AD account and password entered.
It works fine on my i9, but most people use tablets and EliteBooks, so using
Thread.Sleep
works badly, since I would need custom timing for every hardware model, or very high default values to cover the lower-end clients running my script.
Is there any way of adding a "wait for the row above to complete" before continuing to the next?
I don't have enough of your current code to give a more accurate answer, but the idea, in all cases, remains the same.
You should periodically wake up the thread to check whether or not the machine is in the state you want it in and from there, you either go back to sleep or exit the loop and continue.
The delay is up to you, but you want to find a sweet spot that balances performance and reactivity.
Example (based on your description)
$IsLoggedIn = $false
while (-not $IsLoggedIn) {
    # Replace the $false below with custom logic returning $true once the user is logged in
    $IsLoggedIn = $false
    if ($IsLoggedIn) { break }
    Start-Sleep -Milliseconds 100
}
You just need to figure out which check validates that the computer is in the state you need it in before proceeding further.
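As a concrete sketch, the polling idea can be wrapped in a reusable helper with a timeout, so a hung client doesn't spin forever. The `Wait-Until` name is my own, and the `explorer.exe` check at the end is only a hypothetical stand-in for "user is logged in"; substitute whatever signal fits your environment:

```powershell
function Wait-Until {
    param(
        [scriptblock]$Condition,       # should return $true when the desired state is reached
        [int]$TimeoutSeconds   = 30,   # give up after this long
        [int]$PollMilliseconds = 100   # the "sweet spot" delay between checks
    )
    $deadline = [datetime]::Now.AddSeconds($TimeoutSeconds)
    while ([datetime]::Now -lt $deadline) {
        if (& $Condition) { return $true }
        Start-Sleep -Milliseconds $PollMilliseconds
    }
    return $false                      # timed out
}

# Hypothetical check: treat a running explorer.exe process as "user is logged in".
Wait-Until -Condition { [bool](Get-Process -Name explorer -ErrorAction SilentlyContinue) } -TimeoutSeconds 5
```

The timeout returns `$false` rather than throwing, so the caller decides whether a timeout is fatal.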
I have the model which I posted before on Stack Overflow. I am currently running the iterations through 5 flowchart blocks, each containing an enter block and a service block. When agents fill service block 5 in flowchart 5, the exit block should start to fill block one, and so on. I have used an infinite while loop to cycle between the five flowchart blocks, but it isn't working.
while (true) {
    for (Curing_Drying currProcess : collection) {
        if (currProcess.allowedDay == (int) time(DAY)) {
            currProcess.enter.take(agent);
        }
    }
    if (queue10.size() <= Throughtput1) {
        break;
    }
}
Image for further illustration 1
Image for further illustration 2
Wondering if someone can tell me what is wrong in the code.
Based on the description and the pictures provided, it isn't clear why the while loop is necessary. The "On exit" action is executed for each agent arrival at the Exit block. It seems that the intention is to find the appropriate Curing_Drying block based on the number of days since the model start time? If so, then just iterating through the collection is enough.
Also, it is generally good practice to give collections more meaningful names. Simply calling it collection doesn't say anything about its contents and can get pretty confusing later on.
I expected the following code to complete in about one second.
It executes in about 20 seconds:
$i = 0; do { Start-Sleep -Milliseconds 1; $i++ } while ($i -lt 1000)
Could you please suggest why? I can't find any clues in the docs.
Thanks in advance!
Calling a cmdlet comes at a cost. Just because you use Start-Sleep -Milliseconds 1 doesn't mean it's going to take 1 ms: the cmdlet has overhead it needs to take care of behind the scenes, like setting up the timer, instantiating objects, etc.
Measure-Command { Start-Sleep -Milliseconds 1 }
# TotalMilliseconds : 25.1157
See the above... even though I told it to only run for 1 ms, it still took 25 ms because of the overhead. This overhead won't be exactly the same every time, but you should always expect there to be some.
On my computer, it seems to average about 16ms of overhead per call. So if you run that 1000 times, then on average, it's going to take 16 seconds to run, just for the sleep alone.
I obtained the average by running this a few times:
Measure-Command { 1..100 | % { Start-Sleep -Milliseconds 1 } }
It's like driving a car. You don't just hop in a car and go, you need to start it up first, and there's things going on behind the scenes the car needs to do in order to start. And that takes a little bit of time.
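If you really need waits close to millisecond granularity, one workaround is to busy-wait on a Stopwatch instead of calling Start-Sleep a thousand times: you pay the setup cost once and then spin, at the price of occupying a CPU core for the duration. A sketch:

```powershell
# Busy-wait ~1 second on a Stopwatch. No per-iteration cmdlet overhead,
# but a CPU core spins for the whole second.
$sw = [System.Diagnostics.Stopwatch]::StartNew()
while ($sw.ElapsedMilliseconds -lt 1000) { }
$sw.Stop()
$sw.ElapsedMilliseconds    # ~1000, instead of the ~20000 the Start-Sleep loop took
```

For waits where a few extra milliseconds don't matter, plain Start-Sleep remains the right tool; this trick only makes sense for short, precision-sensitive delays.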
I have a file where I keep a serialized Perl data structure (written with Storable). In my current script, I load the values like this:
use Storable qw(retrieve);   # retrieve() comes from the Storable module

my $arrayref = retrieve("mySerializedFile");
my $a = $arrayref->[0];     # note: $a and $b are also used by sort(), so other names are safer
my $b = $arrayref->[1];
my $c = $arrayref->[2];
My problem is that the file is about 1 GB, so it takes about ten seconds to load, and then one more second to perform some operations. I would like to reduce the retrieve time.
Is there any way of having this info loaded before the script executes? I mean, mySerializedFile is not supposed to change for a long time, so if I could have it always loaded on the system, that would be nice and would improve my execution time from 11 seconds to 1.
Following the suggestions in the comments, I used a DB engine, which improved the execution time A LOT; it is about 5 seconds now.
I have a strange problem with my PowerShell script: I want to measure the average time to download a page. I wrote a script which fires frequently, but sometimes it returns a result of 0, which would mean it downloaded the site in 0 ms. And if I modify the script to save the whole site to a file whenever the download time is about 0 ms, it doesn't save anything. I'm wondering whether I'm doing something wrong, or whether the PowerShell function simply isn't accurate enough to measure such "small" times.
P.S. The other, "good" results are about 4-9 ms.
Here is the part of my script responsible for measuring the download time:
$StartTime = Get-Date
$PageDownload = $Request.DownloadString("mypage.com")
$TimeTaken = ((Get-Date) - $StartTime).TotalMilliseconds
Get-Date can only be as precise as the system clock, which on Windows typically advances in steps of roughly 10-15 ms; intervals shorter than one tick can therefore round down to 0.
There could also be web caching going on. Unfortunately, from what I can find, disabling caching for WebClient is not possible. The "do it right" method is to construct your own HTTP request with the TcpClient class, but that's also pretty complex.
One easy way to make sure you're not being cached is to append an arbitrary value as a GET parameter. It's a hack, but it is often enough to fool a cache. So, instead of:
"http://mypage.com"
You use:
"http://mypage.com?someUnusedValueName=$([System.Environment]::TickCount)"
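Putting both points together, here is a sketch that times the download with a Stopwatch (which has a much finer resolution than Get-Date) and appends the throwaway cache-busting parameter; `mypage.com` stands in for the real URL, and `nocache` is an arbitrary, unused parameter name:

```powershell
$Request = New-Object System.Net.WebClient

# Cache-buster: the extra, unused query parameter makes every request URL unique.
$Url = "http://mypage.com?nocache=$([System.Environment]::TickCount)"

# Stopwatch resolution is far finer than Get-Date, which only advances
# on system clock ticks, so sub-millisecond times no longer collapse to 0.
$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$PageDownload = $Request.DownloadString($Url)
$StopWatch.Stop()
$TimeTaken = $StopWatch.Elapsed.TotalMilliseconds
```

If the "0 ms" results disappear once the cache-buster is in place, caching was the culprit; if they merely become small non-zero numbers, it was the Get-Date resolution.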