Calculating private working set memory in Perl

I'm basically trying to query the private working set of a process in Perl.
I have already referred to this post.
The solution works great on Windows 8/8.1, but for some reason the following line returns nothing on Windows 7 x64. I have tried using IDProcess instead of Name; it still returns no process.
my $proc = $objWMI->ExecQuery("select * from Win32_PerfRawData_PerfProc_Process where Name='notepad'");
Here is the full snippet:
use strict;
use warnings;
use Win32::OLE qw(in);   # in() iterates OLE collections

my $objWMI = Win32::OLE->GetObject('winmgmts:\\\\.\\root\\cimv2');
my $proc   = $objWMI->ExecQuery("select * from Win32_PerfRawData_PerfProc_Process where Name='notepad'");
foreach my $process (in($proc))
{
    print "abc";   # debug output: never reached on Windows 7
    my $out = $process->{WorkingSetPrivate};
}
So this approach doesn't work.
Is there a different way of querying the private working set size of a process in Perl?

It seems that on Windows 7 64-bit the performance counters were corrupted. I finally found this post: Corrupt Performance Counter.
So I started cmd as admin and ran lodctr /R (this basically resets your performance counters). After this, I was finally able to get the process, and the following code worked flawlessly :)
my $proc = $objWMI->ExecQuery("select * from Win32_PerfRawData_PerfProc_Process where Name='notepad'");
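For completeness, a minimal self-contained sketch of the working approach (assuming notepad.exe is running; this raw-counter class reports WorkingSetPrivate in bytes):

use strict;
use warnings;
use Win32::OLE qw(in);

my $objWMI = Win32::OLE->GetObject('winmgmts:\\\\.\\root\\cimv2');
my $proc   = $objWMI->ExecQuery(
    "select * from Win32_PerfRawData_PerfProc_Process where Name='notepad'");

foreach my $process (in($proc)) {
    # WorkingSetPrivate is in bytes; show it in KB for readability
    printf "%s: private working set = %d KB\n",
        $process->{Name}, $process->{WorkingSetPrivate} / 1024;
}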

Related

PowerShell StreamReader - how to wait for a new file to be readable

My script generally assumes the existence of a *.txt file with settings that help it function better. However, if the file doesn't exist, the script creates a local file to hold these settings. I realise there's no logical need to then read this file, but I'd like to understand why I can't.
[void][System.IO.File]::Create($PSFileName)
$ReadPS = New-Object System.IO.StreamReader($PSFileName)
On the rare occasions when the script has just created the file, the attempt to read it immediately afterwards generates the following error:
New-Object : Exception calling ".ctor" with "1" argument(s): "The process cannot access the file 'C:\Temp\MyFile.txt' because it is being used by another process."
So I have to wait for the file to be available, right? Yet a simple Start-Sleep for 5 seconds doesn't work. But if I wrap it in a loop with a try-catch, it works within a fraction of a second every time:
[void][System.IO.File]::Create($PSFileName)
$RCount = 0 # if a new file was created, it sometimes takes a while for the lock to be released
Do {
    try {
        $ReadPS = New-Object System.IO.StreamReader($PSFileName)
        $RCount += 100 # success: force the loop to exit
    } catch {
        # error encountered while setting up the StreamReader; try again, up to 100 times
        $RCount++
        Start-Sleep -Milliseconds 1 # wait before retrying; 50 ms seemed the sweet spot for fewest loops at fastest performance
    }
} Until ($RCount -ge 100)
$ReadPS.Close()
$ReadPS.Dispose()
This is overly convoluted. Why does the file stay locked for an arbitrary length of time that seems to increase the more I wait for it? Is there anything I can adjust or add between the file creation and the StreamReader to ensure the file is available?
As was already mentioned in the comments, the method you are using does create a lock on the file, which stays until you call the Close / Dispose method or the PowerShell session ends.
That's why the longer you wait, the longer your session stays open and the longer the lock on the file is maintained.
I'd recommend just using New-Item instead, which is the PowerShell-native way to do it.
Since you are creating a StreamReader object, though, don't forget to close / dispose of it once you are done:
New-Item -Path $PSFileName -ItemType File
$ReadPS = New-Object System.IO.StreamReader($PSFileName)
#Stuff
$ReadPS.Close()
$ReadPS.Dispose()
Finally, if for some reason you still want to use [System.IO.File]::Create($PSFileName), you will also need to call the Close method to free the lock.
You simply have to close the file handle. Try:
$fh = [System.IO.File]::Create($PSFileName)
[void]$fh.Close()
[void]$fh.Dispose()
$ReadPS = New-Object System.IO.StreamReader($PSFileName)
The Create method returns a FileStream object, and StreamReader has a constructor that accepts a Stream, so PowerShell can convert one to the other. My solution was to cast the result to a StreamReader. Almost a one-liner:
$PSFileName = 'c:\temp\testfile.txt'
$Stream = [System.IO.StreamReader][System.IO.File]::Create($PSFileName)
Or, a suggestion from Jeroen Mostert:
$PSFileName = 'c:\temp\testfile.txt'
$Stream = [System.IO.StreamReader]::New( [System.IO.File]::Create($PSFileName) )
You don't have to worry about garbage collection with this approach, because the resulting object is referenced by the variable...
Honestly, I'm not too sure about this. I believe the FileStream object can be used directly to read and write, but I'm less familiar with it than with the StreamReader and StreamWriter objects, so if it were me I'd do the cast so I can move on, and research further later.
Also, if you use another approach, I would use .Close() instead of .Dispose(). My understanding, based on the .NET documentation, is that Close is more thorough and calls Dispose internally anyhow...

Loading a serialized variable in Perl

I have a file in which I keep a serialized Perl data structure. In my current script, I load the values like this:
use Storable qw(retrieve);   # retrieve() comes from the Storable module

my $arrayref = retrieve("mySerializedFile");
my $a = $arrayref->[0];
my $b = $arrayref->[1];
my $c = $arrayref->[2];
My problem is that the file is about 1 GB, so it takes about ten seconds to load, and then one more second to perform some operations. I would like to reduce the retrieve time.
Is there any way of having this info loaded before the script executes? I mean, mySerializedFile is not supposed to change for a long time, so if I could have it always loaded on the system, that would be nice and would improve my execution time from 11 seconds to 1.
Following the suggestions in the comments, I used a DB engine, which improved the execution time a lot; it is about 5 seconds now.
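The answer doesn't say which engine was used. As one illustration of the idea, a DBM-style on-disk store such as DB_File lets the script fetch individual values on demand instead of deserializing the whole 1 GB structure up front (the file name and keys here are hypothetical):

use strict;
use warnings;
use Fcntl;
use DB_File;

# Tie a hash to an on-disk Berkeley DB file; lookups hit the disk
# lazily, so startup no longer pays for the full 1 GB load.
tie my %data, 'DB_File', 'mydata.db', O_RDONLY, 0644, $DB_HASH
    or die "Cannot open mydata.db: $!";

my $first  = $data{first_key};   # only the requested records are read
my $second = $data{second_key};

untie %data;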

PowerShell download page in 0 milliseconds

I have a strange problem with my script in PowerShell: I want to measure the average time it takes to download a page. I wrote a script which fires frequently, but sometimes it returns 0, which would mean it downloaded the site in 0 ms. If I modify my script to save the whole site to a file whenever the download time is about 0 ms, it doesn't save anything. I'm wondering whether I'm doing something wrong, or whether the PowerShell function isn't accurate enough to measure such "small" times.
P.S. Other "good" results are about 4-9 ms.
Here is the part of my script responsible for measuring the download time:
$StartTime = Get-Date
$PageDownload = $Request.DownloadString("http://mypage.com") # $Request is an existing System.Net.WebClient instance
$TimeTaken = ((Get-Date) - $StartTime).TotalMilliseconds
Get-Date should be as precise as the system clock is.
There could be web caching going on. Unfortunately, disabling caching for WebClient is not possible, from what I see elsewhere. The "do it right" method is to construct your own HTTP request with the TcpClient class, but that's also pretty complex.
One easy way to make sure you're not being cached is to append an arbitrary value as a GET parameter. It's a hack, but it is often enough to fool a cache. So, instead of:
"http://mypage.com"
You use:
"http://mypage.com?someUnusedValueName=$([System.Environment]::TickCount)"

How can I validate an image file in Perl?

How would I validate that a .jpg file is a valid image file? We have files written to a directory via FTP, but we seem to be picking up the files before they have finished writing, creating invalid images. I need to be able to identify when a file is no longer being written to. Any ideas?
Easiest way might just be to write the file to a temporary directory and then move it to the real directory after the write is finished.
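A sketch of that pattern in Perl (the paths are hypothetical; the key point is that on the same filesystem a move is an atomic rename, so readers never see a partial file):

use strict;
use warnings;
use File::Copy qw(move);

my $staging = '/ftp/tmp/image.jpg';      # hypothetical upload target
my $final   = '/ftp/images/image.jpg';   # hypothetical publish path

# On the same filesystem, move() is an atomic rename; across
# filesystems it falls back to copy + unlink and loses the atomicity.
move($staging, $final) or die "move failed: $!";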
Or you could check here.
JPEG::Error
[arguments: none] If the file reference remains undefined after a call to new, the file is to be considered not parseable by this module, and one should issue some error message and go to another file. An error message explaining the reason of the failure can be retrieved with the Error method:
EDIT:
Image::TestJPG might be even better.
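If you'd rather avoid an extra module, here is a minimal heuristic sketch. The assumption: a complete JPEG starts with the SOI marker (FF D8) and ends with the EOI marker (FF D9), so a truncated upload usually fails the tail check:

use strict;
use warnings;
use Fcntl qw(SEEK_END);

sub looks_like_complete_jpeg {
    my ($path) = @_;
    open my $fh, '<:raw', $path or return 0;
    read($fh, my $head, 2) == 2 or return 0;   # first two bytes: SOI
    seek($fh, -2, SEEK_END)     or return 0;   # jump to the last two bytes
    read($fh, my $tail, 2) == 2 or return 0;   # last two bytes: EOI
    close $fh;
    return $head eq "\xFF\xD8" && $tail eq "\xFF\xD9";
}

Note this only catches truncation; a file can end in FF D9 and still be corrupt, which is why the modules above do a real parse.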
You're solving the wrong problem, I think.
What you should be doing is figuring out how to tell when whatever FTPd you're using is done writing the file - that way when you come to have the same problem for (say) GIFs, DOCs or MPEGs, you don't have to fix it again.
Precisely how you do that depends rather crucially on which FTPd you're using and which OS you're running it on. Some do, I believe, have hooks you can set to trigger when an upload's done.
If you can run your own FTPd, Net::FTPServer or POE::Component::Server::FTP are customizable to do the right thing.
In the absence of that:
1) try tailing the logs with a Perl script that looks for 'upload complete' messages
2) use something like lsof or fuser to check whether anything still has the file open before you try to copy it (see the sketch below).
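A minimal sketch of option 2, assuming a Unix-like system with the fuser utility installed (the path is hypothetical):

use strict;
use warnings;

# fuser exits 0 when at least one process has the file open,
# non-zero otherwise; -s suppresses its normal output.
sub file_in_use {
    my ($path) = @_;
    system('fuser', '-s', $path);
    return $? == 0;
}

my $upload = '/var/ftp/incoming/image.jpg';   # hypothetical upload path
unless ( file_in_use($upload) ) {
    # no process holds the file open; safe to copy or move it
}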
Again looking at the FTP issue rather than the JPG issue.
I check the timestamp on the file to make sure it hasn't been modified in the last X (here, 5) minutes - that way I can be reasonably sure the upload has finished.
# time in seconds that the file was last modified
my $last_modified = (stat("$path/$file"))[9];
# current time in seconds since the epoch (i.e. 1970)
my $epoch_time = time();
# only proceed if the file has NOT been modified in the last 5 minutes;
# a recent modification means it is probably still uploading
unless ( $last_modified >= ($epoch_time - 300) ) {
    # move / edit or whatever
}
I had something similar come up once, more or less what I did was:
# poll the file size until it stops changing between checks
# (-s returns the size in bytes)
my $old_size = 0;
my $current_size;
while ( ($current_size = -s $image_file) != $old_size ) {
    $old_size = $current_size;
    sleep 10;
}
process_image($image_file);   # placeholder for the actual processing
Have the FTP process set the readonly flag, then only work with files that have the readonly flag set.
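A sketch of that check in Perl (assumption: the FTP process sets the read-only attribute once an upload completes; Perl's -w test is false on a read-only file):

use strict;
use warnings;

my $upload = 'C:/ftp/incoming/image.jpg';   # hypothetical upload path

# A read-only file is taken to mean the FTP process has finished
# writing it; writable files may still be in flight.
unless ( -w $upload ) {
    # safe to process the image
}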

Creating batch jobs in PowerShell

Imagine a DOS-style .cmd file which is used to launch interdependent windowed applications in the right order.
Example:
1) Launch a server application by calling an exe with parameters.
2) Wait for the server to become initialized (or a fixed amount of time).
3) Launch client application by calling an exe with parameters.
What is the simplest way of accomplishing this kind of batch job in PowerShell?
Remember that PowerShell can access .NET objects. The Start-Sleep suggested by Blair Conrad can be replaced by a call to WaitForInputIdle on the server process, so you know when the server is ready before starting the client.
$sp = get-process server-application
$sp.WaitForInputIdle()
You could also use Process.Start to start the process and have it return the exact Process object. Then you don't need Get-Process.
$sp = [diagnostics.process]::start("server-application", "params")
$sp.WaitForInputIdle()
$cp = [diagnostics.process]::start("client-application", "params")
@Lars Truijens suggested:

"Remember that PowerShell can access .NET objects. The Start-Sleep as suggested by Blair Conrad can be replaced by a call to WaitForInputIdle of the server process so you know when the server is ready before starting the client."

This is more elegant than sleeping for a fixed (or supplied via parameter) amount of time. However, WaitForInputIdle "applies only to processes with a user interface and, therefore, a message loop", so this may not work, depending on the characteristics of launch-server-application. However, as Lars pointed out to me, the question referred to a windowed application (which I missed when I read the question), so his solution is probably best.
To wait 10 seconds between launching the applications, try
launch-server-application serverparam1 serverparam2 ...
Start-Sleep -s 10
launch-client-application clientparam1 clientparam2 clientparam3 ...
If you want to create a script and have the arguments passed in, create a file called runlinkedapps.ps1 (or whatever) with these contents:
launch-server-application $args[0] $args[1]
Start-Sleep -s 10
launch-client-application $args[2] $args[3] $args[4]
Or however you choose to distribute the server and client parameters on the line you use to run runlinkedapps.ps1. If you want, you could even pass in the delay here, instead of hardcoding 10.
Remember, your .ps1 file needs to be on your Path, or you'll have to specify its location when you run it. (Oh, and I've assumed that launch-server-application and launch-client-application are on your Path - if not, you'll need to specify the full path to them as well.)