StreamWriter in PowerShell Outputs Blank File

Here is the code:
$outputPath = "c:\Output"
$scopePath = "$($outputpath)\scopes.csv"
$clientsPath = "$($outputpath)\clients.csv"
$scopeStream = New-Object System.IO.StreamWriter($scopePath)
$clientStream = New-Object System.IO.StreamWriter($clientsPath)
$scopeStream.WriteLine("Scope,ScopeName")
$clientStream.WriteLine("IP,MAC,Lease,Reservation,Hostname")
$scopeStream.Flush()
$clientStream.Flush()
...
$scopeStream.WriteLine("$($scope.Name),$($scope.Address)")
...
$clientStream.WriteLine("$($client.IP),$($client.MAC),$($client.Lease),$($client.Reservation),$($client.Hostname)")
...
$scopeStream.close()
$scopeStream.dispose()
$clientStream.close()
$clientStream.dispose()
The files are created if they don't exist, but nothing is ever written to them. Perfectly blank files, and I can't figure out why, for the life of me.

I've seen this before - usually it's because the StreamWriter is throwing an exception. You'd think that would surface as an error in your script, but I've seen cases where it shows up as a non-terminating error, so your script never receives it.
After running the block of code, try getting the last exception details:
$Error[0].Exception | Format-List * -Force
If there's an exception, it should help you figure out exactly what is causing the problem. My guess would be that it's a permissions issue.
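If it does turn out to be an exception partway through the script, one pattern worth trying is wrapping the writes in try/finally so the stream is always flushed and closed even when something throws. A minimal sketch only, with $scopes standing in for whatever collection the elided part of the script loops over:

$scopeStream = New-Object System.IO.StreamWriter($scopePath)
try {
    $scopeStream.WriteLine("Scope,ScopeName")
    foreach ($scope in $scopes) {
        $scopeStream.WriteLine("$($scope.Name),$($scope.Address)")
    }
}
finally {
    # Close() flushes the buffer and disposes the writer, and it runs even if
    # an exception interrupts the loop above, so buffered lines are not lost.
    $scopeStream.Close()
}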

Related

Better custom error handling for powershell

So I have a PowerShell script that integrates with several external third-party EXE utilities. Each one returns its own kind of errors, and some also write non-error output to stderr (yes, badly designed, I know; I didn't write these utilities). What I'm currently doing is parsing the output of each utility and doing some keyword matching. This approach works, but I suspect that as I use these scripts and utilities I'll keep having to add exceptions for what actually counts as an error. So I need to create something that is expandable, possibly a kind of structure I can keep in an external file like a module.
I was thinking of leveraging the features of a custom PSObject to get this done but I am struggling with the details. Currently my parsing routine for each utility is:
foreach($errtype in @('error','fail','exception'))
{
    if($JobOut -match $errtype){ $Status = 'Failure' }
    elseif($JobOut -match 'Warning'){ $Status = 'Warning' }
    else { $Status = 'Success' }
}
So this looks pretty straightforward until I run into a utility whose output in $JobOut contains one of the keywords in $errtype but is not actually an error. So now I have to add some exceptions to the logic:
foreach($errtype in @('error','fail','exception'))
{
    if($JobOut -match 'error' -and -not ($JobOut -match 'Error Log')){ $Status = 'Failure' }
    elseif($JobOut -match $errtype){ $Status = 'Failure' }
    elseif($JobOut -match 'Warning'){ $Status = 'Warning' }
    else { $Status = 'Success' }
}
So as you can see this method has the potential to get out of control quickly and I would rather not start editing core code to add a new error rule every time I come across a new error.
Is there a way to create a structure of errors for each utility that contains the logic for what counts as an error? Something that would be easy to add new rules to?
Any help with this is really appreciated.
I would think a switch would do nicely here.
It's very basic, but it can be modified easily, it's highly expandable, and I like that you can attach an action to each case of the switch, which could be used for logging or remediation.
Create a function that lets you easily feed input to the switch, maintain that function with all your error codes, keywords, and so on, and then simply use the function wherever it's required; a rough sketch follows after the links below.
TechNet Tips on Switches
TechNet Tips on Functions
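A minimal sketch of that idea, assuming a made-up function name and rule set (none of this is from the question itself):

function Get-JobStatus {
    param([string]$JobOut)

    # Known false positives are stripped first; add one -replace per new exception.
    $cleaned = $JobOut -replace 'Error Log', ''

    # Cases are regexes checked top to bottom; the first match returns a status.
    switch -Regex ($cleaned) {
        'error|fail|exception' { return 'Failure' }
        'warning'              { return 'Warning' }
        default                { return 'Success' }
    }
}

$Status = Get-JobStatus -JobOut $JobOut

Adding a rule for a new utility then means adding one case line (or one -replace for a known false positive) rather than growing a nested if/elseif chain.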

Handling non-stopping errors for Get-ChildItem in PowerShell

I'm currently writing a script which should be used to get the average access/list time for a directory tree on a CIFS share. To do this I'm using the following code (as a snippet):
$time = Measure-Command {
    try {
        $subitems = Get-ChildItem $directory
    } catch {
        $msg = "Error accessing " + $dir + ": " + $_.Exception.Message
    }
}
That piece of code works fine and gets me the information I want. But one issue I'm facing is that there are non-terminating errors from Get-ChildItem which are not caught by the catch block (since they are non-terminating). To catch them I could add -ErrorAction Stop to Get-ChildItem, but then I wouldn't get a listing at all for a directory where even one item throws an error.
Examples of this include missing permissions and paths exceeding 260 characters (for whatever reason that is still a thing). I really would like to get that information in some way to do some further handling/reporting on it. Would anyone know how to catch those/react to those?
My research so far always suggests using -ErrorAction Stop, which would "discard" any information for $subitems that I could use.
So you want to catch the error while the script continues. I have modified your code to redirect the error output and then check the previous command's status to see whether any error occurred.
Is this what you are looking for?
$time = Measure-Command {
    try {
        $subitems = Get-ChildItem $directory 2> $null
        if (-not $?) {
            # whatever action you want to perform
            $msg = $msg + "Error accessing " + $dir + ": " + $error[0].Exception.Message
        }
    } catch {
        $msg = "Error accessing " + $dir + ": " + $_.Exception.Message
    }
}
I am concatenating $msg with itself in the block, so that no message is lost by overwriting.
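A plainly different option (just a sketch, not part of the answer above) is to let Get-ChildItem collect its own non-terminating errors with -ErrorVariable, while -ErrorAction SilentlyContinue keeps the partial listing going:

$time = Measure-Command {
    # SilentlyContinue lets the listing complete; every non-terminating error
    # is collected into $gciErrors instead of being displayed or stopping the command.
    $subitems = Get-ChildItem $directory -ErrorAction SilentlyContinue -ErrorVariable gciErrors

    foreach ($e in $gciErrors) {
        # TargetObject usually holds the path that could not be read
        $msg = $msg + "Error accessing " + $e.TargetObject + ": " + $e.Exception.Message + "; "
    }
}

This way $subitems still contains everything that could be read, and $gciErrors holds one error record per item that failed.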

Using querySelectorAll on an mshtml.HTMLDocumentClass object in PowerShell causes a crash

I'm trying to do some web-scraping via PowerShell, as I've recently discovered it is possible to do so without too much trouble.
A good starting point is to just fetch the HTML, use Get-Member, and see what I can do from there, like so:
$html = Invoke-WebRequest "https://www.google.com"
$html.ParsedHtml | Get-Member
The methods available to me for fetching specific elements appear to be the following:
getElementById()
getElementsByName()
getElementsByTagName()
For example I can get the first IMG tag in the document like so:
$html.ParsedHtml.getElementsByTagName("img")[0]
However, after doing some more research into whether I could use CSS selectors or XPath, I discovered that there are unlisted methods available, since we are just using the HTML Document object documented here:
querySelector()
querySelectorAll()
So instead of doing:
$html.ParsedHtml.getElementsByTagName("img")[0]
I can do:
$html.ParsedHtml.querySelector("img")
So I was expecting to be able to do:
$html.ParsedHtml.querySelectorAll("img")
...in order to get all of the IMG elements. All the documentation I've found and googling I've done supports this. However, in all my testing this function crashes the calling process and reports a heap corruption exception code in the Event Log (0xc0000374).
I'm using PowerShell 5 on Windows 10 x64. I've tried it in a Win10 x64 VM that is a clean build and just patched up. I've also tried it in Win7 x64 upgraded to PowerShell 5. I haven't tried it on anything prior to PowerShell 5 as all our systems here are upgraded, but I probably will once I have time to spool a new vanilla VM for testing.
Has anyone run into this issue before? All my research so far is a dead end. Are there alternatives to querySelectorAll? I need to scrape pages that will have predictable sets of tags inside unpredictable layouts and potentially no IDs or classes assigned to the tags, so I want to be able to use selectors that allow structure/nesting/wildcards.
P.S. I've also tried using the InternetExplorer.Application COM object in PowerShell, the result is the same, except instead of PowerShell crashing Internet Explorer crashes. This was actually my original approach, here's the code:
# create browser object
$ie = New-Object -ComObject InternetExplorer.Application
# make browser visible for debugging, otherwise this isn't necessary for function
$ie.Visible = $true
# browse to page
$ie.Navigate("https://www.google.com")
# wait till browser is not busy
Do { Start-Sleep -m 100 } Until (!$ie.Busy)
# this works
$ie.document.getElementsByTagName("img")[0]
# this works as well
$ie.document.querySelector("img")
# blow it up
$ie.document.querySelectorAll("img")
# we wanna quit the process, but since we blew it up we don't really make it here
$ie.Quit()
Hope I'm not breaking any rules and this post makes sense and is relevant, thanks.
UPDATE
I tested earlier PowerShell versions. v2-v4 crash using the InternetExplorer.Application COM method. v3-4 crash using the Invoke-WebRequest method, v2 doesn't support it.
I ran into this problem, too, and posted about it on reddit. I believe the problem happens when Powershell tries to enumerate the HTML DOM NodeList object returned by querySelectorAll(). The same object is returned by childNodes() which can be enumerated by PS, so I'm guessing there's some glue code written for .ParsedHtml.childNodes but not .ParsedHtml.querySelectorAll(). The crash can be triggered by Intellisense trying to get tab-complete help for the object, too.
I found a way around it, though! Just access the native DOM methods .item() and .length directly and emit the node objects into a PowerShell array. The following code pulls the newest page of posts from /r/Powershell, gets the post list anchors via querySelectorAll() then manually enumerates them using the native DOM methods into a Powershell-native array.
$Result = Invoke-WebRequest -Uri "https://www.reddit.com/r/PowerShell/new/"
$NodeList = $Result.ParsedHtml.querySelectorAll("#siteTable div div p.title a")
$PsNodeList = @()
for ($i = 0; $i -lt $NodeList.Length; $i++) {
    $PsNodeList += $NodeList.item($i)
}
$PsNodeList | ForEach-Object {
    $_.InnerHtml
}
Edit: .Length seems to work capitalized or lower-case. I would have expected the DOM to be case-sensitive, so either there's something going on to help translate or I'm misunderstanding something. Also, the CSS selector is grabbing the source links (self.PowerShell mostly), but that is my CSS selector logic error, not a problem with querySelectorAll(). Note that the results of querySelectorAll() are not live, so modifying them won't modify the original DOM. And I haven't tried modifying them or using their methods yet, but clearly we can grab at the very least .InnerHtml.
Edit 2: Here is a more-generalized wrapper function:
function Get-FixedQuerySelectorAll {
    param (
        $HtmlWro,
        $CssSelector
    )
    # After assignment, $NodeList will crash PowerShell if enumerated in any way, including Intellisense completion while coding!
    $NodeList = $HtmlWro.ParsedHtml.querySelectorAll($CssSelector)
    for ($i = 0; $i -lt $NodeList.length; $i++) {
        Write-Output $NodeList.item($i)
    }
}
$HtmlWro is an HTML Web Response Object, the output of Invoke-WebRequest. I originally tried to pass .ParsedHtml, but then it would crash on assignment. Doing it this way returns the nodes as a PowerShell array.
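For example, reusing the URL and selector from the earlier snippet, a call looks like:

$Result = Invoke-WebRequest -Uri "https://www.reddit.com/r/PowerShell/new/"
$links = Get-FixedQuerySelectorAll -HtmlWro $Result -CssSelector "#siteTable div div p.title a"
$links | ForEach-Object { $_.InnerHtml }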
@midnightfreddie's solution worked fine for me before, but now it throws "Exception from HRESULT: 0x80020101" when calling $NodeList.item($i).
I found the following workaround:
function Invoke-QuerySelectorAll($node, [string] $selector)
{
    $nodeList = $node.querySelectorAll($selector)
    $nodeListType = $nodeList.GetType()
    $result = @()
    for ($i = 0; $i -lt $nodeList.length; $i++)
    {
        $result += $nodeListType.InvokeMember("item", [System.Reflection.BindingFlags]::InvokeMethod, $null, $nodeList, $i)
    }
    return $result
}
This one works for New-Object -ComObject InternetExplorer.Application as well.
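Usage mirrors the earlier wrapper; for instance, with the same page and selector as above (illustrative only):

$Result = Invoke-WebRequest -Uri "https://www.reddit.com/r/PowerShell/new/"
$anchors = Invoke-QuerySelectorAll $Result.ParsedHtml "#siteTable div div p.title a"
$anchors | ForEach-Object { $_.InnerHtml }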

Getting result of .Net object asynchronous method in powershell

I'm trying to call an async method on a .NET object instantiated in PowerShell:
Add-Type -Path 'my.dll'
$myobj = new-object mynamespace.MyObj()
$res = $myobj.MyAsyncMethod("arg").Result
Write-Host "Result : " $res
When executing the script, the shell doesn't seem to wait for MyAsyncMethod().Result and displays nothing, although inspecting the return value indicates it is the correct type (Task<T>). Various other attempts, such as intermediary variables, Wait(), etc. gave no results.
Most of the stuff I found on the web is about asynchronously calling a PowerShell script from C#. I want the reverse, but nobody seems to be interested in doing that. Is that even possible, and if not, why?
I know this is a very old thread, but it might be that you were actually getting an error from the async method but it was being swallowed because you were using .Result.
Try using .GetAwaiter().GetResult() instead of .Result and that will cause any exceptions to be bubbled up.
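Applied to the snippet from the question (MyAsyncMethod being the question's own placeholder), that would be:

$res = $myobj.MyAsyncMethod("arg").GetAwaiter().GetResult()
Write-Host "Result : " $res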
For long running methods, use the PSRunspacedDelegate module, which will enable you to run the task asynchronously:
$task = $myobj.MyAsyncMethod("arg");
$continuation = New-RunspacedDelegate ( [Action[System.Threading.Tasks.Task[object]]] {
    param($t)
    # do something with $t.Result here
} );
$task.ContinueWith($continuation);
See documentation on GitHub. (Disclaimer: I wrote it).
This works for me.
Add-Type -AssemblyName 'System.Net.Http'
$myobj = new-object System.Net.Http.HttpClient
$res = $myobj.GetStringAsync("https://google.com").Result
Write-Host "Result : " $res
Perhaps check that PowerShell is configured to use .NET 4:
How can I run PowerShell with the .NET 4 runtime?
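If it helps, on Windows PowerShell you can check which CLR the current session is running with:

$PSVersionTable.CLRVersion

A value starting with 4 means the .NET 4 runtime is loaded.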

Soap Issue - SoapFault exception: [Client] looks like we got no XML document

I've looked at similar errors and I think it's most likely due to a BOM character, but to be honest most of the other code I've seen is in a different context and I just don't understand it. I'm not that familiar with SOAP and just use it to pull the data, then format it in PHP.
My code is simple:
$activityClient = xpmClient::getModuleInstance('activity', $remoteSessionId, 'xxx.5pmweb.com');
$filter = new stdClass();
$count = 300;
$offset = 0;
$activityList = $activityClient->getList($filter, $offset, $count);
Now the server error shows:
PHP Fatal error: Uncaught SoapFault exception: [Client] looks like we got no XML document in xxx/caching.php:59
Stack trace:
#0 xxx/caching.php(59): SoapClient->__call('getList', Array)
#1 xxx/caching.php(59): xpmClient->getList(Object(stdClass), 0, '371')
#2 /xxx/reports.php(296): include('/xxx/...')
#3 {main}
thrown in /xxx/caching.php on line 59
Line 296 of reports.php is an include for the caching.php file; line 59 of that file is:
$activityList = $activityClient->getList($filter, $offset, $count);
This worked for months without issue, so I'm not sure what changed today. Any ideas how to strip the BOM and still get my data into $activityList as an object so I can access the information?
Edit:
The preg_replace doesn't work. I guess that's because once the call that populates $activityList runs, the server gives a fatal error and doesn't process anything after that, so I'm trying to fix it AFTER it's broken rather than before.
How would I go about using __getLastResponse()?
I've read the manual but don't understand how to structure it. I'm pretty sure I need a try/catch, for the reasons the preg_replace didn't work, but I tried a few variations and it's doing nothing; I'm pretty sure the structure is wrong. Any pointers or ideas?
I don't know why a BOM would cause this, but if you want to strip the BOM, here you go:
function strip_bom( $str ) {
return preg_replace( '/^(\x00\x00\xFE\xFF|\xFF\xFE\x00\x00|\xFE\xFF|\xFF\xFE|\xEF\xBB\xBF)/', "", $str );
}
The SOAP server you are using is broken. Have you tried calling it manually to check?