Is this code a Keylogger? What does it do? - powershell

According to the Windows 10 Task Manager I have a powershell.exe process that is continuously consuming 8% CPU and holding 64 MB of RAM. After inspecting my Windows event log I found a pipeline event (ID 800) with the following code:
Add-Type -AssemblyName System.Core
function Run-Server() {
param([string]$h);
$b = New-Object byte[] 8;
$p = New-Object System.IO.Pipes.AnonymousPipeClientStream -ArgumentList @([System.IO.Pipes.PipeDirection]::In, $h);
if ($p) {
$l = $p.Read($b, 0, 8); while ($l -gt 7) {
$c = [System.BitConverter]::ToInt32($b, 0); $l = [System.BitConverter]::ToInt32($b, 4);
$t = $null; if ($l -gt 0) {
$t1 = New-Object byte[] $l;
$l = $p.Read($t1, 0, $t1.Length);
$t = [System.Text.Encoding]::UTF8.GetString($t1, 0, $l) }
if ($c -eq 1) { Invoke-Expression $t } elseif ($c -eq 9) { break } $l = $p.Read($b, 0, 8) }
$p.Dispose()
}
} Run-Server -h 728
I'm working in a corporate environment and I'm not a PowerShell expert, but it seems the script reads bytes and builds a string out of them? Do you have any idea what this script could be used for? Do you think it could cause the observed 8% CPU and 64 MB RAM usage?

I formatted the code, changed the variable names and added some comments to make it easier to understand:
Add-Type -AssemblyName System.Core
function Run-Server() {
param(
[string]$h
);
$buffer = New-Object byte[] 8;
# Creates an anonymous pipe client
$pipe = New-Object System.IO.Pipes.AnonymousPipeClientStream -ArgumentList @([System.IO.Pipes.PipeDirection]::In, $h);
if ($pipe) {
# Read up to 8 bytes from the pipe
$readBytes = $pipe.Read($buffer,0, 8); #(byte[] buffer, int offset, int count);
# if it managed to read 8 bytes
while ($readBytes -gt 7) {
# Seems the sender is sending some kind of 'command' or instruction.
# If command is '1' means execute the rest as a script
# If command is '9' means terminate
$command = [System.BitConverter]::ToInt32($buffer,0);
# Seems that in position 4 it sends how big the text will be
$textSize = [System.BitConverter]::ToInt32($buffer,4); # ToInt32 (byte[] value, int startIndex);
# based on the $textSize, read the full message and convert it to string ($text)
$text = $null;
if ($textSize -gt 0) {
$text1 = New-Object byte[] $textSize;
$readBytes = $pipe.Read($text1, 0, $text1.Length);
$text = [System.Text.Encoding]::UTF8.GetString($text1, 0, $readBytes)
}
if ($command -eq 1) {
# Scary! execute the text string that came from the pipe
Invoke-Expression $text
}
elseif ($command -eq 9) {
break
}
$readBytes = $pipe.Read($buffer,0, 8)
}
$pipe.Dispose()
}
}
Run-Server -h 728
Info about the pipe: AnonymousPipeClientStream Class
That code creates an In pipe with handle 728 and receives a script from another process, then it executes the script.
Some details:
The first message seems to be a kind of command ($c) and an indication of how big the script will be ($l)
Then it reads a second message of size ($l) and, if command == 1, executes the second message as if it were a PowerShell script: Invoke-Expression $t (scary!)
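To visualize the other half of this protocol, here is a hypothetical sketch (not Snow's actual code) of what the sending side might look like: an anonymous pipe server writes an 8-byte header (an Int32 command plus an Int32 payload length), then the UTF-8 script bytes, then a terminate command. The script text and paths here are made up for illustration.

```powershell
Add-Type -AssemblyName System.Core

# Create the server (Out) end of an anonymous pipe; the client handle is inheritable
$server = New-Object System.IO.Pipes.AnonymousPipeServerStream(
    [System.IO.Pipes.PipeDirection]::Out,
    [System.IO.HandleInheritability]::Inheritable)

# This handle string (e.g. "728") is what would be passed to Run-Server -h
$clientHandle = $server.GetClientHandleAsString()

# Hypothetical payload script
$script = 'Get-Date | Out-String'
$bytes  = [System.Text.Encoding]::UTF8.GetBytes($script)

# Header: command 1 = "execute", followed by the payload length
$header = [byte[]]([System.BitConverter]::GetBytes(1) + [System.BitConverter]::GetBytes($bytes.Length))
$server.Write($header, 0, 8)
$server.Write($bytes, 0, $bytes.Length)

# Header with command 9 = "terminate" (length 0)
$term = [byte[]]([System.BitConverter]::GetBytes(9) + [System.BitConverter]::GetBytes(0))
$server.Write($term, 0, 8)
$server.Dispose()
```

The receiving script from the question would read the first 8 bytes, decode the command and length, read that many payload bytes, and Invoke-Expression the result.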

Folks, I'm from Snow Software and can confirm that this is legitimate code executed by the Snow Inventory Agent to run PowerShell scripts that are deployed with the agent for gathering more advanced information about the device and certain apps installed on it. It does indeed open the anonymous pipe and send the PowerShell code as text, sourced from the encrypted script files that are deployed together with the agent. The gathered data is used by Snow's Software and Technology Asset Management product suite, which is deployed by large organizations to optimize technology spend and gain visibility into, and manageability of, their technology assets.
Let me know if you have more questions!

I happened to run into the same issue. After some digging through my system (grep), I found that the offending code occurs in an executable called 'snowagent.exe'. As far as I can tell it is used by our company IT department to take an inventory of the applications installed on my machine, and maybe more.
As such, I conclude that it is at least not a big issue (virus/malware). Still, when I am hampered by it (i.e. it's eating 13% CPU), I just kill it.
gr M.

Related

Parsing Large Text Files Eventually Leading to Memory and Performance Issues

I'm attempting to work with large text files (500 MB - 2+ GB) that contain multi-line events and send them out via syslog. The script I have so far seems to work well for quite a while, but eventually it causes ISE (64-bit) to stop responding and use up all system memory.
I'm also curious if there's a way to improve the speed as the current script only sends to syslog at about 300 events per second.
Example Data
START--random stuff here
more random stuff on this new line
more stuff and things
START--some random things
additional random things
blah blah
START--data data more data
START--things
blah data
Code
Function SendSyslogEvent {
$Server = '1.1.1.1'
$Message = $global:Event
#0=EMERG 1=Alert 2=CRIT 3=ERR 4=WARNING 5=NOTICE 6=INFO 7=DEBUG
$Severity = '10'
#(16-23)=LOCAL0-LOCAL7
$Facility = '22'
$Hostname= 'ServerSyslogEvents'
# Create a UDP Client Object
$UDPCLient = New-Object System.Net.Sockets.UdpClient
$UDPCLient.Connect($Server, 514)
# Calculate the priority
$Priority = ([int]$Facility * 8) + [int]$Severity
#Time format the SW syslog understands
$Timestamp = Get-Date -Format "MMM dd HH:mm:ss"
# Assemble the full syslog formatted message
$FullSyslogMessage = "<{0}>{1} {2} {3}" -f $Priority, $Timestamp, $Hostname, $Message
# create an ASCII Encoding object
$Encoding = [System.Text.Encoding]::ASCII
# Convert into byte array representation
$ByteSyslogMessage = $Encoding.GetBytes($FullSyslogMessage)
# Send the Message
$UDPCLient.Send($ByteSyslogMessage, $ByteSyslogMessage.Length) | out-null
}
$LogFiles = Get-ChildItem -Path E:\Unzipped\
foreach ($File in $LogFiles){
$EventCount = 0
$global:Event = ''
switch -Regex -File $File.fullname {
'^START--' { #Regex to find events
if ($global:Event) {
# send previous events' lines to syslog
write-host "Send event to syslog........................."
$EventCount ++
SendSyslogEvent
}
# Current line is the start of a new event.
$global:Event = $_
}
default {
# Event-interior line, append it.
$global:Event += [Environment]::NewLine + $_
}
}
# Process last block.
if ($global:Event) {
# send last event's lines to syslog
write-host "Send last event to syslog-------------------------"
$EventCount ++
SendSyslogEvent
}
}
There are a couple of really bad things in your script, but before we get to those, let's have a look at how you can parameterize your syslog function.
Parameterize your functions
Scriptblocks and functions in PowerShell support optionally typed parameter declarations in the aptly named param block.
For the purposes of this answer, let's focus exclusively on the only thing that ever changes when you invoke the current function, namely the message. If we turn that into a parameter, we'll end up with a function definition that looks more like this:
function Send-SyslogEvent {
param(
[string]$Message
)
$Server = '1.1.1.1'
$Severity = '10'
$Facility = '22'
# ... rest of the function here
}
(I took the liberty of renaming it to PowerShell's characteristic Verb-Noun command naming convention).
There's a small performance-benefit to using parameters rather than global variables, but the real benefit here is that you're going to end up with clean and correct code, which will save you a headache for the rest.
IDisposables
.NET is a "managed" runtime, meaning that we don't usually need to worry about resource management (allocating and freeing memory, for example), but there are a few cases where we have to manage resources that are external to the runtime - such as the network socket used by a UdpClient object, for example :)
Types that depend on these kinds of external resources usually implement the IDisposable interface, and the golden rule here is:
Whoever creates a new IDisposable object should also dispose of it as soon as possible, preferably at the latest when exiting the scope in which it was created.
So, when you create a new instance of UDPClient inside Send-SyslogEvent, you should also ensure that you always call $UDPClient.Dispose() before returning from Send-SyslogEvent. We can do that with a set of try/finally blocks:
function Send-SyslogEvent {
param(
[string]$Message
)
$Server = '1.1.1.1'
$Severity = '10'
$Facility = '22'
$Hostname= 'ServerSyslogEvents'
try{
$UDPClient = New-Object System.Net.Sockets.UdpClient
$UDPClient.Connect($Server, 514)
$Priority = ([int]$Facility * 8) + [int]$Severity
$Timestamp = Get-Date -Format "MMM dd HH:mm:ss"
$FullSyslogMessage = "<{0}>{1} {2} {3}" -f $Priority, $Timestamp, $Hostname, $Message
$Encoding = [System.Text.Encoding]::ASCII
$ByteSyslogMessage = $Encoding.GetBytes($FullSyslogMessage)
$UDPClient.Send($ByteSyslogMessage, $ByteSyslogMessage.Length) | Out-Null
}
finally {
# this is the important part
if($UDPClient){
$UDPClient.Dispose()
}
}
}
Failing to dispose of IDisposable objects is one of the surest ways to leak memory and cause resource contention in the operating system you're running on, so this is definitely a must, especially for performance-sensitive or frequently invoked code.
Re-use instances!
Now, I showed above how you should handle disposal of the UDPClient, but another thing you can do is re-use the same client - you'll be connecting to the same syslog host every single time anyway!
function Send-SyslogEvent {
param(
[Parameter(Mandatory = $true)]
[string]$Message,
[Parameter(Mandatory = $false)]
[System.Net.Sockets.UdpClient]$Client
)
$Server = '1.1.1.1'
$Severity = '10'
$Facility = '22'
$Hostname= 'ServerSyslogEvents'
try{
# check if an already connected UDPClient object was passed
if($PSBoundParameters.ContainsKey('Client') -and $Client.Client.Connected){
$UDPClient = $Client
$borrowedClient = $true
}
else{
$UDPClient = New-Object System.Net.Sockets.UdpClient
$UDPClient.Connect($Server, 514)
}
$Priority = ([int]$Facility * 8) + [int]$Severity
$Timestamp = Get-Date -Format "MMM dd HH:mm:ss"
$FullSyslogMessage = "<{0}>{1} {2} {3}" -f $Priority, $Timestamp, $Hostname, $Message
$Encoding = [System.Text.Encoding]::ASCII
$ByteSyslogMessage = $Encoding.GetBytes($FullSyslogMessage)
$UDPClient.Send($ByteSyslogMessage, $ByteSyslogMessage.Length) | Out-Null
}
finally {
# this is the important part
# if we "borrowed" the client from the caller we won't dispose of it
if($UDPClient -and -not $borrowedClient){
$UDPClient.Dispose()
}
}
}
This last modification will allow us to create the UDPClient once and re-use it over and over again:
# ...
$SyslogClient = New-Object System.Net.Sockets.UdpClient
$SyslogClient.Connect($SyslogServer, 514)
foreach($file in $LogFiles)
{
# ... assign the relevant output from the logs to $message, or pass $_ directly:
Send-SyslogEvent -Message $message -Client $SyslogClient
# ...
}
Use a StreamReader instead of a switch!
Finally, if you want to minimize allocations while slurping the files, you can use File.OpenText() to create a StreamReader and read each file line by line:
$SyslogClient = New-Object System.Net.Sockets.UdpClient
$SyslogClient.Connect($SyslogServer, 514)
foreach($File in $LogFiles)
{
try{
$reader = [System.IO.File]::OpenText($File.FullName)
$msg = ''
while($null -ne ($line = $reader.ReadLine()))
{
if($line.StartsWith('START--'))
{
if($msg){
Send-SyslogEvent -Message $msg -Client $SyslogClient
}
$msg = $line
}
else
{
$msg = $msg,$line -join [System.Environment]::NewLine
}
}
if($msg){
# last block
Send-SyslogEvent -Message $msg -Client $SyslogClient
}
}
finally{
# Same as with UDPClient, remember to dispose of the reader.
if($reader){
$reader.Dispose()
}
}
}
This is likely going to be faster than the switch, although I doubt you'll see much improvement in the memory footprint.
Inspecting types for IDisposable
You can test if an object implements IDisposable with the -is operator:
PS C:\> $reader -is [System.IDisposable]
True
Or using Type.GetInterfaces(), as suggested by TheIncorrigible1:
PS C:\> [System.Net.Sockets.UdpClient].GetInterfaces()
IsPublic IsSerial Name
-------- -------- ----
True False IDisposable
I hope the above helps!
Here's an example of a way to switch over a file one line at a time.
get-content file.log | foreach {
switch -regex ($_) {
'^START--' { "start line is $_"}
default { "line is $_" }
}
}
Actually, I don't think switch -file is a problem. It seems to be optimized not to use too much memory, according to "ps powershell" in another window. I tried it with a one-gigabyte file.

Use Powershell to delete Solr cores beginning with a prefix

Is there a way to use Powershell 2.0 to automate deletion of Solr cores that begin with a given prefix? For example, I would like to delete all cores that begin with "some_prefix". I am using Solr 4.10.1, and want to delete cores with the following API call:
http://localhost:8983/solr/admin/cores?action=UNLOAD&deleteIndex=true&deleteInstanceDir=true&core=some_prefix_etc
This script will work:
param ($prefix = $(throw "-prefix is required"))
$client = (New-Object System.Net.WebClient)
[xml]$coresXML = $client.DownloadString("http://localhost:8983/solr/admin/cores")
$cores = $coresXML.response.lst[2].lst | % {$_.name}
$success = 0
# note: $error is a reserved automatic variable in PowerShell, so use a different name
$failures = 0
foreach ($core in $cores) {
if ($core.StartsWith($prefix)) {
$url = "http://localhost:8983/solr/admin/cores?action=UNLOAD&deleteIndex=true&deleteInstanceDir=true&core=$core"
write-host "Deleting $core :"
$client.DownloadString($url)
if ($?) {$success++}
else {$failures++}
}
}
write-host "Deleted $success cores. Had $failures errors."
See this on how the syntax to extract cores to a list works, and this on the Solr UNLOAD API options for deleting a core.
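The question asks about PowerShell 2.0, where WebClient is the right tool; but as a hedged alternative for PowerShell 3.0 or later, Invoke-RestMethod can parse Solr's JSON response for you. This sketch assumes the same default Solr URL and the standard shape of the cores STATUS response (core names as property names under `status`):

```powershell
param([Parameter(Mandatory = $true)][string]$Prefix)

$base = 'http://localhost:8983/solr/admin/cores'
# wt=json asks Solr for JSON, which Invoke-RestMethod parses into objects
$status = Invoke-RestMethod "${base}?action=STATUS&wt=json"
$cores  = $status.status.PSObject.Properties.Name

foreach ($core in $cores) {
    if ($core.StartsWith($Prefix)) {
        Write-Host "Deleting $core"
        Invoke-RestMethod "${base}?action=UNLOAD&deleteIndex=true&deleteInstanceDir=true&core=$core" | Out-Null
    }
}
```

This avoids the XML indexing (`$coresXML.response.lst[2].lst`), which is brittle if Solr changes the order of the response elements.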

Pass information between independently running Powershell scripts

Sorry for being vague before. I'll try again:
The circumstances are too complicated to explain, but basically the problem is:
how to pass a string (max 20 chars) from one script, to another script running on the same machine.
the two scripts are running continuously in the background, on the same machine, under the same user context,
but can not be combined.
I can not dot-source one script in the other.
I have done it by having one script create a file with the string in it, in a directory monitored by the other script. So when it appears, the other script reads the information.
It works, but it feels "dirty". :)
I was wondering if there is a "best practice"-way to pass information between scripts or at least a more elegant way.
Thanks.
There are several ways for enabling two continuously running processes on the same host to communicate with each other, for instance named pipes:
# named pipe - server
$name = 'foo'
$namedPipe = New-Object IO.Pipes.NamedPipeServerStream($name, 'Out')
$namedPipe.WaitForConnection()
$script:writer = New-Object IO.StreamWriter($namedPipe)
$writer.AutoFlush = $true
$writer.WriteLine('something')
$writer.Dispose()
$namedPipe.Dispose()
# named pipe - client
$name = 'foo'
$namedPipe = New-Object IO.Pipes.NamedPipeClientStream('.', $name, 'In')
$namedPipe.Connect()
$script:reader = New-Object IO.StreamReader($namedPipe)
$reader.ReadLine()
$reader.Dispose()
$namedPipe.Dispose()
or TCP sockets:
# TCP socket - server
$addr = [ipaddress]'127.0.0.1'
$port = 1234
$endpoint = New-Object Net.IPEndPoint ($addr, $port)
$server = New-Object Net.Sockets.TcpListener $endpoint
$server.Start()
$cn = $server.AcceptTcpClient()
$stream = $cn.GetStream()
$writer = New-Object IO.StreamWriter($stream)
$writer.WriteLine('something')
$writer.Dispose()
$server.Stop()
# TCP socket - client
$server = '127.0.0.1'
$port = 1234
$client = New-Object Net.Sockets.TcpClient
$client.Connect($server, $port)
$stream = $client.GetStream()
$reader = New-Object IO.StreamReader($stream)
$reader.ReadLine()
$reader.Dispose()
$client.Dispose()
The easy solution would be to make the second script a function, then dot source it and just capture the return value into a variable.
I'm not entirely sure if this is possible otherwise, you could try $global:variableName or possibly run something as a job. If none of that works you could make the second script store the result in a file then access that file from the first script.
While your question is not very clear, I'll try to answer it.
It seems like you want to continue processing on your first script while your second script is doing (something).
You can put a specific operation in a Job (sort of like a thread) and receive its output later: use a while ($true) loop, check for whatever conditions meet your needs, then break out of the loop after receiving the results from that Job.
Take a look at Get-Help Start-Job for more info on that, or try Google.
You can also import user-defined functions from another script by doing Import-Module '.\pathtoscriptmodule.psm1', or just dot-sourcing .\Pathtoscriptdefiningfunctions.ps1, to pull in functions you want to use from an outside script file.
Example of using Start-Job:
$scriptblock = {
param($myParam);
# My commands here
}
Start-Job -ScriptBlock $scriptblock -args $myParamsIWantToPassToScriptblock
While ($true){
# To view the status of the background job
Get-Job *
# Add your logic for checking your conditions here;
# $conditionsMet is a placeholder for your own test
if ($conditionsMet)
{
break
}
}
# Gets the output from that job. -Keep keeps the output in memory
# if you want to call it multiple times
Receive-Job * -Keep
You could look into using runspaces.
http://learn-powershell.net/2013/04/19/sharing-variables-and-live-objects-between-powershell-runspaces/
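As a minimal sketch of the runspace approach from that link: a synchronized hashtable set as a variable in a second runspace can be read and written from both sides. The variable names here are illustrative.

```powershell
# Create a synchronized hashtable both runspaces can safely touch
$shared = [hashtable]::Synchronized(@{ Message = '' })

# Create and open a second runspace, exposing $shared inside it
$rs = [runspacefactory]::CreateRunspace()
$rs.Open()
$rs.SessionStateProxy.SetVariable('shared', $shared)

# Run a script in that runspace which writes into the shared table
$ps = [powershell]::Create()
$ps.Runspace = $rs
[void]$ps.AddScript({ $shared.Message = 'hello from the other runspace' })
$ps.Invoke()   # synchronous for simplicity; use BeginInvoke() to run in parallel

$shared.Message   # -> hello from the other runspace

$ps.Dispose()
$rs.Dispose()
```

For two fully independent script processes (as in the question), the named pipe or TCP examples above are the better fit; runspaces share one process.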

Understanding performance impact of "write-output"

I'm writing a Powershell script (PS version 4) to parse and process IIS log files, and I've come across an issue I don't quite understand: write-output seems to add significant processing time to the script. The core of it is this (there is more, but this demonstrates the issue):
$file = $args[0]
$linecount = 0
$start = [DateTime]::Now
$reader = [IO.File]::OpenText($file)
while ($reader.Peek() -ge 0) {
$line = $reader.ReadLine()
$linecount++
if (0 -eq ($linecount % 10000)) {
$batch = [DateTime]::Now
[Console]::Error.WriteLine(" Processed $linecount lines ($file) in $(($batch - $start).TotalMilliseconds)ms")
$start = $batch
}
$parts = $line.split(' ')
$out = "$file,$($parts[0]) $($parts[1]),$($parts[2]),$($parts[3]),$($parts[4]),$($parts[5]),$($parts[6]),$($parts[7])"
## Send the output out - comment in/out the desired output method
## Direct output - roughly 10,000 lines / 880ms
$out
## Via write-output - roughly 10,000 lines / 1500ms
write-output $out
}
$reader.Close()
Invoked as .\script.ps1 {path_to_340,000_line_IIS_log} > bob.txt; progress/performance timings are given on stderr.
The script above has two output lines - the write-output one is reporting 10,000 lines every 1500ms, whereas the line that does not have write-output takes as good as half that, averaging about 880ms per 10,000 lines.
I thought that output defaulted to write-output if nothing else consumed it (i.e., I thought that "bob" was equivalent to write-output "bob"), but the timings I'm getting argue against this.
What am I missing here?
Just a guess, but:
Looking at the help on write-output
Write-Output [-InputObject] <PSObject[]> [-NoEnumerate] [<CommonParameters>]
You're giving it a list of objects as an argument, so it has to spend a little time assembling them into an array internally before it does the write, whereas simply outputting them streams them to the pipeline immediately. You could pipe them to Write-Output instead, but that adds another pipeline, which might be even worse.
Edit
In addition, you'll find that it's only adding about 0.062 ms per operation ((1500 - 880) / 10000). You have to scale that up to very large data sets before it becomes noticeable.

PowerShell: select string in standard output?

PS C:\squid\sbin> .\squid.exe -v
Squid Cache: Version 2.7.STABLE8
configure options: --enable-win32-service --enable-storeio='ufs aufs null coss' --enable-default-hostsfile=none --enable
-removal-policies='heap lru' --enable-snmp --enable-htcp --disable-wccp --disable-wccpv2 --enable-useragent-log --enable
-referer-log --enable-cache-digests --enable-auth='basic ntlm digest negotiate' --enable-basic-auth-helpers='LDAP NCSA m
swin_sspi squid_radius_auth' --enable-negotiate-auth-helpers=mswin_sspi --enable-ntlm-auth-helpers='mswin_sspi fakeauth'
--enable-external-acl-helpers='mswin_ad_group mswin_lm_group ldap_group' --enable-large-cache-files --enable-digest-aut
h-helpers='password LDAP eDirectory' --enable-forw-via-db --enable-follow-x-forwarded-for --enable-arp-acl --prefix=c:/s
quid
Compiled as Windows System Service.
PS C:\squid\sbin> .\squid.exe -v|Select-String Squid
squid.exe -v outputs its version information, which contains the keyword "Squid".
I want PowerShell to tell me whether the keyword "Squid" exists in the output, so I use .\squid.exe -v|Select-String Squid, but it outputs nothing.
What's the right way to do it? I'm using PS 3.0.
You ARE doing it the right way :)
The problem is not your code but the squid port itself. It's doing something weird when writing text to the console, such that PowerShell and cmd can't capture it through the stdout/stderr streams. I'm guessing that instead of using the stdout/stderr API it may be manipulating characters on the console directly, or something similar. I tried redirecting stderr to stdout (2>&1) but that didn't work either.
It comes with a changelog text file; I guess you can just parse that instead...
EDIT --
Or you can use this kludgy but serviceable workaround to scrape the console text:
function Get-ConsoleText {
if ($host.Name -eq 'ConsoleHost') {
$text_builder = new-object system.text.stringbuilder
$buffer_width = $host.ui.rawui.BufferSize.Width
$buffer_height = $host.ui.rawui.CursorPosition.Y
$rec = new-object System.Management.Automation.Host.Rectangle 0,0,($buffer_width -2), $buffer_height
$buffer = $host.ui.rawui.GetBufferContents($rec)
$console_out = @()
for($i = 0; $i -lt $buffer_height; $i++) {
$text_builder = new-object system.text.stringbuilder
for($j = 0; $j -lt $buffer_width; $j++) {
$cell = $buffer[$i,$j]
$text_builder.Append($cell.Character) | Out-Null
}
$console_out += $text_builder.ToString()
}
return $console_out
}
}
cls; .\squid.exe -v; Get-ConsoleText |
ForEach-Object {
if ($_ -match 'Version (.+)') {$matches[1]}
}