Consolidate 2 PowerShell scripts into one

I have two PowerShell scripts running at the same time on my PC doing a similar job. Each one looks for a string pattern in a log file and then writes a text string to a text file. This updates dynamically, so each time the string changes in the log file, it's updated in the output text file. All works great.
I would like to consolidate both scripts into one PowerShell script to make things more efficient. Hope this makes sense. Below is an example of one of the scripts. Both scripts are the same except that the $output file name and the $patterns are different. I'd like each string to go to a CSV file.
$log = "LogFile.txt"
$output = "ExampleOutput.txt"
$patterns = #("GroupEvents: Linedup for group (.*), subgroup",
"Workout(.*)",
"GroupEvents: Started in group (.*), subgroup",
"Run Device Selected: (.*)")
function ProcessLog {
[CmdletBinding()] Param([Parameter(ValueFromPipeline)]$item)
Write-Output "Running with $item"
foreach($pattern in $patterns)
{
$save = (Select-String -InputObject $item -Pattern $pattern | % {
$_.Matches.Groups[1].Value })
if(-not [string]::IsNullOrWhitespace($save))
{
$save.Trim('(').Trim(')') | Set-Content $output
return
}
}
}
Get-Content -Tail 0 -Wait -Encoding "UTF8" $log | % { $_ | ProcessLog }
I realise this may not be as simple as it sounds, and any other ideas are welcome. Perhaps each individual string could be written to a different column of a comma-separated CSV file.
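For illustration, here is a rough sketch of the kind of consolidation I have in mind. The column names Job1/Job2 and the single combined CSV are just placeholders, and I'm assuming both scripts read the same log file:

$log = "LogFile.txt"
$csv = "ExampleOutput.csv"   # hypothetical combined output file

# One entry per original script: a column name plus that script's patterns.
$jobs = @(
    @{ Name = "Job1"; Patterns = @("GroupEvents: Linedup for group (.*), subgroup", "Workout(.*)") },
    @{ Name = "Job2"; Patterns = @("GroupEvents: Started in group (.*), subgroup", "Run Device Selected: (.*)") }
)

# Most recent match per job, so the CSV always holds exactly one row.
$latest = @{}

function ProcessLog {
    [CmdletBinding()] Param([Parameter(ValueFromPipeline)]$item)
    $changed = $false
    foreach ($job in $jobs) {
        foreach ($pattern in $job.Patterns) {
            $save = Select-String -InputObject $item -Pattern $pattern |
                ForEach-Object { $_.Matches.Groups[1].Value }
            if (-not [string]::IsNullOrWhiteSpace($save)) {
                $latest[$job.Name] = $save.Trim('(').Trim(')')
                $changed = $true
                break   # first matching pattern wins for this job, like the original
            }
        }
    }
    if ($changed) {
        # rewrite the one-row CSV with the latest value for every job
        [pscustomobject]$latest | Export-Csv $csv -NoTypeInformation
    }
}

Get-Content -Tail 0 -Wait -Encoding "UTF8" $log | ForEach-Object { $_ | ProcessLog }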

Related

How can I write a script that will read other scripts and record their processes?

I have a folder of scripts that are being used by my company, and I need to know what each script does. I am trying to write a script in PowerShell that will record what each script does into a CSV file.
I am a beginner in PowerShell and am still learning, so I apologize if I am being unclear.
I know that each of these scripts' basic function is to map drives to a user's computer, but there are too many to go through manually; any advice would be appreciated!
EDIT: Most of them are .bat files, with a couple of .vbs too. I want to record which drives are being mapped.
EDIT 2: I have now written my own script that looks like this:
Set-Location z:\
Get-ChildItem "z:\Test" |
Foreach-Object {
    $filename = $_.FullName
    Get-Content $filename |
    Foreach-Object {
        if ($_ -match "echo off") {
            Write-Output "$($filename): $_" | Select-Object $_ $filename |
                Export-Csv "test.csv" -NoTypeInformation
        }
    }
}
I am having trouble exporting the data into a CSV file, as I get the error "A positional parameter cannot be found that accepts argument 'z:\Test\Test1.bat'".
The easiest way will be string parsing: look for the commands that map the drives. That's net use for bat files, or MapNetworkDrive in VBS. So look for those lines.
This will look through all the files in a folder and output the filename and the content of the line wherever it finds those lines:
Get-ChildItem "C:\Scripts" |
Foreach-Object {
$filename = $_.FullName
Get-Content $filename |
Foreach-Object {
if ($_ -match "net use" -or $_ -match "MapNetworkDrive") {
Write-Output "$($filename): $_"
}
}
}
That will not likely be exactly what you need, but it should get you started.
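If you also want the output in a CSV rather than plain text, one option (just a sketch, reusing the z:\Test path and test.csv name from your edit) is to emit an object per matching line and pipe the whole loop into Export-Csv once, instead of calling it inside the loop:

Get-ChildItem "z:\Test" |
Foreach-Object {
    $filename = $_.FullName
    Get-Content $filename |
    Foreach-Object {
        if ($_ -match "net use" -or $_ -match "MapNetworkDrive") {
            # one object per matching line; its properties become the CSV columns
            [pscustomobject]@{ File = $filename; Line = $_ }
        }
    }
} | Export-Csv "test.csv" -NoTypeInformation

The [pscustomobject] syntax needs PowerShell 3.0 or later; on older versions New-Object PSObject -Property works the same way.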

Copy specific lines from a text file to separate file using powershell

I am trying to get all the lines from an input file that start with %% and copy them into an output file using PowerShell.
I used the following code; however, I am only getting the last line starting with %% in the output file, instead of all the lines starting with %%.
I have only started to learn PowerShell, so please help.
$Clause = Get-Content "Input File location"
$Outvalue = $Clause | Foreach {
    if ($_ -ilike "*%%*")
    {
        Set-Content "Output file location" $_
    }
}
You are looping over the lines in the file, and setting each one as the whole content of the file, overwriting the previous file each time.
You need to either switch to using Add-Content instead of Set-Content, which will append to the file, or change the design to:
Get-Content "input.txt" | Foreach-Object {
if ($_ -like "%%*")
{
$_ # just putting this on its own, sends it on out of the pipeline
}
} | Set-Content Output.txt
Which you would more typically write as:
Get-Content "input.txt" | Where-Object { $_ -like "%%*" } | Set-Content Output.txt
and in the shell, you might write as
gc input.txt |? {$_ -like "%%*"} | sc output.txt
Here the whole file is filtered, and then all the matching lines are sent into Set-Content in one go, rather than calling Set-Content individually for each line.
NB. PowerShell is case insensitive by default, so -like and -ilike behave the same.
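For completeness, the Add-Content variant mentioned above would look roughly like this (same placeholder paths as in the question, and using the anchored "%%*" match rather than "*%%*"):

$Clause = Get-Content "Input File location"
$Clause | Foreach-Object {
    if ($_ -like "%%*") {
        # Add-Content appends, so earlier matches are no longer overwritten
        Add-Content "Output file location" $_
    }
}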
For a small file, Get-Content is fine. But if you start doing this on heavier files, Get-Content will eat your memory and leave you hanging.
Keeping it really simple for other PowerShell starters out there, you'll be better covered (and get better performance) with the raw .NET stream reader and writer. Something like this would do the job:
$inputfile = "C:\Users\JohnnyC\Desktop\inputfile.txt"
$outputfile = "C:\Users\JohnnyC\Desktop\outputfile.txt"
$reader = [io.file]::OpenText($inputfile)
$writer = [io.file]::CreateText($outputfile)
while ($reader.EndOfStream -ne $true) {
    $line = $reader.ReadLine()
    if ($line -like '%%*') {
        $writer.WriteLine($line);
    }
}
$writer.Dispose();
$reader.Dispose();

Powershell INI editing

I want to change some values in an INI file. Unfortunately, I have keys in 2 different sections which share an identical name but need different values. My code uses the Get-IniContent function from PsIni.
Example INI file:
[PosScreen]
BitmapFile=C:\Temp\Random.bmp
Bitmap=1
[ControlScreen]
BitmapFile=C:\Temp\Random.bmp
Bitmap=1
I need to change the above to the following:
[PosScreen]
BitmapFile=C:\Temp\FileC.bmp
Bitmap=1
[ControlScreen]
BitmapFile=C:\Temp\FileD.bmp
Bitmap=1
The PowerShell code I am using runs, but it changes every value to "FileD". It is obviously processing the whole file twice, and the key name is the same in each section.
$NewFileC = "C:\Temp\FileC.bmp"
$NewFileD = "C:\Temp\FileD.bmp"
$POSIniContent = Get-IniContent "C:\scripts\Update-EnablerImage\WINSUITE.INI"
$BOIniContent = Get-IniContent "C:\scripts\Update-EnablerImage\WINSUITE.INI"
If ($POSIniContent["PosScreen"]["BitmapFile"] -ne $NewFileC) {
Get-Content "C:\scripts\Update-EnablerImage\WINSUITE.INI" |
ForEach-Object {$_ -replace "BitmapFile=.+" , "BitmapFile=$NewFileC" } |
Set-Content "C:\scripts\Update-EnablerImage\WINSUITE.INI"
}
If ($BOIniContent["ControlScreen"]["BitmapFile"] -ne $NewFileD) {
Get-Content "C:\scripts\Update-EnablerImage\WINSUITE.INI" |
ForEach-Object {$_ -replace "BitmapFile=.+" , "BitmapFile=$NewFileD" } |
Set-Content "C:\scripts\Update-EnablerImage\WINSUITE.INI"
}
My struggle is how to change each one independently. I'm a bit of a scripting newbie, so I'm calling out for some help. I tried using conditional logic (ForEach $line in $INIFile, for example), but had no luck with that.
You are overcomplicating things. You can use Get-IniContent and Out-IniFile as follows:
$ini = Get-IniContent c:\temp\ini.ini
$ini["posscreen"]["BitmapFile"] = "C:\Temp\FileC.bmp"
$ini | Out-IniFile -FilePath c:\temp\ini2.ini
Note that if you want to overwrite the original file, you must add -Force to the Out-IniFile call.
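Applied to the file from the question, that becomes roughly the following (a sketch; -Force lets Out-IniFile overwrite WINSUITE.INI in place):

$ini = Get-IniContent "C:\scripts\Update-EnablerImage\WINSUITE.INI"
# each section is addressed by name, so the duplicate BitmapFile keys don't collide
$ini["PosScreen"]["BitmapFile"]     = "C:\Temp\FileC.bmp"
$ini["ControlScreen"]["BitmapFile"] = "C:\Temp\FileD.bmp"
$ini | Out-IniFile -FilePath "C:\scripts\Update-EnablerImage\WINSUITE.INI" -Force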

Powershell. Writing out lines based on string within the file

I'm looking for a way to export all lines from within a text file where part of the line matches a certain string. The string is actually the first 4 bytes of the line, and I'd like to keep the check to only those bytes, not the entire row; I still want to write out the entire row. How would I go about this?
I am using Windows only and don't have the option to use many other tools that might do this.
Thanks in advance for any help.
Do you want to perform a simple "grep"? Then try this
select-string .\test.txt -pattern "\Athat" | foreach {$_.Line}
or this (very similar regex), also writes to an outfile
select-string .\test.txt -pattern "^that" | foreach {$_.Line} | out-file -filepath out.txt
This assumes that you want to search for the 4-byte string "that" at the beginning of the string or the beginning of the line, respectively.
Something like the following Powershell function should work for you:
function Get-Lines {
    [cmdletbinding()]
    param(
        [string]$filename,
        [string]$prefix
    )
    if ( Test-Path -Path $filename -PathType Leaf -ErrorAction SilentlyContinue ) {
        # filename exists, and is a file
        $lines = Get-Content $filename
        foreach ( $line in $lines ) {
            if ( $line -like "$prefix*" ) {
                $line
            }
        }
    }
}
To use it, assuming you save it as get-lines.ps1, you would load the function into memory with:
. .\get-lines.ps1
and then to use it, you could search for all lines starting with "DATA" with something like:
get-lines -filename C:\Files\Datafile\testfile.dat -prefix "DATA"
If you need to save it to another file for viewing later, you could do something like:
get-lines -filename C:\Files\Datafile\testfile.dat -prefix "DATA" | out-file -FilePath results.txt
Or, if I were more awake, you could ignore the script above and use a simpler solution, such as the following one-liner:
get-content -path C:\Files\Datafile\testfile.dat | select-string -Pattern "^DATA"
Which just uses the ^ regex character to make sure it's only looking for "DATA" at the beginning of each line.
To get all the lines from c:\somedir\somefile.txt that begin with 'abcd' :
(get-content c:\somedir\somefile.txt) -like 'abcd*'
provided c:\somedir\somefile.txt is not an unusually large (hundreds of MB) file. For that situation:
get-content c:\somedir\somefile.txt -readcount 1000 |
foreach {$_ -like 'abcd*'}
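Either form can be sent straight to a new file by piping it to Set-Content; the output path here is just an example:

(get-content c:\somedir\somefile.txt) -like 'abcd*' | set-content c:\somedir\matches.txt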

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem and haven't found a helpful resource yet. My biggest concern is that it won't work in PowerShell at all... At the moment I'm trying to build a small PowerShell tool to save me a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about a host system (e.g. hardware ID, product key and the like) and generates *.txt files.
My point is, if you have 20, 30 or 80 servers in a project, it takes a huge amount of time to browse all the files, look for just the lines you need, and put them together in a *.csv file.
What I have working is more like the skeleton of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the keywords themselves, not the values that follow them, as seen below:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how I can tell PowerShell to find the "Product Type:" line and pick up the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem, which is why I can't simply search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the PowerShell each time. The code looks like this:
Function getStringMatch
{
    # Loop through the project directory
    Foreach ($file In $files)
    {
        # Check all keywords
        ForEach ($control In $controls)
        {
            $result = Get-Content $file.FullName | Select-String $control -quiet -casesensitive
            If ($result -eq $True)
            {
                $match = $file.FullName
                # Write the filename according to the entry
                "Found : $control in: $match" | Out-File $output -Append
            }
        }
    }
}

getStringMatch
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -quiet option; this returns a match object, one of whose properties is the matching line. I then split the line on the ':' and trim any spaces. These results are placed into a new PSObject, which in turn is added to an array. The array is then put back on the pipeline at the end.
I also moved the call to get-content to avoid reading each file more than once.
# Create an array for results
$results = @()

# Loop through the project directory
Foreach ($file In $files)
{
    # load the content once
    $content = Get-Content $file.FullName

    # Check all keywords
    ForEach ($control In $controls)
    {
        # find the line containing the control string
        $result = $content | Select-String $control -casesensitive
        If ($result)
        {
            # tidy up the results and add to the array
            $line = $result.Line -split ":"
            $results += New-Object PSObject -Property @{
                FileName = $file.FullName
                Control = $line[0].Trim()
                Value = $line[1].Trim()
            }
        }
    }
}

# return the results
$results
Adding the results to a csv is just a case of piping the results to Export-Csv
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line from your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily.
$config = Get-Content ".\config.txt"

# The stuff you are searching for
$keys = @(
    "Operating System",
    "Product Type",
    "Service Pack"
)

foreach ($line in $config)
{
    $keys | %{
        $regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
        if ($line -match $regex)
        {
            $value = $matches.value
            Write-Host "Key: $_`t`tValue: $value"
        }
    }
}
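To get from there to the CSV the question asks about, you could collect the matches as objects instead of writing them to the host, and export them in one go. A rough sketch (still for a single file; the results.csv name is just an example, and [regex]::Escape is only there in case a key ever contains regex metacharacters):

$results = foreach ($line in $config)
{
    foreach ($key in $keys)
    {
        if ($line -match "^\s*$([regex]::Escape($key))\s*:\s*(?<value>.*?)\s*$")
        {
            # one object per matched key/value pair
            [pscustomobject]@{ Key = $key; Value = $Matches.value }
        }
    }
}
$results | Export-Csv "results.csv" -NoTypeInformation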