PowerShell - check .ini file, log issues

Having trouble with two VNC servers that keep switching off the forced "MS Logon Groups" setting. I'm troubleshooting the issue, and one thing I want to do is monitor the config .ini file. I'm relatively new to PowerShell and can't quite get this to work.
Basically, I want the script to check the contents of the configuration file (ultravnc.ini) and see if "MSLogonRequired=1" is a string in that file. If not, I want to append the date to a log file. Eventually I'll do some more with this, but this is my basic need. It's not currently working.
# Variables
$outputFile = "vncMSLogonErrors.txt"
$vncConfig = "C:\Program Files (x86)\uvnc bvba\UltraVNC\ultravnc.ini"
$checkString = "MSLogonRequired=1"
# Get VNC Config File, check for MS Logon setting, write date to file if missing
Get-Content $vncConfig
If (-not $checkString)
{Add-Content $outputFile -Value $(Get-Date)}

Shamus Berube's helpful answer is conceptually simple and works well, if you can assume:
that the line of interest is exactly MSLogonRequired=1, with no variations in whitespace.
that, if the INI file is subdivided into multiple sections (e.g., [admin]), the key name MSLogonRequired is unique across the sections, to prevent false positives.
It is therefore generally preferable to use a dedicated INI-file-parsing command; unfortunately:
PowerShell doesn't come with one, though adding one is being debated
in the meantime you can use the popular PsIni third-party module (see this answer for how to install it and for background information):
Using the PsIni module's Get-IniContent function:
Note: Based on the UltraVNC INI-file documentation, the code assumes that the MSLogonRequired entry is inside the [admin] section of the INI file.
# Variables
$outputFile = "vncMSLogonErrors.txt"
$vncConfig = "C:\Program Files (x86)\uvnc bvba\UltraVNC\ultravnc.ini"
# Check the VNC Config File to see if the [admin] section's 'MSLogonRequired'
# entry, if present, has value '1'.
if ((Get-IniContent $vncConfig).admin.MSLogonRequired -ne '1') {
    Add-Content $outputFile -Value (Get-Date)
}

# Variables
$outputFile = "vncMSLogonErrors.txt"
$vncConfig = "C:\Program Files (x86)\uvnc bvba\UltraVNC\ultravnc.ini"
$checkString = "MSLogonRequired=1"
if ((Get-Content $vncConfig) -notcontains $checkString) { Add-Content $outputFile -Value $(Get-Date) }
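If you want to stay with the plain line-comparison approach but guard against the whitespace variations mentioned above, a regex-based variant is one option (a sketch, not part of either original answer):
# Sketch: tolerate optional whitespace around '=' when looking for the setting.
if (-not ((Get-Content $vncConfig) -match '^\s*MSLogonRequired\s*=\s*1\s*$')) {
    Add-Content $outputFile -Value (Get-Date)
}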

Get-Content works when run locally; returns nothing when run on file with length reported on deployment server

I've inherited a deployment script that reads through files with specific names and extensions deployed to our on-prem web servers, looks line-by-line in those text files for string matches that signify they need replacing, and then inserts, in this case, the correct database and catalog names for the specific server being deployed to.
Unfortunately, though the script works fine On My Computer (c)1984, it's not working when run by TFS as part of the deploy process.
I've added an embarrassing amount of debug info, but haven't been able to track down the issue.
Code
Here's the pertinent code:[1]
$getFilesParams = @{
    Path    = $SearchDirectory
    Include = @(
        "*.asa", "*.asp",
        "*.aspx", "*.config",
        "*.inc", "*.ini",
        "appsettings.json"
    )
    Exclude = "Web.*.config"
    Recurse = $True
}
$files = @(Get-ChildItem @getFilesParams)

foreach ($f in $files) {
    # DEBUG STARTS HERE
    Write-Feedback "`tFile to search for Data Source: $f" -LogFile "$LogFile"
    if ($f.Name -eq "appsettings.json") {
        try {
            Write-Feedback "`t`tFound the appsettings.json file: $($f.Length) $($f.LastWriteTime) $($f.Name)" -LogFile $LogFile
            Get-Content $f | % {
                Write-Feedback "!!!! Line: $_"
            }
            Select-String $f -Pattern "Data Source" | % { Write-Feedback "`t`t`tFound data source: $_" }
            # was suspicious it'd work the first but not the second time. No, it fails each time I Get-Content.
            Get-Content $f | % {
                Write-Feedback "#### Line: $_"
            }
        }
        catch {
            Write-Feedback "An error occurred with appsettings.json:" -LogFile $LogFile
            Write-Feedback $_ -LogFile $LogFile
        }
    }
    # DEBUG ENDS
}

$files = $files |
    Select-String -Pattern "Data Source" |
    Group-Object path |
    Select-Object name

$count = $files.count

if ($count)
{
@"
Found $count files....
Processing:
"@ | Write-Feedback -LogFile $LogFile
    # etc etc he fixes the cable
}
else
{
    ' Did not find any databases catalogs!' | Write-Feedback -LogFile $LogFile
}
Then we go line by line through the files in $files. The problem is that my appsettings.json file, which does contain Data Source (okay, it's lowercase right now -- data source), doesn't get captured and no replacements are made.
Note: Write-Feedback is a convenience function that writes to the console (Write-Host) and to a file, if one is given.
Local output
When I run locally, I get what I'm expecting (edited a bit for brevity):
File to search for Data Source: C:\path\appsettings.json
Found the appsettings.json file: 993 01/12/2022 13:04:52 appsettings.json
!!!! Line: "SomeJsonKey": "data source=localhost;initial catalog=SomeDb;integrated security=True",
Found data source: C:\path\appsettings.json:9: "SomeJsonKey": "data source=localhost;initial catalog=SomeDb;integrated security=True",
#### Line: "SomeJsonKey": "data source=localhost;initial catalog=SomeDb;integrated security=True",
Found 1 files....
Processing:
C:\path\appsettings.json....
Production output
But when it's run as part of the deployment, I get...
File to search for Data Source: E:\path\appsettings.json
Found the appsettings.json file: 762 01/14/2022 15:15:02 appsettings.json
Did not find any databases catalogs!
So it sees appsettings.json, it even knows appsettings.json has a length (they are different files, so the two lengths here aren't an issue), but it won't Get-Content the file, much less find the line with Data Source in it.
Notepad++ says the file is ANSI, which is fine, I think. There are no extended characters, so that's effectively the same as UTF8, which is what I expected. I don't think that would break Get-Content.
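One debug step I could add (a sketch; the path is just the illustrative one from the output above) would be to dump the first few bytes of the file from inside the deployment step, to see how it's actually encoded there:
# Debug sketch: show the first bytes so the real encoding/BOM is visible in the log.
# EF BB BF = UTF-8 BOM, FF FE = UTF-16 LE BOM; plain ANSI/UTF-8 has no BOM.
$firstBytes = [System.IO.File]::ReadAllBytes('E:\path\appsettings.json')[0..3]
Write-Feedback ("First bytes: " + (($firstBytes | ForEach-Object { '{0:X2}' -f $_ }) -join ' ')) -LogFile $LogFile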
To be clear, I'm not necessarily looking for an answer. Just wondering what my next debug step(s) would be.
I've looked up a little about permissions, though I'm not sure how I'd tell which entry from (Get-Acl $Outfile).Access represents the process that's currently running. But I also would expect an exception of some kind if it can't read the file.
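A sketch of what I have in mind for that (again, the path is illustrative) would be to log the identity the step runs under next to the ACL entries, so they can be matched up:
# Debug sketch: log the current identity and the file's ACL entries side by side.
$who = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
Write-Feedback "Running as: $who" -LogFile $LogFile
(Get-Acl 'E:\path\appsettings.json').Access | ForEach-Object {
    Write-Feedback "ACL: $($_.IdentityReference) $($_.FileSystemRights) $($_.AccessControlType)" -LogFile $LogFile
}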
Just for fun, here's a picture of the release UI for the TFS server this is running on. If you've seen it before, you'll get it.
I was hoping I could figure this out through straight PowerShell, but I can check things out on the TFS side to a degree if that's useful. That said, I don't have full perms on the TFS site. I probably have more running as... whoever I'm running as when the PowerShell (third step) is executed.
(sorry for all the redaction; almost certainly not necessary, but better safe than sorry (sorrier?), etc?)
[1] I realize this code can be optimized. Please ignore that; it's, again, inherited, and I'm trying to get it to work with a minimum of churn.

powershell - read all .sql files in a folder and save them all into a single .sql file without changing line ends or line feeds

I manage database servers and often I have to apply scripts to different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run against the target server\database.
As I have been looking at automating this task, I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, and that is exactly what I needed; however, I stumbled over a few issues:
I don't know the file names, so I can't hard-code them the way the SSMS approach expects:
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried to concatenate all the .sql files into one big script, but when I copied the output from PowerShell into SQL, the lines in procedures with long lines got messed up:
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single the_scripts_to_run.sql file, without changing the line endings, that would be perfect.
I don't need to use get-content or any command in particular, I just would like to get all my scripts into a big single script with everything in it, without changes.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file but I want to get it done via powershell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines( ) or [System.IO.File]::ReadAllText( ), but this should work fine too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName
$merged = [system.collections.generic.list[string[]]]::new()
foreach($script in $scripts)
{
$merged.Add((Get-Content $script))
}
$merged | Out-File "$path\mergedscripts.sql"
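For completeness, here is a sketch of the [System.IO.File]::ReadAllText( ) variant mentioned above; because the text is never split into lines, each file's original line endings are carried through unchanged:
# Sketch: read each script as raw text and append it to the merged output as-is.
$path = "c:\Path_to_scripts"
$sb = [System.Text.StringBuilder]::new()
foreach ($script in (Get-ChildItem "$path\*.sql" -Recurse -File)) {
    [void]$sb.Append([System.IO.File]::ReadAllText($script.FullName))
}
[System.IO.File]::WriteAllText("$path\mergedscripts.sql", $sb.ToString())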
This is actually much simpler than the proposed solutions. Get-Content takes a list of paths and supports wildcards, so no loop is required.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
Looks like @Santiago and I had the same idea:
Get-ChildItem -Path "$path" -Filter "*.sql" | ForEach-Object -Process {
Get-Content $_.FullName | Out-File $Path\stuff.txt -Append utf8
}

Powershell: copy file without locking

I created a simple Nagios plugin, check_log.ps1, to check a log file on a Windows machine. It works by making a copy of the log's content and, on the next run, looking for a specified string in the difference between the copy and the original log file.
The problem is that sometimes, at random moments, check_log.ps1 locks the log file, which causes the application that writes the log file to stop.
Generally the plugin uses the original log file in two places:
# compare content of $Logfile and $Oldlog, save diff to $tempdiff
Compare-Object -ReferenceObject (Get-Content -Path $Logfile) -DifferenceObject (Get-Content -Path $Oldlog) | Select-Object -Property InputObject > $tempdiff
# overwrite file $Oldlog using content of $Logfile
Copy-Item $Logfile $Oldlog
I made a test. In one PS session I ran while($true) { [string]"test" >> C:\test\test.log }; in a second session I ran the plugin: C:\test\check_log.ps1 C:\test\test.log C:\test\Old_log.log test
I'm not fully sure my test is correct, but I think the Copy-Item command causes the problem. When I comment out this line in the script, I don't see any errors in the terminals. I tested some custom file-copy functions that I found on the internet, but I didn't find a solution to my problem.
Do you have an idea how to make it work fully?
If you think Copy-Item is locking the file, try reading the content and then saving it to another location. Something like this:
Get-Content $Logfile | Set-Content $Oldlog
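If the reading side itself turns out to be the problem, another option (a sketch, not from the original answer) is to open the log with an explicit FileShare setting so the writer is never blocked:
# Sketch: read the log through a stream that allows the writer to keep writing.
$fs = [System.IO.File]::Open($Logfile,
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite)
try {
    $reader = New-Object System.IO.StreamReader($fs)
    $reader.ReadToEnd() | Set-Content $Oldlog
}
finally {
    if ($reader) { $reader.Close() }
    $fs.Close()
}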

How to do a copy /b in a powershell script to insert a BOM marker, but as a batch for files that match a filter and changes the ext on output?

REASONS WHY THIS IS NOT A DUPLICATE
Since 3 people have already voted to close, I guess I should explain why this question is not a duplicate:
I cannot use cat or >> as these mess up the encoding of the files, which are UTF8 on input and need to be UTF8-BOM on output.
The linked question does not show how to loop through all files that match a given pattern in a directory, and concatenate a single file to each of the matching files on output, plus give the new file a different extension.
Using Set-Content is not Powershell 6 future-proof, since Set-Content will NOT add a BOM marker. In Powershell 5 and below, it sometimes adds a BOM marker and sometimes not, depending on the configuration settings of the executing user. See 'quick note on encoding' at the end of this article.
So in conclusion I am looking for a solution that uses copy (hence the question title) and does NOT use Cat or Set-Content.
I need to loop through certain files in a given directory and run the following on each file:
copy /b BOMMarker.txt+InputFile.dat OutputFile.txt
This inserts the contents of the BOMMarker.txt file at the start of the InputFile.dat and writes the output to OutputFile.txt
I found this question which explains how I can loop through the folder to load each file into Powershell, but how do I apply the "copy /b" command so that I can get the BOM marker at the start of each file?
EDIT
The comment from Jeroen indicates I can just do Set-Content on the output file, as Powershell will automatically add the BOM at the start.
But I also need to change the extension. So the output filename needs to be the same as the input filename, just with a changed extension (from .dat to .txt) and including the BOM.
I am guessing I can use Path.ChangeExtension somehow to do this, but not sure how to combine that with also adding the BOM.
EDIT - for Bounty
The example answer I posted does not work in all environments I tested it in, and I do not know why (possibly different default PowerShell settings), but also, it is not future-proof, since PowerShell 6 will not output a BOM by default.
From the given directory, I need to process all files that match the filter (DIL_BG_TXN*.dat).
For each of those files, I need to copy it with a BOM at the start but the resultant new file needs to be the same name but with the extension .txt instead of .dat.
This solution uses streams, which reliably read and write the bytes as-is:
$bomStream = [IO.File]::OpenRead('BOMMarker.txt')
$location = "" # set this to the folder location
$items = Get-ChildItem -Path $location -Filter DIL_BG_TXN*.dat
foreach ($item in $items) {
    $sourceStream = [IO.File]::OpenRead($item.FullName)
    $targetStream = [IO.File]::OpenWrite([IO.Path]::ChangeExtension($item.FullName, '.txt'))
    $bomStream.CopyTo($targetStream)
    $sourceStream.CopyTo($targetStream)
    $targetStream.Flush()
    $targetStream.Close()
    $sourceStream.Close()
    $bomStream.Position = 0
}
$bomStream.Close()
Of course, adjust the path of BOMMarker.txt (first line) to match its actual location.
This finally worked:
$Location = "C:\Code\Bulgaria_Test"
$items = Get-ChildItem -Path $Location -Filter DIL_BG_TXN*.dat
ForEach ($item in $items) {
Write-Host "Processing file - " $item
cmd /c copy /b BOMMarker.txt+$item ($item.BaseName + '.txt')
}
Description:
Set the directory location where all the .dat files are.
Load only those files that match the filter into the array $items.
Loop through each $item in the array.
With each $item, call the cmd shell with the copy /b command to concatenate the BOM marker file with the $item file, and write the result to the base name of $item plus the new extension.

PowerShell script - check multiple PC's for file existence then get that file version

My PowerShell skills are in their infancy so please bear with me.
What I need to do is take a list of PC's from a text file and check for file existence. Once that has been determined, I need to take those PC's that have the file and check the file for its FileVersion. Then in the end, output that to a CSV file.
Here's what I have and I'm not really sure if this is even how I should be going about it:
ForEach ($system in (Get-Content C:\scripts\systems.txt))
if ($exists in (Test-Path \\$system\c$\Windows\System32\file.dll))
{
Get-Command $exists | fl Path,FileVersion | Out-File c:\scripts\results.csv -Append
}
Not bad for a starter script; you got it almost right. Let's amend it a bit. To get the version info we'll just borrow working code from another answer.
ForEach ($system in (Get-Content C:\scripts\systems.txt)) {
    # It's easier to have the file path in a variable
    $dll = "\\$system\c`$\Windows\System32\file.dll"
    # Is the DLL there?
    if (Test-Path $dll) {
        # Yup, get the version info
        $ver = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($dll).FileVersion
        # Write file path and version into a file.
        Add-Content -Path c:\scripts\results.csv "$dll,$ver"
    }
}
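If a proper CSV with headers is wanted, a possible variant (a sketch built on the same logic, not part of the original answer) is to collect objects and hand them to Export-Csv:
# Sketch: emit one object per machine that has the DLL, then export as real CSV.
$results = ForEach ($system in (Get-Content C:\scripts\systems.txt)) {
    $dll = "\\$system\c`$\Windows\System32\file.dll"
    if (Test-Path $dll) {
        [pscustomobject]@{
            Computer    = $system
            Path        = $dll
            FileVersion = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($dll).FileVersion
        }
    }
}
$results | Export-Csv -Path C:\scripts\results.csv -NoTypeInformation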