I created a simple Nagios plugin, check_log.ps1, to check a log file on a Windows machine. It works by keeping a copy of the log and, on the next run, looking for a specified string in the difference between the copy and the original log file.
The problem is that at random moments check_log.ps1 locks the log file, which makes the application that writes the log stop.
The plugin uses the original log file in two places:
# compare content of $Logfile and $Oldlog, save diff to $tempdiff
Compare-Object -ReferenceObject (Get-Content -Path $Logfile) -DifferenceObject (Get-Content -Path $Oldlog) | Select-Object -Property InputObject > $tempdiff
# overwrite file $Oldlog using content of $Logfile
Copy-Item $Logfile $Oldlog
I ran a test. In one PS session I ran while($true) { [string]"test" >> C:\test\test.log }; in a second session I ran the plugin: C:\test\check_log.ps1 C:\test\test.log C:\test\Old_log.log test
I'm not completely sure my test is correct, but I think the Copy-Item command causes the problem. When I comment out this line in the script, I don't see any errors in either terminal. I tested some custom file-copy functions that I found on the internet, but I didn't find a solution to my problem.
Do you have an idea how to make this work reliably?
If you think Copy-Item is locking the file, try reading the content and then saving it to another location. Something like this:
Get-Content $Logfile | Set-Content $Oldlog
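If that still contends with the writer, a lower-level sketch (not from the original answer, and assuming $Logfile and $Oldlog are the same variables the plugin already uses) is to open the file with an explicit share mode, so the logging application can keep appending while we read:
# Open the log read-only, sharing read/write access with the logging app.
$reader = $null
$stream = [System.IO.File]::Open($Logfile,
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite)
try {
    $reader = [System.IO.StreamReader]::new($stream)
    $content = $reader.ReadToEnd()
}
finally {
    if ($reader) { $reader.Dispose() }
    $stream.Dispose()
}
Set-Content -Path $Oldlog -Value $content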
I manage database servers and often have to apply scripts to different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run on the target server\database.
While looking at automating this task I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, which is exactly what I needed. However, I stumbled over a few issues:
I don't know the file names, so I can't hard-code them like this:
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried concatenating all the .sql files into one big script, but when I copied it from PowerShell into SQL, the long lines in many of the procedures got messed up.
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single file, the_scripts_to_run.sql, without changing the line endings, that would be perfect.
I don't need to use Get-Content or any command in particular; I just want to get all my scripts into one big script with everything in it, without changes.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file, but I want to get it done via PowerShell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines( ) or [System.IO.File]::ReadAllText( ) instead, but this should work too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName
$merged = [system.collections.generic.list[string[]]]::new()
foreach($script in $scripts)
{
$merged.Add((Get-Content $script))
}
$merged | Out-File "$path\mergedscripts.sql"
This is actually much simpler than the proposed solutions: Get-Content takes a list of paths and supports wildcards, so no loop is required. And because -Raw reads each file as a single string, the original line endings are preserved.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
Looks like @Santiago and I had the same idea:
Get-ChildItem -Path "$path" -Filter "*.sql" | ForEach-Object -Process {
Get-Content $_.FullName | Out-File $Path\stuff.txt -Append utf8
}
Goal: Update text entry on one line within many files distributed on a server
Summary: As part of an application migration between datacenters the .rdp files on end-user desktops need to be updated to point to the new IP address of their Remote Desktop Server. All the .rdp files reside on Windows servers in a redirected folders SMB share where I have Administrative access.
Powershell experience: minimal. Still trying to wrap my head around the way variables, output and piping work.
I was originally trying to write a single line of PowerShell code to complete this task, but I got stuck and had to make a script file with the two lines of code below.
-Line 1: Search for all .rdp files in the folder structure and store the full path with file name in a variable. Every file will be checked since the users tend to accidentally change file names, eliminating absolute predictability.
-Line 2: I want to make one pass through all the files to replace only instances of two particular IP addresses with the new addresses, then write the changes back into the original files.
$Path = ls 'C:\Scripts\Replace-RDP\TESTFILES\' -Include *.rdp -Recurse -Force -ErrorAction SilentlyContinue | foreach fullname
(Get-Content -Path $Path) -Replace 'IPserver1','newIPserver1' -Replace 'IPserver2','newIPserver2' | Set-Content $Path -Force
I have found most of the solution with PowerShell but have a problem with the results. When the second line of code outputs to the screen, the contents are changed correctly in memory. The content written to file, however, resulted in the new server IP address being written into ALL .rdp files, even if the source file's target IP address doesn't match the -Replace criterion.
Text inside a .rdp on the relevant line is:
full address:s:192.168.1.123
changes to:
full address:s:172.16.1.23
Thank you for all assistance in reaching the endpoint. Have spent hours learning from various sites and code snippets.
You need to keep track of each file that you are reading so that you can save changes back to that file. ForEach-Object makes this process easy: inside the ForEach-Object script block, the current object $_ is the FullName value of each of your files.
$CurrentIP1 = '192\.168\.1\.123'
$CurrentIP2 = '192\.168\.1\.124'
$NewIP1 = '172.16.1.23'
$NewIP2 = '172.16.1.24'
$files = (Get-ChildItem 'C:\Scripts\Replace-RDP\TESTFILES\' -Filter *.rdp -Recurse -Force -File -ErrorAction SilentlyContinue).FullName
$files | ForEach-Object {
    if (($contents = Get-Content $_) -match "$CurrentIP1|$CurrentIP2") {
        $contents -replace $CurrentIP1,$NewIP1 -replace $CurrentIP2,$NewIP2 |
            Set-Content $_
    }
}
Note that using the -File switch on Get-ChildItem (alias ls) outputs only files. Since -replace uses regex to do its matching, literal . characters must be backslash-escaped.
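As a side note (not part of the original answer), [regex]::Escape can build those patterns from plain strings, so the dots don't need to be escaped by hand:
# Produces the same escaped patterns, e.g. 192\.168\.1\.123
$CurrentIP1 = [regex]::Escape('192.168.1.123')
$CurrentIP2 = [regex]::Escape('192.168.1.124')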
I have the following code in a script:
Set-Content -Path $myFile -Value $myContent
Later in the script, I wish to show the end result by opening the file in Notepad:
notepad $myFile
I find that the file Notepad opens has the original content, not the content I set in the script. The actual file on disk is correct, so it's just a matter of stale file content.
I've only noticed this behavior since I started working with a large file, so I think I need to flush or wait after setting the content.
One workaround I found is to use the -PassThru parameter:
$dummy = Set-Content -Path $myFile -Value $myContent -PassThru
However, I find that this takes quite a long time. Is this the trade-off I have to deal with?
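One possible sketch, assuming $myContent is a single string rather than an array: write the file through .NET, which flushes and closes the handle before the call returns, and only then launch Notepad:
# WriteAllText opens, writes, flushes, and closes the file synchronously.
[System.IO.File]::WriteAllText($myFile, $myContent)
notepad $myFile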
I have the following line of code to download a list of files from a text file; it is part of a larger program. Currently, the PS script grabs the files that exist. But if a file does not exist, I would like to pause the script so the user can adjust the URL to point to a good file, then continue. Is this possible? If not, maybe go through the list and output which paths are bad?
Get-Content file.txt | ForEach-Object {Invoke-WebRequest $_ -OutFile C:\test\download\$(Split-Path $_ -Leaf)}
Edit
I will test out the -Method HEAD suggestion, but for now I found the Test-Path cmdlet. I'm still working out the kinks so it doesn't display the whole path, but this seems to work for me, as it prints out what is found and what is missing. I will have to work on adding a pause and re-run step in my program.
Get-Content c:\file.txt | Select-Object @{Name='FileName';Expression={$_}}, @{Name='FolderExist';Expression={Test-Path $_}}
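Building on the -Method HEAD idea above, here is a hypothetical sketch (not from the original post; the warning text and download folder are assumptions carried over from the earlier snippet) that probes each URL before downloading and pauses on failures:
Get-Content file.txt | ForEach-Object {
    $url = $_
    try {
        # A HEAD request fails fast when the file is missing, without downloading it.
        Invoke-WebRequest -Uri $url -Method Head | Out-Null
        Invoke-WebRequest -Uri $url -OutFile "C:\test\download\$(Split-Path $url -Leaf)"
    }
    catch {
        Write-Warning "Could not reach $url"
        Read-Host 'Adjust the URL, then press Enter to continue'
    }
}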
My PowerShell skills are in their infancy so please bear with me.
What I need to do is take a list of PC's from a text file and check for file existence. Once that has been determined, I need to take those PC's that have the file and check the file for its FileVersion. Then in the end, output that to a CSV file.
Here's what I have and I'm not really sure if this is even how I should be going about it:
ForEach ($system in (Get-Content C:\scripts\systems.txt))
if ($exists in (Test-Path \\$system\c$\Windows\System32\file.dll))
{
Get-Command $exists | fl Path,FileVersion | Out-File c:\scripts\results.csv -Append
}
Not bad for a starter script; you got it almost right. Let's amend it a bit. To get the version info we'll borrow working code from another answer.
ForEach ($system in (Get-Content C:\scripts\systems.txt)) {
    # It's easier to have the file path in a variable
    $dll = "\\$system\c`$\Windows\System32\file.dll"
    # Is the DLL there?
    if (Test-Path $dll) {
        # Yup, get the version info
        $ver = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($dll).FileVersion
        # Write the file path and version into the results file.
        Add-Content -Path c:\scripts\results.csv "$dll,$ver"
    }
}
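If a proper CSV with headers is wanted, a hypothetical variant (not from the original answer) is to emit objects and let Export-Csv handle the layout:
Get-Content C:\scripts\systems.txt | ForEach-Object {
    $dll = "\\$_\c`$\Windows\System32\file.dll"
    if (Test-Path $dll) {
        # One object per machine that has the DLL
        [pscustomobject]@{
            Path        = $dll
            FileVersion = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($dll).FileVersion
        }
    }
} | Export-Csv C:\scripts\results.csv -NoTypeInformation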