Notepad opens stale file after Set-Content - powershell

I have the following code in a script:
Set-Content -Path $myFile -Value $myContent
Later in the script, I wish to show the end result by opening the file in Notepad:
notepad $myFile
I find that the file Notepad opens still shows the original content, not the content I set in the script. The file on disk actually has the correct content, so it's just a matter of Notepad getting a stale view of it.
I've only noticed this behavior since I started working with a large file, so I think I need to flush or wait after setting the content.
One workaround I found is to use the -PassThru parameter:
$dummy = Set-Content -Path $myFile -Value $myContent -PassThru
However, I find that this takes quite a long time. Is this the trade-off I have to deal with?
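For what it's worth, here is a sketch of an alternative that avoids -PassThru echoing the whole content back (an assumption on my part, not a confirmed fix): write the file through .NET, which returns only once the content has been written and the handle closed, then launch Notepad.
# .NET resolves relative paths against the process working directory,
# so resolve to a full path first (this assumes the file already exists
# and that $myContent is a single string).
$fullPath = Convert-Path -LiteralPath $myFile
[System.IO.File]::WriteAllText($fullPath, $myContent)
notepad $fullPath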

Related

Need Powershell command to avoid memory error

I need to update a file and remove any string that says "ENCRYPTED = YES". The command below works on a test file, but the actual file is 250GB and it runs out of memory. Is there a way to go through part of the file at a time and make the update?
(Get-Content -path "X:\file1.sql") -replace "ENCRYPTED=YES;", ";" | Set-Content -Path X:\file_updated.sql

powershell - read all .sql files in a folder and save them all into a single .sql file without changing line ends or line feeds

I manage database servers and often I have to apply scripts into different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run in the target server\database.
As I have been looking at automating this task, I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, and that is exactly what I needed. However, I stumbled over a few issues:
I don't know the file names
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried to combine all the .sql files into one big file, but when I copied the output from PowerShell into SQL, the lines got messed up in many of the procedures that had long lines.
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single the_scripts_to_run.sql file, without changing the line endings, that would be perfect.
I don't need to use get-content or any command in particular, I just would like to get all my scripts into a big single script with everything in it, without changes.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file but I want to get it done via powershell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines() or [System.IO.File]::ReadAllText() instead, but this should work too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName
$merged = [system.collections.generic.list[string[]]]::new()
foreach($script in $scripts)
{
$merged.Add((Get-Content $script))
}
$merged | Out-File "$path\mergedscripts.sql"
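One caveat: in Windows PowerShell, Out-File defaults to UTF-16 (Unicode) encoding, so if whatever consumes the merged file expects something else, name the encoding explicitly:
$merged | Out-File "$path\mergedscripts.sql" -Encoding utf8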
This is actually much simpler than the proposed solutions. Get-Content takes a list of paths and supports wildcards, so no loop is required, and -Raw reads each file as one single string instead of an array of lines, which is what preserves the original line endings.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
Looks like @Santiago and I had the same idea:
Get-ChildItem -Path "$path" -Filter "*.sql" | ForEach-Object -Process {
    # Append each script to the combined file as UTF-8.
    Get-Content $_.FullName | Out-File "$path\stuff.txt" -Append -Encoding utf8
}

Powershell: copy file without locking

I created a simple Nagios plugin, check_log.ps1, to check a log file on a Windows machine. It works by making a copy of the log's content, and on the next run it looks for a specified string in the difference between the copy and the original log file.
The problem is that at random moments check_log.ps1 locks the log file, which causes the application that writes the log to stop.
Generally the plugin uses the original log file in two places:
# compare content of $Logfile and $Oldlog, save diff to $tempdiff
Compare-Object -ReferenceObject (Get-Content -Path $Logfile) -DifferenceObject (Get-Content -Path $Oldlog) | Select-Object -Property InputObject > $tempdiff
# overwrite $Oldlog with the content of $Logfile
Copy-Item $Logfile $Oldlog
I ran a test: in one PS session I ran while($true) { [string]"test" >> C:\test\test.log }, and in a second session I ran the plugin: C:\test\check_log.ps1 C:\test\test.log C:\test\Old_log.log test
I'm not fully sure my test is correct, but I think the Copy-Item command causes the problem. When I comment out that line in the script, I don't see any errors in either terminal. I tested some custom file-copy functions I found on the internet, but I didn't find a solution to my problem.
Do you have any idea how to make this work reliably?
If you think Copy-Item is locking the file, try reading the content and then saving it to another location. Something like this:
Get-Content $Logfile | Set-Content $Oldlog
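If plain Get-Content still interferes with the writer, a lower-level sketch (assuming $Logfile holds a full path, since .NET resolves relative paths against the process working directory rather than the PowerShell location) is to open the log explicitly with a share mode that lets the other process keep appending:
# Open for reading while still allowing the writer to read and write the file.
$stream = [System.IO.File]::Open($Logfile, [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::ReadWrite)
try {
    $reader = New-Object System.IO.StreamReader($stream)
    Set-Content -Path $Oldlog -Value $reader.ReadToEnd()
}
finally {
    $stream.Dispose()
}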

Powershell send-mailmessage attachment different than source file after calling function

#robocopy backup scripts on each server copy their log files to one of two locations: \\backup_server\backup\logs\success if the backup is
#successful, or \\backup_server\backup\logs\fail if the backup fails. This script looks at those two locations to check for failures or
#missing log files, then reports them to the helpdesk.
Function test-servers {
    param ($serverlist)
    foreach ($server in $serverlist) {
        if (test-path \\backup_server\backup\logs\success\$server) {
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $server -force
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $success -force
        }
        elseif (test-path \\backup_server\backup\logs\fail\$server) {
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $server -force
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $failure -force
        }
        else {
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $server -force
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $missing -force
        }
    }
}
$date = get-date
$loglocation = "The log file archive can be found in \\backup_server\backup\logs\archive\"
$filename = (get-date).tostring("MM-dd-yyyy")
$success = "Error log found, backup was successful."
$failure = "Error log found, backup was completed with errors."
$missing = "Error log not found, check server."
[System.Collections.ArrayList]$Serverlist = Get-Content \\backup_server\backup\logs\serverloglist.txt
remove-item \\backup_server\backup\logs\errorlog.txt
new-item -path \\backup_server\backup\logs\ -name "errorlog.txt" -itemtype "file"
add-content -path \\backup_server\backup\logs\errorlog.txt -value $date -force
test-servers -serverlist $serverlist
add-content -path \\backup_server\backup\logs\errorlog.txt -value $loglocation -force
#email out log file and clean up file system
send-mailmessage -from "no-reply@example.com" -to "helpdesk@example.com" -smtpserver "mail.example.com" -port 25 -subject "Nightly Robocopy Log Files" -Attachments "\\backup_server\backup\logs\errorlog.txt" -body "Nightly server backup completed, see attachment for details"
exit
Background: I work for a small company IT department, and we’re managing around two dozen generic file servers which perform simple robocopy batch backups every night. No problems there. In an effort to limit the number of “Server_1 has completed its nightly backup” tickets to our helpdesk, I’ve written a quick-and-dirty script that we use to confirm the completion of our nightly backups. All of the log files from our nightly backups are copied to a “success” or “fail” folder, and this script checks each location, notes which files are in which folder, and emails a single ticket. Then the backup logs that failed are copied to an archive folder for the current date, and a copy of the primary log file is copied along with them. This system works fantastically; the attached log file has all of the proper information about which servers failed or succeeded.
So here is my dilemma. The log file that is attached to the send-mailmessage email isn’t the same log file as the source file! The log file left over in the \\backup_server\backup\logs\ directory has the current date and time on line 1 (although a few seconds after the one in the log file sent via email), as does the copy sent by send-mailmessage, but it says that every server’s nightly log is missing (see image). So somewhere along the line, two log files are being created with the exact same file name and file path: the correct log first, and the incorrect log seconds later. The only way I can see this happening is if there’s a problem with my function, but it’s only called once, so I don’t see how it could write two different files. The script is called by the Task Scheduler on the server and runs only once each day according to the task history, reinforced by the fact that we only receive one email from the script each day.
For the sake of brevity, I’ve removed the end of the script where the old logs are copied and cleaned up, which consists of a few lines of copy- and remove-item. I’ve gone over the man pages for each cmdlet in the script and delved into most every post involving send-mailmessage and use of functions on SE.
How can a script that only calls the new-item cmdlet once to create a single file instead create two separate instances of the file? Is it an issue with my function, or does send-mailmessage have unknown-to-me effects on its attachments?
As this is my first post on the SE network, I understand it may not conform to all of the rules. I’ve done my best to go over the FAQs and keep the post on topic. If the question can be improved in any way, your advice is welcome.
Log file example: on the left is the file we receive as the email attachment; on the right is the file left at \\backup_server\backup\logs\errorlog.txt.
Edit: I updated the script, removing the function at the beginning and simply placing the foreach loop where the function was being called to confirm it was not an issue with the function call within the script.
I adjusted the script a bit, and the task ran in the morning as scheduled; the issue persisted. But after running the script manually for a few tests, both the log in the archive and the emailed log were the same! So it appears the error was not with the script itself, but with the way I am calling it in the Task Scheduler. As scheduling a task to run a PowerShell script is a completely different question, I'll consider this one resolved. I'll leave the question up for a week and then delete it, in case any of the 3 people who have seen it are interested in the solution.
Edit: So it appears it was entirely user error! And the worst kind of user error. I had the task set up on 2 different servers, one running a slightly older version of the script, running on the same schedule. So the tasks ran, one overwrote the logs from the other, and I was left with the aforementioned headache. Deleted the second task and now everything works properly! Hooray for stupid mistakes.

Powershell add header record

I have a process in SSIS where I create three files.
Header.txt
work.txt
Trailer.txt
Then I use an Execute Process Task to call my Powershell script. I basically need to take the work.txt file and prepend the header record to it (while maintaining integrity of original values in work.txt) and then append the trailer record (which is generated with total row counts, etc.).
Currently I have:
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
Clear-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Get-Content Header.txt, work.txt, Trailer.txt | Out-File "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility" -Confirm
This was fine in testing, where I only had 1,000 rows, but now that I have 67,000 rows the process takes forever.
I was looking at the Add-Content cmdlet but I can't find an example where it adds the header. Can someone assist with the syntax on going to the first line in the file and then adding the content before that first line?
many thanks in advance!
Just to clarify: I would like to build off the work.txt file. This is where the majority of the data already is, so instead of rewriting it all to a new file, I think a copy would make more sense. So in theory I would create all three files, copy the work file to, say, workfile.txt, prepend the header to workfile.txt, append the trailer, then rename it.
UPDATE
This seems to work for the trailer.
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
#Clear-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Copy-Item work.txt workfile.txt
#Get-Content Header.txt, work.txt, Trailer.txt|out-file "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Add-Content workfile.txt -value (get-content Trailer.txt)
UPDATE
Also tried:
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
$header = "H:\Documentation\Projects\CVS\StageCVS\Header.txt"
#Clear-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility.txt"
Copy-Item work.txt workfile.txt
#(Get-Content Header.txt, work.txt, Trailer.txt -readcount 1000)|Set-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Add-Content workfile.txt -value (get-content Trailer.txt)
.\workfile.txt = $header + (gc workfile.txt)
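That last line isn't valid PowerShell: a file path can't be the target of an assignment, and $header holds a path rather than the header's content. What it was presumably reaching for, as a sketch, is something like:
# Read the header file, then rewrite workfile.txt with those lines in front.
Set-Content -Path .\workfile.txt -Value ((Get-Content $header) + (Get-Content .\workfile.txt))
This still reads the whole work file into memory, which is why the stream-based approach in the answer below is so much faster.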
This is something that seems easy, but because of how the underlying filesystem works, the reality is that it is not. You are going to need a buffer or a temp file, or, if you are really brave, you can look at extending the file and transposing the characters, as this guy did in C#:
Insert Text into Existing Files in C#, Without Temp Files or Memory Buffers
http://www.codeproject.com/Articles/17716/Insert-Text-into-Existing-Files-in-C-Without-Temp
So as it turns out, Out-File and Get-Content are not very performant. I found that it was taking over 5 minutes to write and read the data for a 5,000-record result set.
When I researched different performance options for PowerShell, I found the .NET StreamWriter class. The same process ran in under 15 seconds.
Since my result set in the production environment would be 70,000 to 90,000 records, this was the approach I took.
Here is what I did:
[IO.Directory]::SetCurrentDirectory("H:\Documentation\Projects\CVS\StageCVS")
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
# Archive yesterday's output, then start from an empty file.
Copy-Item ".\ELIGFINAL.txt" "H:\Documentation\Projects\CVS\StageCVS\archive\ELIGFINAL$(Get-Date -f yyyyMMdd).txt"
Clear-Content "H:\Documentation\Projects\CVS\StageCVS\ELIGFINAL.txt"
# Build the work file: the body first, with the trailer appended.
Copy-Item work.txt workfile.txt
Add-Content workfile.txt -value (Get-Content Trailer.txt)
$output = "H:\Documentation\Projects\CVS\StageCVS\ELIGFINAL.txt"
$readerwork = [IO.File]::OpenText("H:\Documentation\Projects\CVS\StageCVS\workfile.txt")
$readerheader = [IO.File]::OpenText("H:\Documentation\Projects\CVS\StageCVS\Header.txt")
try
{
    $wStream = New-Object IO.FileStream $output, 'Append', 'Write', 'Read'
    $writer = New-Object System.IO.StreamWriter $wStream
    # Header first, then the work file (which already ends with the trailer).
    $writer.WriteLine($readerheader.ReadToEnd())
    $writer.WriteLine($readerwork.ReadToEnd())
    $writer.Flush()
}
finally
{
    $readerheader.Close()
    $readerwork.Close()
    if ($writer) { $writer.Close() }
}
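For what it's worth, the same idea can be written as a single pass with one writer. This is just a sketch, assuming the same file names and folder as above, and that each part already ends with a line break:
# Stream Header.txt, work.txt, Trailer.txt into the output in order,
# so nothing ever has to be prepended in place.
$outFile = "H:\Documentation\Projects\CVS\StageCVS\ELIGFINAL.txt"
$writer = [IO.File]::CreateText($outFile)
try {
    foreach ($part in 'Header.txt', 'work.txt', 'Trailer.txt') {
        $reader = [IO.File]::OpenText("H:\Documentation\Projects\CVS\StageCVS\$part")
        try { $writer.Write($reader.ReadToEnd()) } finally { $reader.Close() }
    }
}
finally {
    $writer.Close()
}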