PowerShell tail multiple files command - powershell

I can tail one file via the following command:
Get-Content -Path C:\log1.txt -Tail 10 -Wait
How do I extend this to multiple files? I have tried the following with no luck:
Get-Content -Path C:\log1.txt,C:\log2.txt -Tail 10 -Wait
This will only pick up updates from the first file, not the second.

Based on @mjolinor's comment, I have come up with the following, which appears to work:
Workflow My-Tail
{
    Param([string[]] $Path)
    foreach -parallel ($file in $Path)
    {
        Get-Content -Path $file -Tail 1 -Wait
    }
}
My-Tail (dir C:\*.log -Include log1.txt,log2.txt)
However, this has some sort of progress bar that appears...
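If the progress display is unwanted, one workaround (untested here, but standard PowerShell behaviour) is to silence the progress stream for the session before invoking the workflow:
# Suppress the progress records that workflows emit by default
$ProgressPreference = 'SilentlyContinue'
My-Tail (dir C:\*.log -Include log1.txt,log2.txt)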

I can't speak to how efficient this is, but since I'm using PowerShell Core 7.1.3 I can't use workflows or foreach -parallel. I can, however, use ForEach-Object -Parallel, so I tried it just to see what would happen:
gci -Path C:\ -Filter log*.txt |
    % -Parallel {
        cat -Wait -Tail 10 -Path $_
    } -ThrottleLimit 30
In my case, I had 27 files I needed to monitor, so I chose a number just above that, and this seemed to work.
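If you'd rather not hard-code the limit, a small variation (my own sketch, assuming the set of files is known up front) sizes -ThrottleLimit from the file count:
# Collect the files once, then throttle to one slot per file plus headroom
$files = gci -Path C:\ -Filter log*.txt
$files | % -Parallel { cat -Wait -Tail 10 -Path $_ } -ThrottleLimit ($files.Count + 1)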
Just to be sure it was working, I used this, which will output the source file name before each line:
gci -Path C:\ -Filter log*.txt |
    % -Parallel {
        $file = $_
        cat -Wait -Tail 10 -Path $file |
            % { write "$($file.Name): ${_}" }
    } -ThrottleLimit 30

I needed tailed output across multiple files and I wanted to try to do it in one line;
here's what I eventually came up with:
gci *.txt -Recurse | ForEach-Object { Write-Output "$_`n"; Get-Content $_ -Tail 5; Write-Output "`n" }
It takes a recursive directory listing of all files named *.txt, writes each file path to the console, then writes the last 5 lines to the console.
I didn't need to follow the tails of the files, they weren't being actively written to.

How to delete a line containing a word quickly in multiple large files using PowerShell
I am using the below code, but it takes a long time:
$files = Get-ChildItem "D:\mjautomation\v19.0\filesdd\"
foreach ($file in $files) {
    $c = Get-Content $file.FullName | where { $_ -notmatch "deletethisline" }
    $c | Set-Content $file.FullName
}
The following should be reasonably fast due to use of switch -File, but note that it requires reading each file into memory as a whole (minus the excluded lines):
foreach ($file in Get-ChildItem -File D:\mjautomation\v19.0\filesdd) {
    Set-Content $file.FullName -Value $(
        switch -Regex -File $file.FullName {
            'deletethisline' {} # ignore
            default { $_ }      # pass line through
        }
    )
}
If you don't want to read each file into memory (almost) in full, use a [System.IO.StreamWriter] instance instead of Set-Content, as shown in this answer: write to a temporary file, then replace the original file.
Doing so has the added advantage of avoiding the small risk of data loss that writing back to the original file via in-memory operations bears.
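For illustration, here is a minimal sketch of that streaming approach; the .tmp naming convention is my own choice, not something prescribed:
foreach ($file in Get-ChildItem -File D:\mjautomation\v19.0\filesdd) {
    $tempFile = "$($file.FullName).tmp"  # hypothetical temp-file name
    $writer = [System.IO.StreamWriter]::new($tempFile)
    try {
        switch -Regex -File $file.FullName {
            'deletethisline' {}                        # skip matching lines
            default { $writer.WriteLine($_) }          # stream everything else to the temp file
        }
    }
    finally {
        $writer.Close()
    }
    # Replace the original with the filtered copy
    Move-Item -LiteralPath $tempFile -Destination $file.FullName -Force
}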
If you want to make do with the - slower - Get-Content cmdlet, use the following; the same caveats as above apply:
foreach ($file in Get-ChildItem -File D:\mjautomation\v19.0\filesdd) {
    Set-Content $file.FullName -Value (
        @(Get-Content $file.FullName) -notmatch 'deletethisline'
    )
}
Note that as an alternative to the foreach loop you can use a single pipeline with the ForEach-Object cmdlet - Get-ChildItem ... | ForEach-Object { <# work with $_ #> } - but doing so is slower (though in many cases that won't matter).
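For completeness, that pipeline form would look like this (same logic as the loop above, just slower):
Get-ChildItem -File D:\mjautomation\v19.0\filesdd | ForEach-Object {
    Set-Content $_.FullName -Value (
        @(Get-Content $_.FullName) -notmatch 'deletethisline'
    )
}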

Trying to write a script to find 0kb files, and overwrite them with files from a different drive on the same path - PowerShell

I am very new to PowerShell and am trying to write a script that can read all 0kb files in a folder, organize them into a list, and then determine where the same files would be on a different backup drive so that they can be replaced.
I have some code that puts it into a list and then attempts to modify the list so that it pulls from the correct drive, however, I cannot get down the file replacement.
An example of what I want this to do is this:
There is a file at C:\Test\test.txt that is 0kb, empty.
The script reads that this file is empty and writes its path to a text document C:\Test2\test2.txt.
The text document is edited so that the text is not "C:\Test\test.txt" 0 but rather H:\Test\test.txt.
I want to then pull that specific line out of the text document and use it as a file path for a replace action with the original C:\Test\test.txt file.
The reason I am trying to do this is a backup/restore gone wrong: some of the files were restored as empty when they have content on the backup drive. Unfortunately there are too many to go through and copy/paste one by one, and since the restore some files have been significantly edited, so I cannot just overwrite everything.
I have the text documents all ready to go and they are being modified as I would like; however, I cannot seem to pull the file paths correctly, nor can I seem to replace the file on the C drive with a file on the H drive.
Code as follows:
#Determines if a file is empty and writes to txt doc
forfiles /S /P c:\test /M *.* /C "cmd /c If #fsize==0 Echo #path #fsize" | Out-File C:\test\TestoutPreRename.txt
#Replaces "C:" with "H:",""" with "", and "0" with ""
$InFile = 'C:\test\TestoutPreRename.txt'
$OutFile = 'C:\test\TestoutPostRename.txt'
filter replace-chars { $_ -replace 'C:','H:' }
if (Test-Path $OutFile) { Clear-Content $OutFile }
Get-Content $InFile -ReadCount 0 |
    replace-chars |
    Add-Content $OutFile

filter replace-chars { $_ -replace '"','' }
if (Test-Path $OutFile) { Clear-Content $OutFile }
Get-Content $OutFile -ReadCount 0 |
    replace-chars |
    Add-Content $OutFile

filter replace-chars { $_ -replace '0','' }
if (Test-Path $OutFile) { Clear-Content $OutFile }
Get-Content $OutFile -ReadCount 0 |
    replace-chars |
    Add-Content $OutFile
#This should determine the file path and perform the copy action while removing preexisting files with the same name in the destination
#This is where I am having issues
foreach ($line in Get-Content C:\test\TestoutPostRename.txt) {
    if ($line -match $regex) {
        if (Test-Path $line) { Remove-Item $line }
        [System.IO.File]::Copy($line)
    }
}
I know it's not clean nor good, but it's the best I've got so far; any critiques are appreciated.
Thanks.
You can do something like this. It's worth reading up on these cmdlets, as they may provide additional properties/parameters that you might find helpful:
Get-ChildItem
Where-Object
Select-Object
Export-Csv
Copy-Item
$Path = "C:\Test"

# Get the files. The Length property represents size - you want files where this is 0
$files = Get-ChildItem -Path $Path -Recurse | Where-Object { $_.Length -eq 0 }

# Export some of the details of the files to a CSV and/or text file.
# FullName is the full path of the file
$files | Select-Object FullName,Length | Export-Csv FilesWith0kb.csv -NoTypeInformation
$files | Out-File FilesWith0kb.txt

# Loop over the list, replacing drive C with H.
# Using Copy-Item with -Force will overwrite the files if they already exist
foreach ($file in $files) {
    $backupFile = ($file.FullName).Replace("C:","H:")
    Copy-Item -Path $backupFile -Destination $file.FullName -Force
}
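One refinement worth considering (my addition, not part of the answer above): test that the backup counterpart actually exists before copying, so a missing backup doesn't produce an error mid-run:
foreach ($file in $files) {
    $backupFile = ($file.FullName).Replace("C:","H:")
    if (Test-Path -LiteralPath $backupFile) {
        Copy-Item -Path $backupFile -Destination $file.FullName -Force
    }
    else {
        Write-Warning "No backup found for $($file.FullName)"
    }
}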

Add Text to file when performing Get-ChildItem [duplicate]

This question already has answers here:
Does PowerShell require the opening curly brace of a script block to be on the same line?
(2 answers)
Closed 5 years ago.
I've been looking around trying to solve this issue but nothing has worked so far.
I want to pick one or multiple files, add a hardcoded value (e.g. "DIAB`r`n") to the start of the file content and then write to another volume.
Do I have to do multiple statements, or is it something I can do in one?
What I have so far uses 2 statements:
# Find the file/s .HL7, move to a new destination and change the filename to add "DIAB-"
# to the front - this is working
Get-ChildItem -Path $DiabetesPath |
    Where-Object { (!$_.PSIsContainer -and $_.Name -like "*.HL7") } |
    Move-Item -Destination $DestinationPath -PassThru |
    Rename-Item -NewName { "DIAB-" + $_.Name }
# this is what is not working - after files moved to destination trying to add some text
# to the start of the file content
Get-ChildItem $DestinationPath -Recurse -Filter DIAB*.HL7 | ForEach-Object
{
    $content = Get-Content $_.FullName
    Set-Content $_.FullName -Value "DIAB`r`n", $content
}
When I try it, it says:
cmdlet ForEach-Object at command pipeline position 2
Supply values for the following parameters:
process[0]:
When I enter something it keeps going:
process[1]:
process[2]:
...
Not sure what is going on.
Wow! I didn't realise the position of { would affect anything.
Moving the { next to ForEach-Object makes it work:
Get-ChildItem $DestinationPath -Recurse -Filter DIAB*.HL7 | ForEach-Object {
    $content = Get-Content $_.FullName
    Set-Content $_.FullName -Value "DIAB`r`n", $content
}
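The parsing rule can be seen in isolation with a trivial pipeline; with the brace on the next line, PowerShell treats ForEach-Object as having no -Process argument and prompts for one:
# Works: the script block is parsed as the -Process argument
1..3 | ForEach-Object { $_ * 2 }

# Prompts for process[0] instead: the opening brace on the next line
# starts a new statement rather than binding to ForEach-Object
1..3 | ForEach-Object
{ $_ * 2 }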

Optimize script for extracting records from multiple files using powershell

I am trying to extract records from multiple files using PowerShell. The script I have written iterates through each file and writes the records matching a pattern to an out file; however, this is taking a long time due to the huge number of files.
I would like to know if this can be optimized.
$files = Get-ChildItem $sourcedirectory\*
for ($i = 0; $i -lt $files.Count; $i++) {
    $outfile = $files[$i].FullName + "_out"
    Get-Content $files[$i].FullName | Select-String -Pattern "OB_[0-9]F_AHU*" | Set-Content $outfile
}
if (!(Test-Path -Path $targetdirectory)) { New-Item $targetdirectory -Type Directory }
Move-Item -Path $sourcedirectory\*_out -Destination $targetdirectory
Could you post some more details about what you are trying to accomplish?
At face value, here is a solution that parses each file in parallel. I'm not sure off the top of my head how many concurrent jobs it will use, but this should get you started down this pathway. Note that foreach -parallel is only valid inside a PowerShell workflow; a wrapper is sketched after the code below.
Try this:
$files = Get-ChildItem $sourcedirectory\*
foreach -parallel ($file in $files) {
    $outfile = $file.FullName + "_out"
    Get-Content $file.FullName | Select-String -Pattern "OB_[0-9]F_AHU*" | Out-File -Append $outfile
}
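Since foreach -parallel only exists inside workflows, the snippet above needs a wrapper along these lines to actually run (a sketch for Windows PowerShell 5.1, where workflows are available; the Extract-Records name is my own):
Workflow Extract-Records {
    Param([string] $SourceDirectory)
    $files = Get-ChildItem $SourceDirectory\*
    foreach -parallel ($file in $files) {
        $outfile = $file.FullName + "_out"
        Get-Content $file.FullName | Select-String -Pattern "OB_[0-9]F_AHU*" | Out-File -Append $outfile
    }
}

Extract-Records -SourceDirectory $sourcedirectory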
As far as your overall goal, sometimes PowerShell is not the best tool for the job. Whenever you want to parse through large amounts of data you should consider dumping that data into a database. You could use something like SQL Express and upload your files one time (the slow operation) and then be able to parse that data drastically faster from then on. Since I don't know what you are trying to accomplish or what your data looks like, I can't give you a good idea as to whether this is worth it in your case.
You can write the new files directly to the target directory instead of moving them from the source directory.
$sourceDir = "C:\users\you\documents\somefiles"
$targetDir = "C:\users\you\documents\somefiles\targetDir"

if (!(Test-Path $targetDir)) {
    New-Item -Path $targetDir -ItemType Directory
}

(Get-ChildItem $sourceDir | Select-String -Pattern "OB_[0-9]F_AHU*") |
    % { New-Item -Path $targetDir -Name ($_.Filename + "_out") -Value $_.Line }
The output of Select-String contains the Filename and Line where each match was found, which is all you need to create new files with New-Item inside the ForEach block %{}.
This is a small improvement, since it avoids the separate Move-Item step.
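One caveat with the New-Item approach (my note, not the answer's): if a file contains more than one matching line, the second New-Item call fails because the _out file already exists. A sketch that groups matches per file first:
Get-ChildItem $sourceDir |
    Select-String -Pattern "OB_[0-9]F_AHU*" |
    Group-Object Filename |
    % { New-Item -Path $targetDir -Name ($_.Name + "_out") -Value ($_.Group.Line -join "`r`n") }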

How to find all .bat files and execute them one by one?

I am still very new to PowerShell and need some help.
I have some .bat files in a folder called c:\scripts\run\
and I want to run them one by one, but I don't know how many I have; it changes from time to time.
So I want to run a loop with foreach like this:
foreach ($file in get-childitem c:\scripts\run | where {$_.extension -eq ".bat"})
But I don't know how to run them now.
I know that I can run them one by one like this:
./run1.bat
./run2.bat
./run3.bat
But how do I implement that?
Thanks!!
Try this:
Get-ChildItem -Path c:\scripts\run -Filter *.bat | % { & $_.FullName }
You can use
& $file.FullName
within your loop.
I would probably just use a pipeline, though, instead of an explicit foreach loop:
Get-ChildItem C:\scripts\run -Filter *.bat | ForEach-Object { & $_.FullName }
If you want additional checks after each batch file ran:
gci C:\scripts\run -fi *.bat | % {
    & $_.FullName
    if (Test-Path C:\scripts\run\blah.log) {
        ...
    }
}
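If you also need to know whether each batch file succeeded, $LASTEXITCODE is populated after the call operator returns (a small sketch, assuming your .bat files set meaningful exit codes):
Get-ChildItem C:\scripts\run -Filter *.bat | ForEach-Object {
    & $_.FullName
    if ($LASTEXITCODE -ne 0) {
        Write-Warning "$($_.Name) exited with code $LASTEXITCODE"
    }
}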