How to remove the first 3 symbols in a text file? - powershell

How to remove the first 3 symbols in a text file with PowerShell and keep the file with the same name?

Just read the file using the Get-Content cmdlet, remove the first three characters using a regex that replaces them with nothing, and finally write the result back using the Set-Content cmdlet:
(Get-Content 'yourfilePath.txt' -Raw) -replace '^...' | Set-Content 'yourfilePath.txt'
Note: You probably want to specify the encoding using the -Encoding parameter when writing the content back to the file.
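For example, a minimal sketch that writes the result back as UTF-8 (the file name and encoding are placeholders; pick the encoding your file actually uses):
(Get-Content 'yourfilePath.txt' -Raw) -replace '^...' | Set-Content 'yourfilePath.txt' -Encoding UTF8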

Related

How to remove white space using PowerShell in multiple text files in the same folder?

The folder name is: c:\home\alltext\
Inside it are 2 text files with different names (each contains extra whitespace that I want to trim):
text1.txt
text2.txt
I don't want to use Notepad++ and process each .txt file one by one, especially if I have more than 2 files.
I tried PowerShell, but it returns the contents of both text1 and text2 together in one single .txt file.
How can I trim them in one command and write each result back to its individual .txt file?
This is my command:
(get-content c:\home\alltext\*.txt).trim() -ne '' | Set-content c:\home\alltext\*.txt
You need to process the input files one by one:
Get-ChildItem c:\home\alltext\*.txt | ForEach-Object {
  Set-Content -LiteralPath $_.FullName -Value (($_ | Get-Content).Trim() -ne '')
}
Note that PowerShell never preserves the original character encoding when reading text files, so you may have to use the -Encoding parameter with Set-Content.
As for what you tried:
(get-content c:\home\alltext\*.txt).trim() -ne '' streams the non-blank lines of all files matching wildcard expression c:\home\alltext\*.txt, across file boundaries.
Perhaps surprisingly, not only does Set-Content's (positionally implied) -Path parameter accept wildcard expressions too, it writes the same content (the stringified versions of whatever input it receives) to whatever files happen to match that wildcard expression.
This problematic behavior is discussed in GitHub issue #6729; unfortunately, it was decided to retain the current behavior.
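To illustrate the pitfall (a hypothetical sketch - do not run it against files you care about), the following writes the very same string to every file the wildcard matches, i.e. it overwrites both text1.txt and text2.txt with identical content:
'same content for every matching file' | Set-Content c:\home\alltext\*.txt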

Issues merging multiple CSV files in Powershell

I found a nifty command here - http://www.stackoverflow.com/questions/27892957/merging-multiple-csv-files-into-one-using-powershell that I am using to merge CSV files -
Get-ChildItem -Filter *.csv | Select-Object -ExpandProperty FullName | Import-Csv | Export-Csv .\merged\merged.csv -NoTypeInformation -Append
Now this does what it says on the tin and works great for the most part. I have 2 issues with it however, and I am wondering if there is a way they can be overcome:
Firstly, the merged csv file has CRLF line endings, and I am wondering how I can make the line endings just LF, as the file is being generated?
Also, it looks like there are some shenanigans with quote marks being added/moved around. As an example:
Sample row from initial CSV:
"2021-10-05"|"00:00"|"1212"|"160477"|"1.00"|"3.49"LF
Same row in the merged CSV:
"2021-10-05|""00:00""|""1212""|""160477""|""1.00""|""3.49"""CRLF
So you can see that the first field has lost its trailing quote, other fields have doubled quotes, and the end of the row has an additional quote. I'm not quite sure what is going on here, so any help would be much appreciated!
For dealing with the quotes, the cause of the “problem” is that your CSV does not use the default field delimiter that Import-CSV assumes - the C in CSV stands for comma, and you’re using the vertical bar. Add the parameter -Delimiter "|" to both the Import-CSV and Export-CSV cmdlets.
I don’t think you can do anything about the line-end characters (CRLF vs LF); that’s almost certainly operating-system dependent.
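For example, a sketch of the original merge command with the delimiter added (paths and file names taken from the question):
Get-ChildItem -Filter *.csv | Select-Object -ExpandProperty FullName | Import-Csv -Delimiter '|' | Export-Csv .\merged\merged.csv -NoTypeInformation -Append -Delimiter '|'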
Jeff Zeitlin's helpful answer explains the quote-related part of your problem well.
As for your line-ending problem:
As of PowerShell 7.2, there are no PowerShell-native features that allow you to control the newline format of file-writing cmdlets such as Export-Csv.
However, if you use plain-text processing, you can use multi-line strings built with the newline format of interest and save / append them with Set-Content and its -NoNewLine switch, which writes the input strings as-is, without a (newline) separator.
In fact, to significantly speed up processing in your case, plain-text handling is preferable, since in essence your operation amounts to concatenating text files, the only twist being that the header lines of all but the first file should be skipped; using plain-text handling also bypasses your quote problem:
$tokenCount = 1
Get-ChildItem -Filter *.csv |
  Get-Content -Raw |
  ForEach-Object {
    # Get the file content and replace CRLF with LF.
    # Include the first line (the header) only for the first file.
    $content = ($_ -split '\r?\n', $tokenCount)[-1].Replace("`r`n", "`n")
    $tokenCount = 2 # Subsequent files should have their header ignored.
    # Make sure that each file's content ends in a LF.
    if (-not $content.EndsWith("`n")) { $content += "`n" }
    # Output the modified content.
    $content
  } |
  Set-Content -NoNewLine ./merged/merged.csv # add -Encoding as needed.

How can we add new rows at the top of the CSV file using powershell script?

I am new to PowerShell scripting and I am looking for a way to add 2 new rows at the top of an already present CSV file.
What I have tried is replacing the header and rows with the new rows.
I am looking for a way to add 2 new rows above the header in CSV.
You mention that you want to add the new lines above the header, which means that no CSV-specific processing is needed - it sounds like you're asking how to prepend lines to an existing text file (which happens to contain CSV - note that the resulting file will no longer be a valid CSV file).
E.g., assuming a target file named some.csv:
Note: Best to make a backup of the target file before trying these commands.
If the input file is small enough to fit into memory as a whole:
Reading the entire target file into memory as a single string with Get-Content -Raw allows for a convenient and concise solution:
Set-Content -LiteralPath some.csv -NoNewLine -Value (
@'
New line 1 above header
New line 2 above header
'@ + "`n" + (Get-Content -Raw some.csv) # the "`n" puts the original header on its own line
)
Note that Set-Content applies a default character encoding (the active ANSI code page in Windows PowerShell, UTF-8 without BOM in PowerShell Core), irrespective of the current encoding of some.csv, so you may have to use the -Encoding parameter to specify the encoding explicitly.
Also note that the single-quoted here-string (@'<newline>...<newline>'@) uses the same newline style (CRLF (Windows-style) vs. LF (Unix-style)) as the enclosing script, which may not match the style used in some.csv - though PowerShell itself has no problem processing files with mixed newline styles.
If the file is too large to fit into memory, use a streaming (line-by-line) approach:
$ErrorActionPreference = 'Stop'
# Create a temporary file and fill it with the 2 new lines.
$tempFile = [IO.Path]::GetTempFileName()
'New line 1 above header', 'New line 2 above header' | Set-Content $tempFile
# Then append the CSV file's lines one by one.
Get-Content some.csv | Add-Content $tempFile
# If that succeeded, replace the original file.
Move-Item -Force $tempFile some.csv
Note: Use of the Get-Content, Set-Content and Add-Content cmdlets is convenient, but slow; the next section shows a faster alternative.
If performance matters, use .NET types such as [IO.File] instead:
$ErrorActionPreference = 'Stop'
# Create a temporary file...
$tempFile = [IO.Path]::GetTempFileName()
# ... and fill it with the 2 new lines.
$streamWriter = [IO.File]::CreateText($tempFile)
foreach ($lineToPrepend in 'New line 1 above header', 'New line 2 above header') {
  $streamWriter.WriteLine($lineToPrepend)
}
# Then append the CSV file's lines one by one.
foreach ($csvLine in [IO.File]::ReadLines((Convert-Path some.csv))) {
  $streamWriter.WriteLine($csvLine)
}
$streamWriter.Dispose()
# If that succeeded, replace the original file.
Move-Item -Force $tempFile some.csv

Batch File to Find and Replace in text file using whole word only?

I am writing a script which at one point has to check in a text file and remove certain strings. So far I have this:
powershell -Command "(gc myFile.txt) -replace 'foo', 'bar' | Out-File -encoding ASCII myFile.txt"
The only problem is that this can find and replace, but will not remove the line altogether.
The second problem is that if, say, I am removing the line that has Mark, it must not remove a line that has something like Markus.
I don't know if this is possible with the PowerShell interface?
Your current code will only replace foo with bar; that is what -replace does.
Removing the whole line if it matches requires a different approach, almost backwards: you can use -notmatch to output only the lines that do not match your filter, effectively removing the matching ones.
Also, using regex word boundaries (\b) will then match Mark but not Markus:
(Get-Content file.txt) | Where-Object {$_ -notmatch "\bMark\b"} | Set-Content file.txt
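If you need to keep it as a one-liner inside your batch file, a sketch based on your original call (same file name and ASCII encoding assumed from your example) could look like this:
powershell -Command "(gc myFile.txt) | Where-Object {$_ -notmatch '\bMark\b'} | Out-File -encoding ASCII myFile.txt"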

Multiple powershell scripts outputting to the same text file

I have a set of PS scripts, which I will call parents, that invoke other PS scripts, which I will call children. They all need to write to the same text file, which I will call myoutputfile.txt. The first output of the parent script should clear myoutputfile.txt. All subsequent child PS scripts should append to myoutputfile.txt, including another parent script later in the logic.
Using Out-File, the parent PS script opens and locks the file myoutputfile.txt. The children PS scripts need to append to the file, but because the parent created the text file, the children PS scripts fail silently to append to it.
I have tried Out-File and Add-Content. Add-Content puts out Chinese characters to the text file.
Either Out-File with -Append or Add-Content would work.
Unless you are actually trying to write Chinese, the encoding could be wrong.
$File = "PathTo\test.txt"
Get-Date | Out-File $File -Encoding ascii               # creates the file as ASCII
"TEXT" | Out-File $File -Append -Encoding utf32         # appends as UTF-32
Add-Content -Path $File -Value "END" -Encoding Unicode  # appends as UTF-16 LE
As you can see in the file, the result varies depending on encoding.
If you're getting Chinese characters, my guess is that the bytes are being interpreted with the wrong encoding.
Likely conversion problems earlier on.
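As a sketch of the fix (the path and the utf8 choice are placeholders; the point is that the parent and every child use the same encoding):
# Parent script: create/overwrite the file.
$File = "PathTo\myoutputfile.txt"
"Parent starting" | Out-File $File -Encoding utf8
# Child scripts: append using the same encoding.
"Child output" | Out-File $File -Append -Encoding utf8
# or, equivalently:
Add-Content -Path $File -Value "Child output" -Encoding utf8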
It took a combo of Peter Schneider's comment and Martin van Delft's answer to get it working.