I'm making a script that will find and replace all instances of one word with another. However, I'm unsure how to save the changes.
$file = Get-Content "C:\Script.dat" -Raw
$old = 'oldword'
$new = 'newword'
$file.Replace($old,$new)
Initially I had used the following, but it caused issues.
$file.Replace($old,$new) | Set-Content $file
This produced the error:
Set-Content : Cannot find drive. A drive with the same *some random stuff*...
How would I be able to save the changes and/or fix the above issue?
$file = Get-Content "C:\Script.dat" -Raw
$old = 'oldword'
$new = 'newword'
$file.Replace($old,$new) | Out-File -FilePath C:\Script.dat
You were very close, but Set-Content needs two things: the path to the file and the value to store. Personally, I prefer to overwrite the variable when using the .Replace() method rather than piping the result into other cmdlets.
This will do it:
$file = Get-Content "C:\Script.dat" -Raw
$old = 'oldword'
$new = 'newword'
$file = $file.Replace($old,$new)
Set-Content -Path "C:\Script.dat" -Value $file
If possible, try to avoid storing files directly in C:\, since writing there often requires admin rights.
Additionally, you could pipe to Set-Content much as you did originally, but you still need to give it the path to the file:
$file.Replace($old,$new) | Set-Content "C:\Script.dat"
I'm generating two files, userscript.meta.js and userscript.user.js. I need the output of userscript.meta.js to be placed at the very beginning of userscript.user.js.
Add-Content doesn't seem to accept a parameter to prepend and Get-Content | Set-Content will fail because userscript.user.js is being used by Get-Content.
I'd rather not create an intermediate file if it's physically possible to have a clean solution.
How to achieve this?
The subexpression operator $( ) can evaluate both Get-Content statements, which are then enumerated and sent through the pipeline to Set-Content:
$(
    Get-Content userscript.meta.js -Raw
    Get-Content userscript.user.js -Raw
) | Set-Content userscript.user.js
Consider using the absolute paths of the files if your current directory is not where those files are.
An even simpler approach is to put the paths in the desired order, since both the -Path and -LiteralPath parameters can take multiple values:
(Get-Content userscript.meta.js, userscript.user.js -Raw) |
Set-Content userscript.user.js
And in case you want to get rid of excess leading or trailing whitespace, you can include the String.Trim method:
(Get-Content userscript.meta.js, userscript.user.js -Raw).Trim() |
Set-Content userscript.user.js
Note that in the above examples the grouping operator ( ) is mandatory, as all output from Get-Content must be consumed before it is passed through the pipeline to Set-Content. See Piping grouped expressions for more details.
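As a quick sketch of why the grouping matters (using the same two files as above): without the parentheses, Set-Content opens the target file for writing while Get-Content is still reading from it, so the read fails.

# Fails: Set-Content opens userscript.user.js for writing
# while Get-Content is still reading from it.
Get-Content userscript.meta.js, userscript.user.js -Raw |
    Set-Content userscript.user.js

# Works: the grouping operator runs Get-Content to completion first,
# so the file is closed before Set-Content opens it.
(Get-Content userscript.meta.js, userscript.user.js -Raw) |
    Set-Content userscript.user.js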
For future folks, here's a snippet if you need to prepend the same thing to multiple files:
For example: prepending an #include directive to a bunch of auto-generated C++ files so they work in my Windows environment.
Get-ChildItem -Path . -Filter *.cpp | ForEach-Object {
    $file = $_.FullName
    # The -Raw param was important for me, as the entire file wasn't read
    # properly without it. I even tried [System.IO.File]::ReadAllText and
    # got the same result, so there must have been some character that
    # caused the file read to return prematurely.
    $content = Get-Content $file -Raw
    $prepend = '#include "stdafx.h"' + "`r`n"
    # This could also come from a file:
    # $prepend = Get-Content 'path_to_my_file_used_for_prepending'
    $content = $prepend + $content
    Set-Content $file $content
}
I am using this PowerShell script to replace content in a file:
$lines = Get-Content -Path D:\file.txt -Encoding UTF8 -Raw
$option = [System.Text.RegularExpressions.RegexOptions]::Singleline
$pattern1 = [regex]::new("(\[\.dfn \.term])#(.*?)#", $option)
$lines = $pattern1.Replace($lines, '$1_$2_')
$pattern2 = [regex]::new("(\[what you want])#(.*?)#", $option)
$lines = $pattern2.Replace($lines, '$1*$2*')
It is supposed to find certain content in the file and overwrite the file, but nothing is ever overwritten.
However, if I use the script like this:
$lines = Get-Content -Path D:\file.txt -Encoding UTF8 -Raw
$option = [System.Text.RegularExpressions.RegexOptions]::Singleline
$pattern1 = [regex]::new("(\[\.dfn \.term])#(.*?)#", $option)
$lines = $pattern1.Replace($lines, '$1_$2_')
$pattern2 = [regex]::new("(\[what you want])#(.*?)#", $option)
$lines = $pattern2.Replace($lines, '$1*$2*')
$lines | Set-Content -Path D:\result.txt -Encoding UTF8
The script will create a new file and write the result into it, and it will even overwrite result.txt every time the script is run. But if I explicitly say that I want to write the result back to file.txt, it does not work.
How to make the script overwrite the existing file?
My main tests were in PowerShell version 5.1.19041.610.
There were no explicit errors in the PS window and no changes to the file, but the file is replaced if I add $lines | Set-Content -Path D:\result.txt -Encoding UTF8 -Force.
I also tested it on PowerShell version 7.1.3.0.
There were no explicit errors in the PS window and no changes to the file, but the file is replaced if I add $lines | Set-Content -Path D:\result.txt -Encoding UTF8, without -Force.
Normally, Get-Content opens, reads, and closes the file, so it should not be left locked. You could try:
$lines | Set-Content -Path D:\result.txt -Encoding UTF8 -Force
-Force
Override restrictions that prevent the command from succeeding. Force will replace the contents of a file, even if the file is read-only, but will not override security permissions. Without this parameter, read-only files are not changed.
I have this code that works like a charm for small files. It just dumps the whole file into memory, replaces NUL, and writes back to the same file. This is not practical for huge files whose size is larger than the available memory. Can someone help me convert it to a streaming model so that it won't choke on huge files?
Get-ChildItem -Path "Drive:\my\folder\path" -Depth 2 -Filter *.csv |
    ForEach-Object {
        $content = Get-Content $_.FullName
        # Replace NUL and save content back to the original file
        $content -replace "`0","" | Set-Content $_.FullName
    }
The way you have this structured, the entire file contents have to be read into memory. Note that reading a file into memory uses roughly 3-4x the file size in RAM, as documented here.
Without getting into .NET classes, particularly [System.IO.StreamReader], Get-Content is actually very memory efficient; you just have to leverage the pipeline so you don't build up the data in memory.
Note: if you do decide to try StreamReader, the article will give you some syntax clues. The topic has also been covered by many others on the web.
Get-ChildItem -Path "C:\temp" -Depth 2 -Filter *.csv |
    ForEach-Object{
        $CurrentFile = $_
        $TmpFilePath = Join-Path $CurrentFile.Directory.FullName ($CurrentFile.BaseName + "_New" + $CurrentFile.Extension)

        Get-Content $CurrentFile.FullName |
            ForEach-Object{ $_ -replace "`0","" } |
            Add-Content $TmpFilePath

        # Now that you've got the new file you can rename it & delete the original:
        Remove-Item -Path $CurrentFile.FullName
        Rename-Item -Path $TmpFilePath -NewName $CurrentFile.Name
    }
This is a streaming model: Get-Content streams inside the outer ForEach-Object loop. There may be other ways to do it, but I chose this one so I could keep track of the names and do the file swap at the end.
Note: Per the same article, in terms of speed Get-Content is quite slow; however, your original code was likely already suffering that burden. You can speed it up a bit using the -ReadCount XXXX parameter, which sends some number of lines down the pipe at a time. That of course uses more memory, so you'd have to find a level that keeps you within the boundaries of your available RAM. The performance improvement with -ReadCount is mentioned in this answer's comments.
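As a rough sketch of that tweak (the batch size of 1000 is arbitrary; tune it to your RAM), the inner pipeline above could become:

# With -ReadCount, Get-Content emits arrays of lines instead of single
# lines, which cuts down pipeline overhead. -replace works element-wise
# on an array, so the replacement logic is unchanged.
Get-Content $CurrentFile.FullName -ReadCount 1000 |
    ForEach-Object{ $_ -replace "`0","" } |
    Add-Content $TmpFilePath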
Update Based on Comments:
Here's an example of using StreamReader/Writer to perform the same operations from the previous example. This should be just as memory efficient as Get-Content, but should be much faster.
Get-ChildItem -Path "C:\temp" -Depth 2 -Filter *.csv |
    ForEach-Object{
        $CurrentFile = $_.FullName
        $CurrentName = $_.Name
        $TmpFilePath = Join-Path $_.Directory.FullName ($_.BaseName + "_New" + $_.Extension)

        $StreamReader = [System.IO.StreamReader]::new( $CurrentFile )
        $StreamWriter = [System.IO.StreamWriter]::new( $TmpFilePath )

        while( !$StreamReader.EndOfStream )
        {
            $StreamWriter.WriteLine( ($StreamReader.ReadLine() -replace "`0","") )
        }

        $StreamReader.Close()
        $StreamWriter.Close()

        # Now that you've got the new file you can rename it & delete the original:
        Remove-Item -Path $CurrentFile
        Rename-Item -Path $TmpFilePath -NewName $CurrentName
    }
Note: I have some sense this issue is rooted in encoding. The stream constructors do accept an encoding object as an argument.
Available Encodings:
[System.Text.Encoding]::BigEndianUnicode
[System.Text.Encoding]::Default
[System.Text.Encoding]::Unicode
[System.Text.Encoding]::UTF32
[System.Text.Encoding]::UTF7
[System.Text.Encoding]::UTF8
So if you wanted to instantiate the streams with, for example, UTF8:
$StreamReader = [System.IO.StreamReader]::new( $CurrentFile, [System.Text.Encoding]::UTF8 )
$StreamWriter = [System.IO.StreamWriter]::new( $TmpFilePath, [System.Text.Encoding]::UTF8 )
The streams do default to UTF8. I think the system default is typically code page Windows-1252.
This would be the simplest way to do it using the least memory, one line at a time, writing to another file. But it needs double the disk space.
Get-Content file.txt | ForEach-Object { $_ -replace "`0" } | Set-Content file2.txt
I am getting CSV files (with no header) from another system. The file ends at the last line of data (there is no trailing newline). When I try Import-Csv, it will not read the last line of the file.
I do not have the ability to have the input file changed to include the newline.
I have noticed that Get-Content doesn't have a problem reading the entire file, but then it isn't a CSV and I'm unable to reference the fields in the file.
Currently I'm doing:
$w = Import-CSV -path c:\temp\input.txt -header 'head1', 'head2', 'head3'
This will not read the last line of the file
This reads the entire file:
$w = Get-Content -path c:\temp\input.txt
But then I'm unable to reference the fields like $w.head1.
Is there a way to get Import-CSV to read the file including the last line?
OR Is there a way to read in the data using Get-Content, adding a header to it and then converting it back to a CSV?
I've tried using ConvertTo-Csv but have not had success:
$w = Get-Content -path c:\temp\input.txt
$csvdata = $w | ConvertTo-CSV # No header option for this function
I'd rather not create an intermediate file unless absolutely necessary.
You're very close! What you're after is not ConvertTo-Csv; you already have the file contents in CSV format, after all. Change that to ConvertFrom-Csv instead, which does support a -Header parameter. Something like this:
$w = Get-Content -path c:\temp\input.txt
$csvdata = $w | ConvertFrom-Csv -Header 'head1', 'head2', 'head3'
If I understand correctly, you know the number of columns in the file and all that is missing is a header line. Since your code does not specify a -Delimiter parameter, I'm assuming the delimiter character used in the file is a comma.
Best thing to do IMHO is to create a new output file and always keep the original.
$fileIn = 'c:\temp\input.txt'
$fileOut = 'c:\temp\input.csv'
# write the header line to a new file
Set-Content -Path $fileOut -Value 'head1,head2,head3'
# read the original file and append it to the one you have just created
Get-Content -Path $fileIn -Raw | Add-Content -Path $fileOut
If your file is really large, below is a faster alternative:
$fileIn = 'c:\temp\input.txt'
$fileOut = 'c:\temp\input.csv'
# write the header line to a new file
Set-Content -Path $fileOut -Value 'head1,head2,head3'
# read the original file and append it to the one you have just created
[System.IO.File]::AppendAllText($fileOut, ([System.IO.File]::ReadAllText($fileIn)))
If you really do want to take the risk and overwrite the original file, you can do this:
$file = 'c:\temp\input.txt'
$content = Get-Content -Path $file -Raw
# write the header line to the file, destroying what was in there
Set-Content -Path $file -Value 'head1,head2,head3'
# append the original content to it
$content | Add-Content -Path $file
I want to preserve Emojis with Get-Content.
When I pull the string from the feed I get the following result:
$WebResponse = Invoke-RestMethod $website
$str_outputNAME = $feed.title
Wanna try😉?
But when I save the content of the file and append it afterwards, I get the following result:
$content = (Get-Content -Path $file) -join "`n"
$toWrite = $top_line+$toWrite+$content
$toWrite | Out-File -FilePath $file;
Wanna try???
Background info: I want to use PowerShell to read an RSS feed. Therefore I need to prepend a string to the start of my CSV file on update.
Because my question was regarding *.csv files, I found that a better way is to use
$content = Import-Csv -Path $file
instead of
$content = Get-Content -Path $file
Now all my emojis are preserved in the file, but the script takes twice as long to run.
I had tried all the possible Get-Content -Encoding arguments, but without luck; every attempt still mangled the emojis.
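For reference, a likely culprit in Windows PowerShell 5.1 is reading without declaring an encoding: Get-Content defaults to the system ANSI code page, which has no representation for emoji, so they decode as ?. A sketch that declares UTF-8 on both the read and the write side (assuming the source file really is UTF-8; variable names follow the snippet above). Note that if an earlier run already saved the file in a lossy encoding, the emoji are gone for good and no re-read can recover them.

# Read with UTF-8 so multi-byte emoji are decoded correctly...
$content = (Get-Content -Path $file -Encoding UTF8) -join "`n"
$toWrite = $top_line + $toWrite + $content
# ...and write with UTF-8 so they are encoded correctly on the way out.
$toWrite | Out-File -FilePath $file -Encoding utf8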