PowerShell generates .bat and puts special characters - powershell

I'm currently working with PowerShell in order to create a .bat script.
I put text in the .bat script with >>
For example,
Write "start program xxx" >> script.bat
but when I try to execute this script.bat with cmd, it says:
"■s" is not recognized ... etc.
And in PowerShell it says: 'þp' is not recognized ...
So I guess that writing with >> puts special characters at the beginning of the line. Does anyone have information on this, and on what those "■s" and 'þp' are?

The file redirection operators (>> etc.) will write text encoded in UTF-16. If the file already contains text in a different encoding everything will be confused (and I'm not sure cmd.exe understands UTF-16 at all). Those stray characters are most likely the UTF-16 byte-order mark and the interleaved NUL bytes being misread as single-byte text.
It's easier to use Out-File with the -Encoding parameter to specify something consistent. Use the -Append switch parameter to append rather than overwrite.
Eg.
"Some text" | Out-File -encoding ASCII -append -FilePath 'script.bat`
(If you find yourself writing the same Out-File call with the same parameters over and over, put it in a helper advanced function that reads pipeline input to encapsulate the Out-File, as sketched below.)
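For instance, a minimal sketch of such a helper, assuming ASCII is the encoding you want throughout (the name Out-BatchFile is made up for illustration):

function Out-BatchFile {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]$FilePath,

        [Parameter(ValueFromPipeline)]
        [string]$Line
    )
    process {
        # Append each pipeline item with one consistent, cmd.exe-friendly encoding
        $Line | Out-File -FilePath $FilePath -Encoding ASCII -Append
    }
}

"start program xxx" | Out-BatchFile -FilePath 'script.bat'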

Related

Powershell: Out-Printer, unicode and font sizing

powershell "get-content foo.txt|Out-Printer"
As long as foo.txt is English, everything is fine (well, mostly).
If foo.txt contains unicode characters e.g. राष्ट्र then what gets printed is stuff like °à¥€ ब
I tried passing the -Encoding option to get-content but it did not change the result.
Is it possible to ensure that Unicode text gets printed properly without launching Word/IE etc. in the background to print it?
My second question is:
Is it possible to control which font (type and size) is used for printing by Out-Printer?
In my environment (Windows 10 2004 Build 19041.985, Japanese locale), I got the correct result with the following situation:
Save the .txt file in UTF-8 with BOM, and print with Get-Content .\foo.txt | Out-Printer
Save the .txt file in UTF-8 without BOM, and print with Get-Content .\foo.txt -Encoding UTF8 | Out-Printer
I got the incorrect result (like 爨ー爨セ爨キ爭財、游・財、ー) with the following situation:
Save the .txt file in UTF-8 without BOM, and print with Get-Content .\foo.txt | Out-Printer
So it looks like an encoding problem. Please check what @RavenKnit said first.
Is it possible to ensure that Unicode text gets printed properly without launching Word/IE etc. in the background to print it?
I couldn't find a way to do it with Get-PrinterPort, Get-WmiObject Win32_Printer, prnmngr.vbs or prnqctl.vbs. If you just don't want to show a window while you print the content of a file, you can use the Print verb. It runs notepad.exe for .txt files, winword.exe for .docx files, etc.
Start-Process .\foo.txt -Verb Print -WindowStyle Hidden -Wait
Is it possible to control which font (type and size) is used for printing by Out-Printer?
According to the source of Out-Printer, the default font is embedded in the .resx file. So it looks like you cannot control the default font.
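If you do need font control, one possible workaround is to bypass Out-Printer and print through .NET's PrintDocument class directly. This is only a rough sketch: the font name and size are arbitrary placeholder choices, and this naive handler prints a single page with no paging logic.

Add-Type -AssemblyName System.Drawing
$text = Get-Content .\foo.txt -Raw
$doc = New-Object System.Drawing.Printing.PrintDocument
$doc.add_PrintPage({
    param($sender, $e)
    # Draw the text with an explicit font; pick one that covers your script
    $font = New-Object System.Drawing.Font('MS Gothic', 12)
    $e.Graphics.DrawString($text, $font, [System.Drawing.Brushes]::Black, $e.MarginBounds.X, $e.MarginBounds.Y)
})
$doc.Print()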

Powershell Out-file special characters

I have a script that processes data from files and writes result based on a condition to txt. Given data are strings with words like: "Distribución" or "México". When processed, those special characters like "é" and "ó" are broken (typical white square or question mark).
How can I encode the output file to make it work with those characters? I tried encoding in UTF8, and UTF8 without BOM; it doesn't work. Here is the file-writing line:
...| Out-file -encoding XXX .\result.txt
For XXX I tried ASCII and UTF8; nothing works :/
Out-File will always add a BOM. It's a particularly annoying "feature" of that cmdlet. Unfortunately, to my knowledge, there is no quick way to save a file using UTF-8 WITHOUT a BOM in PowerShell. You can, however, leverage .NET to do this. This isn't really production ready, but here's a quick example:
$outputPath = "D:\temp.txt"
$data = "Distribución or México"
# WriteAllLines writes UTF-8 without a BOM by default
[System.IO.File]::WriteAllLines($outputPath, $data)
Wrap it in a cmdlet, function and/or module to make it reusable. Of course you can take more control over the file encoding with .NET too.
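For example, a quick reusable wrapper along those lines (the function name is made up; note that .NET resolves relative paths against its own working directory, so pass a full path):

function Write-Utf8NoBom {
    param(
        [Parameter(Mandatory)]
        [string]$Path,

        [Parameter(Mandatory)]
        [string[]]$Content
    )
    # UTF8Encoding with $false suppresses the BOM explicitly
    $encoding = New-Object System.Text.UTF8Encoding($false)
    [System.IO.File]::WriteAllLines($Path, $Content, $encoding)
}

Write-Utf8NoBom -Path "D:\temp.txt" -Content "Distribución or México"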

Multiple powershell scripts outputting to the same text file

I have a set of PS scripts, which I will call parents, that invoke other PS scripts, which I will call children. They all need to write to the same text file, which I will call myoutputfile.txt. The first output of the parent script should clear myoutputfile.txt. All subsequent child PS scripts should append to myoutputfile.txt, including another parent script later in the logic.
Using Out-File, the parent PS script opens and locks the file myoutputfile.txt. The children PS scripts need to append to the file, but because the parent created the text file, the children PS scripts fail silently to append to it.
I have tried Out-File and Add-Content. Add-Content puts out Chinese to the text file.
Either Out-File with append or Add-Content would work.
Unless you are actually trying to write Chinese, the encoding could be wrong.
$File = "PathTo\test.txt"
Get-Date | Out-File $File -Encoding ascii               # creates the file with ASCII bytes
"TEXT" | Out-File $File -Append -Encoding utf32         # appends UTF-32 bytes
Add-Content -Path $File -Value "END" -Encoding Unicode  # appends UTF-16 LE bytes
As you can see in the resulting file, the output varies depending on the encoding. If you're getting Chinese characters, my guess is that the raw bytes are being misinterpreted; for example, ASCII or UTF-8 text read back as UTF-16 typically renders as CJK characters. That points to conversion problems earlier on.
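A minimal sketch of the consistent version, assuming ASCII content is acceptable (the parent overwrites once, every child appends with the same encoding):

# parent.ps1 - the first write clears/creates the file
$File = "PathTo\myoutputfile.txt"
"parent start" | Out-File $File -Encoding ascii

# child.ps1 - every later writer appends with the SAME encoding
"child output" | Out-File $File -Append -Encoding ascii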
It took a combo of Peter Schneider's comment and Martin van Delft's answer to get it working.

Cannot write help text to a file in PowerShell

I was trying to write help text to a file with
Set-Content -path "help.txt" -Value $(help -Full "help")
Then I found that the help cmdlet generates an object rather than text.
But simply adding toString() at the end does not work either.
So how can I get clean text from help command and write it to file using Set-Content?
In order to capture output as it would print on the screen, use either the output redirection operator >, or pipe to the Out-File cmdlet; the latter is required if you want to use an output character encoding other than the default, UTF-16 LE:
help -full help > help.txt # invariably creates a UTF-16 LE file
help -full help | Out-File help.txt # equivalent, but supports -Encoding <name>
By contrast, Set-Content:
does not use PowerShell's default output formatting; instead, it applies (at least conceptually) a .ToString() call to each input object, which may or may not give a meaningful representation.
creates ASCII files by default, but, like Out-File, it supports different encodings via the -Encoding parameter.
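If you do want to stick with Set-Content, one workaround (a sketch, not the only way) is to render the help object to screen-formatted text first with Out-String, which applies the default formatting that Set-Content skips:

help -Full help | Out-String | Set-Content help.txt -Encoding UTF8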

Override Powershell > shortcut

In Powershell using > is the same as using | Out-File, so I can write
"something" > file.txt and It will write 'something' into file.txt . This is what I expect of a shell. Unfortunately, Powershell uses Unicode for writing file.txt. The only way to change it into UTF-8 is to write the quite long command:
"something" | Out-File file.txt -Encoding UTF8
I want to override the > shortcut, so that it adds the UTF-8 encoding by default. Is there a way to do that?
NOT A DUPLICATE CLARIFICATION:
This is not a duplicate. As is explained clearly here, Out-File has a hard-coded default. I don't want to change Out-File's behavior, I want to change >'s behavior.
No, can't be done
Even the documentation alludes to this.
From the last paragraph of Get-Help about_Redirection:
When you are writing to files, the redirection operators use Unicode encoding. If the file has a different encoding, the output might not be formatted correctly. To redirect content to non-Unicode files, use the Out-File cmdlet with its Encoding parameter.
(emphasis added)
The output encoding can be overridden by changing the $OutputEncoding variable. However, that only works for piping output into executables; it doesn't work for redirection operators. If you need a specific encoding for file output you must use Out-File or Set-Content with the -Encoding parameter (or a StreamWriter).
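If the length of the Out-File command is the main pain point, a small wrapper function at least shortens it, even though it cannot replace the > operator itself (the function name here is invented):

function of8 {
    param([Parameter(Mandatory)][string]$Path)
    # $input holds everything piped into the function; write it out as UTF-8
    $input | Out-File -FilePath $Path -Encoding UTF8
}

"something" | of8 file.txt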