Powershell logging from Invoke-Expression with encoding - powershell

I have a specific scenario where I have to log a batch file's output using Invoke-Expression in PowerShell, but my logs are being saved with "UCS-2 Little Endian" encoding and I would like to save them as UTF-8 or any other encoding.
This is a simple example of what I'm trying to do:
batch file (test.bat):
echo Test
Powershell file (test.ps1):
Invoke-Expression "c:\test.bat > log.txt"
Is there a way I could change the encoding on log.txt?

You can try this:
C:\test.bat | Out-File C:\log.txt -Encoding UTF8
Or if for whatever reason you really have to use Invoke-Expression:
Invoke-Expression "C:\test.bat" | Out-File C:\log.txt -Encoding UTF8
Note that this will overwrite log.txt every time. If you want to append to the file, do this:
Invoke-Expression "C:\test.bat" | Out-File C:\log.txt -Encoding UTF8 -Append
or
Invoke-Expression "C:\test.bat" | Add-Content C:\log.txt -Encoding UTF8

Related

Syntax error in powershell command in windows

I'm trying to run two commands from a bat file using PowerShell. My goal is to convert a file to UTF-8 format. How can I achieve that?
Here is what I have so far:
PowerShell -Command (Get-Content 'ZipCode.csv' | Out-File 'ZipCode1.csv' -Encoding utf8)
I get the following error: "out-file is not recognized as an internal or external command"
Double quotes seem sufficient to escape the pipe; single quotes on the outside wouldn't work.
PowerShell "Get-Content ZipCode.csv | Out-File ZipCode1.csv -Encoding utf8"
If you're only using Out-File because your version of PowerShell doesn't include the -Encoding option with Set-Content, then it should read:
#"%__AppDir__%WindowsPowerShell\v1.0\powershell.exe" -NoProfile -Command "Get-Content -Path '.\ZipCode.csv' | Out-File -FilePath '.\ZipCode1.csv' -Encoding UTF8"
Obviously if you have a Version of PowerShell where Set-Content has the required -Encoding option, use it instead:
#"%__AppDir__%WindowsPowerShell\v1.0\powershell.exe" -NoProfile -Command "Get-Content -LiteralPath 'ZipCode.csv' | Set-Content -LiteralPath 'ZipCode1.csv' -Encoding UTF8"
These could obviously be shortened, trading robustness for aliases/shorthand:
@PowerShell -NoP "GC '.\ZipCode.csv'|Out-File '.\ZipCode1.csv' -E UTF8"
@PowerShell -NoP "GC -LP 'ZipCode.csv'|SC -LP 'ZipCode1.csv' -En UTF8"
I prefer to use -LiteralPath because I tend to use [] in my file names, and those brackets can be problematic. Change the output file name to ZipCode[1], then try the Set-Content version with -Path (or nothing) instead of -LiteralPath/-LP, and you should see what I mean.
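A quick way to see the difference, with a hypothetical bracketed file name:
# -Path treats [1] as a wildcard character class: 'ZipCode[1].csv' matches a file named
# ZipCode1.csv rather than the literal name, so the wrong file gets written (or the call fails)
Set-Content -Path '.\ZipCode[1].csv' -Value 'test'
# -LiteralPath takes the name exactly as typed, brackets and all
Set-Content -LiteralPath '.\ZipCode[1].csv' -Value 'test'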

Modify a JSON file with PowerShell without writing BOM

I need to modify an existing UTF8 encoded JSON file with PowerShell. I tried with the following code:
$fileContent = ConvertFrom-Json "$(Get-Content $filePath -Encoding UTF8)"
$fileContent.someProperty = "someValue"
$fileContent | ConvertTo-Json -Depth 999 | Out-File $filePath
This adds a BOM to the file and also encodes it as UTF-16. Is it possible to have ConvertFrom-Json and ConvertTo-Json not change the encoding or add a BOM?
This has nothing to do with ConvertTo-Json or ConvertFrom-Json. The encoding is determined by the output cmdlet: Out-File defaults to Unicode (UTF-16 LE) and Set-Content to ASCII (strictly, the system's default ANSI code page). With either of them the desired encoding can be specified explicitly:
... | Out-File $filePath -Encoding UTF8
or
... | Set-Content $filePath -Encoding UTF8
That will still write a (UTF8) BOM to the output file, but I wouldn't consider UTF-8 encoding without BOM a good practice anyway.
If you want ASCII-encoded output files (no BOM) replace UTF8 with Ascii:
... | Out-File $filePath -Encoding Ascii
or
... | Set-Content $filePath # Ascii is default encoding for Set-Content
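If you do need UTF-8 without a BOM in Windows PowerShell, -Encoding UTF8 won't give you that; one option is to call .NET directly. A sketch, reusing the variable names from the question:
$json = $fileContent | ConvertTo-Json -Depth 999
$utf8NoBom = New-Object System.Text.UTF8Encoding($false)  # $false = no BOM
# WriteAllText wants a full path, since .NET's current directory can differ from PowerShell's
[System.IO.File]::WriteAllText((Resolve-Path $filePath).Path, $json, $utf8NoBom)
(PowerShell 6+ can simply use -Encoding utf8NoBOM with Out-File or Set-Content.)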

Upload JES job using Powershell and FTP and download the response

I'm using this script to upload a JOB to the mainframe.
"open $FTPserver" | Out-File ftp.scr -Encoding ASCII
$FTPusername | Out-File ftp.scr -Encoding ASCII -Append
$FTPpassword | Out-File ftp.scr -Encoding ASCII -Append
"quote site filetype=jes" | Out-File ftp.scr -Encoding ASCII -Append
"put " + $FTPfile | Out-File ftp.scr -Encoding ASCII -Append
"quit" | Out-File ftp.scr -Encoding ASCII -Append
ftp.exe -s:ftp.scr
Remove-Item ftp.scr
The script works great, but I would like to save the response to a variable so that in the next step I can download the output of this JOB. This is the line of the response I'm targeting:
250-It is known to JES as JOB24503
Is there also a way to hide the FTP output text from the console?
To redirect output to a file use the > operator.
ftp.exe -s:ftp.scr > ftp.out
I'm not a PowerShell guru, but I do know that you can use the full .NET API, which gives you access to a very sophisticated runtime. Check out FtpWebRequest, which can be used to run FTP requests and pipe the output back to streams.
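To get the response into a variable instead of a file (which also keeps it off the console), you can capture ftp.exe's output directly. A sketch, with a regular expression matching the "known to JES as JOBnnnnn" line from the question:
$ftpOutput = ftp.exe -s:ftp.scr 2>&1   # captured, so nothing is printed to the console
$match = $ftpOutput | Select-String 'known to JES as (JOB\d+)' | Select-Object -First 1
if ($match) { $jobId = $match.Matches[0].Groups[1].Value }   # e.g. JOB24503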

tee with utf-8 encoding

I'm trying to tee a server's output to both the console and a file in Powershell 4. The file is ending up with a UTF-16 encoding, which is incompatible with some other tools I'm using. According to help tee -full:
Tee-Object uses Unicode encoding when it writes to files.
...
To specify the encoding, use the Out-File cmdlet
So tee doesn't support changing the encoding, and the help for both tee and Out-File doesn't show any examples of splitting a stream and encoding it with UTF-8.
Is there a simple way in Powershell 4 to tee (or otherwise split a stream) to a file with UTF-8 encoding?
One option is to use Add-Content or Set-Content instead of Out-File.
The *-Content cmdlets use ASCII encoding by default, and have a -PassThru switch so you can write to the file and then have the input pass through to the console:
Get-ChildItem -Name | Set-Content file.txt -PassThru
Alternatively, you could use Tee-Object's -Variable parameter and then write the variable out to a file in a separate step:
$data = $null
Get-Process | Tee-Object -Variable data
$data | Out-File -FilePath $path -Encoding UTF8
At first glance it seems easier to avoid tee altogether and just capture the output in a variable, then write it to the screen and to a file. But because of the way the pipeline works, this method allows a long-running pipeline to display data on screen as it goes along. Unfortunately the same cannot be said for the file, which won't be written until afterwards.
Doing Both
An alternative is to roll your own tee so to speak:
[String]::Empty | Out-File -FilePath $path -Encoding UTF8 # initialize the file since we're appending later
Get-Process | ForEach-Object {
    $_ | Out-File -FilePath $path -Append -Encoding UTF8  # write each object to the file
    $_                                                     # and pass it back to the pipeline
}
That will write to the file and back to the pipeline, and it will happen as it goes along. It's probably quite slow though.
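If you need this in more than one place, the same idea can be wrapped in a small filter (Tee-FileUtf8 is a made-up name, not a built-in cmdlet):
filter Tee-FileUtf8 ([string]$Path) {
    $_ | Out-File -FilePath $Path -Append -Encoding UTF8   # write each object to the file
    $_                                                      # and pass it along the pipeline
}
# usage, after initializing the file as above:
# Get-Process | Tee-FileUtf8 -Path $path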
Tee-Object seems to invoke Out-File, so this will make tee output UTF-8:
$PSDefaultParameterValues = @{'Out-File:Encoding' = 'utf8'}
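For example (a sketch; the output file name is arbitrary), with that preference set for the session, tee should write UTF-8 per the note above:
$PSDefaultParameterValues = @{'Out-File:Encoding' = 'utf8'}
Get-Process | Tee-Object -FilePath .\processes.txt   # UTF-8 (with a BOM in Windows PowerShell)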
First create the file using the appropriate flags, then append to it:
Set-Content out $null -Encoding Unicode
...
cmd1 | tee out -Append
...
cmdn | tee out -Append

UTF8 encoding without BOM - PowerShell

I have a bat file where I encode some CSV files. The problem is that there is one character at the beginning of the file once the encoding has been done (a BOM byte, I guess). This character bothers me because, after encoding, I use this file to generate a database.
Here is the line for encoding (inside bat file):
powershell -Command "&{ param($Path); (Get-Content $Path) | Out-File $Path -Encoding UTF8 }" CSVs\\pass.csv
Is there any way to encode the file without a BOM (if that is the problem)?
Thanks!
I found the solution.
Just replace the line with this:
powershell -Command "&{ param($Path); $Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding($False); $MyFile = Get-Content $Path; [System.IO.File]::WriteAllLines($Path, $MyFile, $Utf8NoBomEncoding) }" CSVs\\pass.csv