Convert bat file to PowerShell script

I have a simple .bat file that executes an exe file and also passes some parameters in a txt file. How do I achieve the same in a PowerShell script (.ps1 file)?
.bat file content:
@echo on
C:\Windows\System32\cmd.exe /C "C:\Program Files\BMC Software\AtriumCore\cmdb\server64\bin\cmdbdiag.exe" -u test -p test -s remedyar -t 41900 < "C:\Program Files\BMC Software\ARSystem\diserver\data-integration\batch\CleanupInputs.txt" > "C:\Program Files\BMC Software\ARSystem\diserver\data-integration\batch\Snow_Output\DailyOutput.log"
Exit 0

Fundamentally, you invoke console applications the same way in PowerShell as you do in cmd.exe, but there are important differences:
# If you really want to emulate `@echo ON` - see comments below.
Set-PSDebug -Trace 1
# * PowerShell doesn't support `<` for *input* redirection, so you must
# use Get-Content to *pipe* a file's content to another command.
# * `>` for *output* redirection *is* supported, but beware encoding problems:
# * Windows PowerShell creates a "Unicode" (UTF-16LE) file,
# * PowerShell (Core, v6+) a BOM-less UTF-8 file.
# * To control the encoding, pipe to Out-File / Set-Content with -Encoding
# * For syntactic reasons, because your executable path is *quoted*, you must
# invoke it via `&`, the call operator.
Get-Content "C:\..\CleanupInputs.txt" |
& "C:\...\cmdbdiag.exe" -u test -p test -s remedyar -t 41900 > "C:\...\DailyOutput.log"
# Turn tracing back off.
Set-PSDebug -Trace 0
exit 0
Note:
For brevity I've replaced the long directory paths in your command with ...
Character-encoding caveats:
When PowerShell communicates with external programs, it only "speaks text" (as of v7.2, it never passes raw bytes through its pipelines), which potentially involves multiple rounds of encoding and decoding strings; specifically:
Get-Content doesn't just pass a text file's raw bytes through; it decodes the content into .NET strings and then sends the content line by line through the pipeline. If the input file lacks a BOM, Windows PowerShell assumes the active ANSI encoding, whereas PowerShell (Core) 7+ assumes UTF-8; you can use the -Encoding parameter to specify the encoding explicitly.
Since the receiving command is an external program (executable), PowerShell (re-)encodes the lines before sending them to the program, based on the $OutputEncoding preference variable, which defaults to ASCII(!) in Windows PowerShell and UTF-8 in PowerShell (Core) 7+.
Since > - effectively an alias for Out-File - is used to redirect the external program's output to a file, another round of decoding and encoding happens:
PowerShell first decodes the external program's output into .NET strings, based on the character encoding stored in [Console]::OutputEncoding, which defaults to the system's active OEM code page.
Then Out-File encodes the decoded strings based on its default encoding, which is UTF-16LE ("Unicode") in Windows PowerShell, and BOM-less UTF-8 in PowerShell (Core); to control the encoding, you need to use Out-File (or Set-Content) explicitly and use its -Encoding parameter.
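To make these choices explicit rather than relying on the defaults, you can control each stage yourself. A minimal sketch, assuming UTF-8 is the desired encoding throughout (an assumption, not part of the original command; paths abbreviated as before):
# Make PowerShell *send* UTF-8 to the external program's stdin...
$OutputEncoding = [System.Text.UTF8Encoding]::new($false)
# ...and *decode* the program's stdout as UTF-8 as well.
[Console]::OutputEncoding = [System.Text.UTF8Encoding]::new($false)
Get-Content -Encoding utf8 "C:\...\CleanupInputs.txt" |
  & "C:\...\cmdbdiag.exe" -u test -p test -s remedyar -t 41900 |
  Out-File -Encoding utf8 "C:\...\DailyOutput.log"  # note: UTF-8 *with* BOM in Windows PowerShell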
See also:
about_Redirection
&, the call operator
This answer discusses default encodings in both PowerShell editions; the short of it: they vary wildly in Windows PowerShell, but PowerShell (Core) 7+ now consistently uses BOM-less UTF-8.
Re execution tracing: use of @echo ON in your batch file and how it compares to PowerShell's
Set-PSDebug -Trace 1:
Batch files typically run with @echo OFF so that each command is not itself echoed before its output is printed.
@echo ON (or omitting an @echo ON/OFF statement altogether) can be helpful for diagnosing problems during execution, however.
Set-PSDebug -Trace 1 is similar to @echo ON, but it has one disadvantage: the raw source code of commands is echoed, which means that you won't see the values of embedded variable references and expressions - see this answer for more information.
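To see this limitation in action (the variable name below is made up for the demo):
Set-PSDebug -Trace 1
$name = 'world'
"Hello, $name"   # the trace shows the literal source line, with $name unexpanded
Set-PSDebug -Trace 0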

Related

Powershell - Write-Output produces string with a BOM character

I'm trying to execute the following command in a PowerShell script:
Write-Output "Some Command" | some-application
When running the PS script, some-application receives the string \xef\xbb\xbfSome Command. The first character is a UTF-8 BOM. All solutions that I can Google apply only to redirecting output to a file. But I'm trying to redirect a string to another command (via a pipe).
The $OutputEncoding variable shows that ASCII is configured; no UTF-8 is set.
I'm running this script from Azure DevOps Pipeline and only there this problem exists.
Note: This answer deals with how to control the encoding that PowerShell uses when data is piped to an external program (to be read via stdin by such a program).
This is separate from:
what character encoding PowerShell cmdlets use by default on output - for more information, see this answer.
what character encoding PowerShell uses when reading data received from external programs - for more information, see this answer.
The implication is that you've mistakenly set the $OutputEncoding preference variable, which determines what character encoding PowerShell uses to send data to external programs, to a UTF-8 encoding with a BOM.
Your problem goes away if you assign a BOM-less UTF-8 encoding instead:
$OutputEncoding = [System.Text.Utf8Encoding]::new($false) # BOM-less
"Some Command" | some-application
Note that this problem wouldn't arise in PowerShell [Core] v6+, where $OutputEncoding defaults to BOM-less UTF-8 (in Windows PowerShell, unfortunately, it defaults to ASCII).
To illustrate the problem:
$OutputEncoding = [System.Text.Encoding]::Utf8 # *with* BOM
"Some Command" | findstr .
The above outputs ∩╗┐Some Command, where ∩╗┐ is the rendering of the 3-byte UTF-8 BOM (0xEF, 0xBB, 0xBF) in OEM code page 437 (on US-English systems, for instance).

Broken Cmd scripts after creation with PowerShell Write-Output

We have a domain-wide automation tool that can start jobs on servers as admin (Stonebranch UAC - please note: this has nothing to do with Windows "User Access Control"; Stonebranch UAC is an enterprise automation tool). Natively, it looks for Cmd batch scripts, so we use those.
However, I prefer to use PowerShell for everything, so I bulk created dozens of .bat scripts using PowerShell. Nothing worked, the automation tool broke whenever it tried to run the .bat scripts. So I pared back the scripts so that they contained a single line echo 123 and still, everything was broken. We thought it was a problem with the tool, but then tried to run the .bat scripts on the server and they were broken too, just generating some unicode on the command line and failing to run.
So it dawned on us that something about how PowerShell pumps Write-Output content into the batch scripts was breaking them (this is on Windows 2012 R2 and PowerShell is 5.1). And I can reproduce this; for example, if I type the following in a PowerShell console:
Write-Output "echo 123" > test.bat
If I now open a cmd.exe and then try to run test.bat, I just get a splat of 2 unicode-looking characters on the screen and nothing else.
Can someone explain to me a) why this behaviour happens, and b) how can I continue to use PowerShell to generate these batch scripts without them being broken? i.e. do I have to change BOM or UTF-8 settings or whatever to get this working and how do I do that please?
In Windows PowerShell, >, like the underlying Out-File cmdlet, invariably[1] creates "Unicode" (UTF-16LE) files, which cmd.exe cannot read (not even with the /U switch).
In PowerShell [Core] v6+, BOM-less UTF-8 encoding is consistently used, including by >.
Therefore:
If you're using PowerShell [Core] v6+ AND the content of the batch file comprises ASCII-range characters only (7-bit range), you can get away with >.
Otherwise, use Set-Content with -Encoding Oem.
'@echo 123' | Set-Content -Encoding Oem test.bat
If your batch-file source code only ever contains ASCII-range characters (7-bit range), you can also get away with (in both PowerShell editions):
'@echo 123' | Set-Content test.bat
Note:
As the -Encoding argument implies, the system's active OEM code page is used, which is what batch files expect.
OEM code pages are supersets of ASCII encoding, so a file saved with -Encoding Oem that is composed only of ASCII-range characters is implicitly also an ASCII file. The same applies to BOM-less UTF-8 and ANSI (Default) encoded files composed of ASCII-range characters only.
-Encoding Oem - as opposed to -Encoding Ascii or even using Set-Content's default encoding[2] - therefore only matters if you have non-ASCII-range characters in your batch file's source code, such as é. Such characters, however, are limited to a set of 256 characters in total, given that OEM code pages are fixed-width single-byte encodings, which means that many Unicode characters are inherently unusable, such as €.
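To illustrate - assuming a US-English system whose active OEM code page is CP437, and a hypothetical demo.bat:
'@echo é' | Set-Content -Encoding Oem demo.bat   # é exists in CP437 (0x82) and round-trips fine
'@echo €' | Set-Content -Encoding Oem demo.bat   # € has no CP437 representation; the encoder substitutes '?'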
[1] In Windows PowerShell v5.1 (and above), it is possible to change >'s encoding via the $PSDefaultParameterValues preference variable - see this answer - however, you won't be able to select a BOM-less UTF-8 encoding, which would be needed for creation of batch files (composed of ASCII-range characters only).
[2] Set-Content's default encoding is the active ANSI code page (Default) in Windows PowerShell (another ASCII superset), and (as for all cmdlets) BOM-less UTF-8 in PowerShell [Core] v6+; for an overview of the wildly inconsistent character encodings in Windows PowerShell, see this answer.
A method that by default creates BOM-less UTF-8 files is:
New-Item -Path outfile.bat -ItemType File -Value "echo this"

Getting stdin into the Powershell stream

The following script works well when the filename is specified on the command line.
tail.bat
@echo off
set "COUNT=%1"
set "COUNT=%COUNT:-=%"
set "FILENAME=%~2"
powershell "Get-Content %FILENAME% -Last %COUNT%"
However, what I need is to be able to pipe the text into Get-Content from stdin. I would like to write the following to get the last three Subversion tags assigned to the project. What can I do to make Get-Content read from stdin?
svn ls svn://ahost/arepo/aproject/tags | call tail.bat -3
NB: I am not permitted to install any helpful tools like tail from the outside. It has to be done with the programs already available on the machine.
Update:
@mklement0 provided the answer. From that, I added code to use a default COUNT value of 10 if it is not provided. This matches the UNIX/Linux way.
@echo off
SET "COUNT=%~1"
IF "%COUNT:~0,1%" == "-" (
SET "COUNT=%COUNT:~1%"
SHIFT
) ELSE (
SET "COUNT=10"
)
SET "FILENAME=%~1"
if "%FILENAME%" == "" (
powershell -noprofile -command "$Input | Select-Object -Last %COUNT%"
) else (
powershell -noprofile -command "Get-Content \"%FILENAME%\" -Last %COUNT%"
)
EXIT /B
Rewrite tail.bat as follows:
@echo off
set "COUNT=%1"
set "COUNT=%COUNT:-=%"
set "FILENAME=%~2"
if "%FILENAME%"=="" (
powershell -noprofile -command "$Input | Select-Object -Last %COUNT%"
) else (
powershell -noprofile -command "Get-Content \"%FILENAME%\" -Last %COUNT%"
)
This will make the PowerShell CLI read stdin input via the automatic $input variable, if no filename argument was passed, courtesy of this answer.
Example:
C:\> (echo one & echo two & echo three) | tail.bat -2
two
three
Note:
While PowerShell's pipeline generally carries and outputs objects of any kind, its interface to the outside world invariably involves strings.
Thus, given that $Input is an enumerator that represents outside stdin input, we can be sure that it enumerates the input text lines (as strings) one by one, so all we need to do is select the lines of interest, which is why piping to Select-Object is sufficient.
By contrast, reading a file by name in PowerShell requires Get-Content (which, incidentally, also sends the input file's lines one by one through the pipeline, unless you also specify -Raw); since Get-Content has tail functionality built in, via parameter -Tail (and its alias -Last), it is all that is needed here.
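In other words, the two branches of the batch file correspond to the following PowerShell idioms (some.log and some-command are placeholders):
# Tail of a file, via Get-Content's built-in -Tail parameter (alias: -Last):
Get-Content .\some.log -Tail 10
# Tail of pipeline (stdin) input, via Select-Object:
some-command | Select-Object -Last 10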
CAVEAT: Character decoding on input and re-encoding on output is involved when PowerShell talks to the outside world:
If you're only ever dealing with ASCII-encoded input (single-byte characters with code points in the range 0 - 127), you needn't worry.
Otherwise, prepare for a world of pain - see below for details.
Character decoding/re-encoding issues:
Assuming that PowerShell recognizes your input encoding (see below), the output encoding is invariably what the console window's assigned encoding is; by default, unfortunately, that is the OEM codepage (e.g., the "DOS" code page CP437 on US-English systems), reflected in PS as [Console]::OutputEncoding.
Thus, with properly recognized input, if you print to the console, things will look OK, but if you capture the output in a file, you'll end up with an OEM-codepage-encoded file, which is probably undesired.
If feasible, you could fundamentally set up your console windows to use your codepage (input and output encoding) of choice (using chcp), but trying to change the encoding ad-hoc in your script is, unfortunately, not an option.
Note that using UTF-8 - codepage 65001 - only works if you configure your console windows to use one of the TT (TrueType) fonts.
As written above, the set of input encodings that are properly recognized is unfortunately limited to the following, based on the default input encoding (which is also the OEM codepage, reflected in PS as [Console]::InputEncoding; remember: input will be re-encoded on output):
ASCII input (re-encoding on output will by default preserve this encoding)
UTF-16 LE input with a BOM (which is what PowerShell calls Unicode, subject to re-encoding to something potentially different on output)
You could hard-code an expected input encoding by adding -Encoding <enc> to the Get-Content call (which expects the Windows default codepage encoding by default), but to do the same for stdin input (as reflected in $Input) would be non-trivial.
E.g., with the default input encoding, if you explicitly wanted to interpret the input as UTF-8 (again, note that on output [Console]::OutputEncoding encoding is applied):
powershell -noprofile -command "$Input | % { [text.encoding]::utf8.GetString([Console]::InputEncoding.GetBytes($_)) } | Select-Object -Last %COUNT%"

How to cat a UTF-8 (no BOM) file properly/globally in PowerShell? (to another file)

Create a file utf8.txt. Ensure the encoding is UTF-8 (no BOM). Set its content to €
In cmd.exe:
type utf8.txt > out.txt
Content of out.txt is €
In PowerShell (v4):
cat .\utf8.txt > out.txt
or
type .\utf8.txt > out.txt
Out.txt content is â‚¬
How do I globally make PowerShell work correctly?
Note: This answer is about Windows PowerShell (up to v5.1); PowerShell [Core, v6+], the cross-platform edition of PowerShell, now fortunately defaults to BOM-less UTF-8 on both in- and output.
Windows PowerShell, unlike the underlying .NET Framework[1], uses the following defaults:
on input: files without a BOM (byte-order mark) are assumed to be in the system's default encoding, which is the legacy Windows code page ("ANSI" code page: the active, culture-specific single-byte encoding, as configured via Control Panel).
on output: the > and >> redirection operators produce UTF-16 LE files by default (which do have - and need - a BOM).
File-consuming and -producing cmdlets do usually support an -Encoding parameter that lets you specify the encoding explicitly.
Prior to Windows PowerShell v5.1, using the underlying Out-File cmdlet explicitly was the only way to change the encoding.
In Windows PowerShell v5.1+, > and >> became effective aliases of Out-File, allowing you to change the encoding behavior of > and >> via the $PSDefaultParameterValues preference variable; e.g.:
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'
For Windows PowerShell to handle UTF-8 properly, you must specify it as both the input and output encoding[2], but note that on output, PowerShell invariably adds a BOM to UTF-8 files.
Applied to your example:
Get-Content -Encoding utf8 .\utf8.txt | Out-File -Encoding utf8 out.txt
To create a UTF-8 file without a BOM in PowerShell, see this answer of mine.
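For reference, a commonly used workaround - shown here only as a sketch, not necessarily identical to the linked answer - is to bypass Out-File and call the .NET API directly with a UTF8Encoding instance constructed with $false (no BOM):
$utf8NoBom = New-Object System.Text.UTF8Encoding($false)  # $false suppresses the BOM
$lines = @(Get-Content -Encoding utf8 .\utf8.txt)
# .NET resolves relative paths against its own working directory, not PowerShell's,
# so pass a full path.
[System.IO.File]::WriteAllLines("$PWD\out.txt", $lines, $utf8NoBom)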
[1] .NET Framework uses (BOM-less) UTF-8 by default, both for in- and output.
This - intentional - difference in behavior between Windows PowerShell and the framework it is built on is unusual. The difference went away in PowerShell [Core] v6+: both .NET [Core] and PowerShell [Core] default to BOM-less UTF-8.
[2] Get-Content does, however, automatically recognize UTF-8 files with a BOM.
For PowerShell 5.1, enable this setting:
Control Panel, Region, Administrative, Change system locale, "Use Unicode UTF-8 for worldwide language support"
Then enter this into PowerShell:
$PSDefaultParameterValues['*:Encoding'] = 'Default'
Alternatively, you can upgrade to PowerShell 6 or higher.
https://github.com/PowerShell/PowerShell

Execute batch file in Powershell

I want to execute the following from a batch file:
"C:\OpenCover\tools\OpenCover.Console.exe" -register:user -target:"%VS110COMNTOOLS%..\IDE\mstest.exe" -targetargs:"/testcontainer:\"C:\Develop\bin\Debug\MyUnitTests.dll\" ... "
PAUSE
Now I would like to log the output of the process to a file for which I came across the quite handy powershell usage of
powershell "dir | tee output.log"
but this does not take my batch file as first argument (powershell "my.bat | tee output.log") because it is not the name of a cmdlet or a function or a script file.
I could change my batch file so that it says powershell "OpenCover.Console.exe..." but I would have to adapt all the quotes and change escape characters and so forth.
Is there a way to make a batch file execute in powershell? Or is there a way to drop in my line unchanged from the batch after some powershell command and it all executes "like it ought to"?
Unless your batch file is in a folder in the %PATH%, PowerShell won't find it [1], so you'll have to supply an explicit file path (whether relative or absolute).
For instance, if the batch file is in the current folder, run:
powershell -c ".\my.bat | tee output.log"
Consider adding -noprofile to suppress loading of the profile files, which is typically only needed in interactive sessions.
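For example:
powershell -noprofile -c ".\my.bat | tee output.log"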
If your batch file path contains embedded spaces, enclose it in single quotes and prepend &:
powershell -c "& '.\my script.bat' | tee output.log"
Note: I've deliberately added the -c (short for: -Command) parameter name above; while powershell.exe - Windows PowerShell - defaults to this parameter, that is no longer true in PowerShell [Core] v6+ (whose executable name is pwsh), where -File is now the default - see about_PowerShell.exe and about_pwsh
[1] More accurately, PowerShell - unlike cmd.exe - will by design not execute scripts in the current folder by their mere filename (in an interactive PowerShell session you'll get a hint to that effect). This is a security feature designed to prevent accidental invocation of a different executable than intended.
Unless you have some purpose for doing so not stated in the OP, there isn't a reason to use both PowerShell and a batch script. If you want to do this solely from PS, you can create a PS script that does everything the batch file does.
To avoid the escaping issues (or alternatively to take advantage of CMD.EXE's somewhat strange escaping behavior :-) you can use --%, introduced in PS 3.0. It is documented under about_escape_characters.
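For example, applied to a call like the one in the question (abbreviated here), everything after --% is passed through as-is; the only processing that still occurs is expansion of cmd-style %VAR% references:
# Arguments after --% reach the program verbatim; %VS110COMNTOOLS% is still expanded.
& "C:\OpenCover\tools\OpenCover.Console.exe" --% -register:user -target:"%VS110COMNTOOLS%..\IDE\mstest.exe"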