How to run a large base64 encoded file via PowerShell

I have a powershell.ps1 script that I base64 encoded as follows:
$Base64 = [System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes('c:\path\to\powershell.ps1'));
I stored this output in a base64.txt file.
I then tried to launch the script from CMD like this:
powershell.exe -EncodedCommand (Base64String)
But I got the following error:
Cannot process the command because the value specified with -EncodedCommand is not properly encoded. The value must be Base64 encoded.
I realized that CMD is not taking the entire (Base64String). The full length of my (Base64String) is 11,133 characters, but CMD accepts only 8,160 characters.
Is there any way or workaround to run this base64 encoded script?
Thanks in advance.

This worked for me (myscript.ps1 contains the base64 encoded command):
powershell -encodedcommand (Get-Content 'myscript.ps1' -Raw)
Which is very similar to what you would do in Bash:
$ powershell -encodedcommand `cat myscript.ps1`
Note: Addressing some comments, this is indeed needed sometimes. My particular use case was a reverse shell that had to dodge an AV on a Windows machine that was detecting my plaintext shell code.
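For reference, -EncodedCommand expects base64 of the command text encoded as UTF-16LE (Unicode), not the raw bytes of the .ps1 file, so the encoding step itself may also need adjusting. A minimal sketch, reusing the paths from the question:
# Produce base64 that -EncodedCommand will accept: encode the script text as UTF-16LE first
$scriptText = Get-Content -Path 'c:\path\to\powershell.ps1' -Raw
$bytes = [System.Text.Encoding]::Unicode.GetBytes($scriptText)
[System.Convert]::ToBase64String($bytes) | Set-Content -Path 'c:\path\to\base64.txt'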

Related

Running a self-decoding base64 PowerShell script locally with powershell -file /path/to/ps1

I want to keep my PowerShell scripts on my local server in base64, but have them self-decode when run from schtasks or locally using powershell -file /path/to/ps1. Is this possible?
I tried:
function Decode {
    $data = 'base 64 script'
    [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($data))
}
Decode
This does not work. Any ideas?
I see at least two options for this situation. One option is to send the base64 encoded command to Powershell.exe using the -EncodedCommand parameter. The second option is to create your decoding script to accept another script that contains the base64 encoded command as a parameter value.
Option 1: Passing the Encoded Command
This assumes your base64 encoded command is a string version of your PowerShell commands formatted using UTF-16LE character encoding (Unicode). Let's also assume that you have a script called Encoded.ps1 that contains your base64 encoded command. With the prerequisites met, you can do the following:
Powershell.exe -EncodedCommand (Get-Content Encoded.ps1)
Option 2: Running a Decode Script Against the Encoded Script
The Unicode requirement does not matter in this case (you can use ANSI if you like). You just need to know the original command string's encoding so you can decode it properly. We will assume the ASCII character set. Let's also assume that Encoded.ps1 contains your base64 encoded command.
First, let's create the decode script called Decode.ps1.
# Decode.ps1
param([string]$FilePath)
$64EncodedData = Get-Content $FilePath
$DecodedData = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($64EncodedData))
& ([scriptblock]::Create($DecodedData))
Second, let's run the Powershell.exe command to decode Encoded.ps1 and execute the decoded command.
Powershell.exe -File Decode.ps1 -FilePath Encoded.ps1
The code above is not intended to display the contents of the decoded commands but rather to execute them. $FilePath is the path to your Encoded.ps1 file, which contains a base64 encoded string produced from an ASCII encoded character set. You can change this to whichever encoding applies to your situation in Decode.ps1. $DecodedData contains the original command strings. Finally, a script block is created from $DecodedData and invoked with the call operator &.
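For completeness, a minimal sketch of producing Encoded.ps1 for Option 2 from an existing plain-text script (Original.ps1 is a hypothetical source file; adjust the encoding if your script is not ASCII):
# Create Encoded.ps1 for use with Decode.ps1 (ASCII assumed, as above)
$plain = Get-Content -Path .\Original.ps1 -Raw
[System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($plain)) | Set-Content -Path .\Encoded.ps1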

PowerShell base64 encoding gives different results

Below are two commands for the same purpose: base64 encoding a credential.
From the Windows command line:
powershell "[convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes(\"ATSxxx0101:urSY13sm\"))"
Result: QVRTVFNHMDEwMTp1clNZMTNzbQ==
From PowerShell:
[Convert]::ToBase64String([System.Text.Encoding]::Unicode.GetBytes('ATSxxx0101:urSY13sm'))
Result: QQBUAFMAVABTAEcAMAAxADAAMQA6AHUAcgBTAFkAMQAzAHMAbQA=
The result from the Windows command line works fine, but the result from PowerShell is wrong. However, my tool can only accept the PowerShell command; running the Windows command directly is not an option. Any ideas, experts?
The reason is that [Text.Encoding]::UTF8 is not the same as [System.Text.Encoding]::Unicode. The difference is not the missing System prefix but UTF8 versus Unicode.
Encoding.Unicode gets an encoding for the UTF-16 format using the little endian byte order.
Encoding.UTF8 gets an encoding for the UTF-8 format.
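If the tool only accepts the PowerShell form, a minimal sketch is to use the same UTF-8 encoding inside PowerShell, so both approaches yield the same string:
# Use UTF8 instead of Unicode so the PowerShell result matches the command-line result
[Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes('ATSxxx0101:urSY13sm'))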

Storing standard output from a native app with UTF-8 characters

I'm trying to capture the standard output of npm, which I run from PowerShell. npm downloads missing packages and outputs the corresponding tree.
What it looks like: (screenshot omitted)
That's the correct output.
When I try to do the same from PowerShell and capture the result, I don't get the same characters (screenshot omitted).
It's the same when I use
gpm install | Tee-Object -FilePath
or
gpm install | Out-File -FilePath .. -Encoding Unicode # or Utf8
or
$a = gpm install
When I redirect the output to a file in cmd.exe, the content looks like this: (screenshot omitted)
How can I capture the output correctly in PowerShell?
PowerShell is an object-based shell with an object-based pipeline, but for native applications the pipeline is byte-stream-based. So PowerShell has to convert from/to a byte stream when it passes data from/to a native application. This conversion happens even when you pipe data from one native application to another or redirect a native application's output to a file.
When PowerShell receives data from a native application, it decodes the byte stream as a string and splits that string on newline characters. To decode byte streams into strings, PowerShell uses the console's output encoding: [Console]::OutputEncoding. If you know that your application uses a different output encoding, you can explicitly change the console's output encoding to match your application's:
[Console]::OutputEncoding=[Text.Encoding]::UTF8
When PowerShell passes data to a native application, it converts objects to strings using the encoding specified in the $OutputEncoding preference variable.
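A minimal sketch applying this to the question's scenario (gpm is the command from the question; the output file name is an assumption):
# Decode the native command's UTF-8 output correctly, then restore the previous console encoding
$prev = [Console]::OutputEncoding
[Console]::OutputEncoding = [System.Text.Encoding]::UTF8
try {
    $a = gpm install                                    # captured as correctly decoded strings
    $a | Out-File -FilePath .\npm-output.txt -Encoding utf8
}
finally {
    [Console]::OutputEncoding = $prev
}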

Set encoding for a curl command in PowerShell

I have a curl command stored in a variable ($command). The command looks like this:
curl -F "login=xxx" -F "password=xxx" -F "title=Some title" -F "img=@/path/to/image" https://example.com/api/import
Then I execute the command:
Invoke-Expression ($command)
Everything is fine unless the title contains special characters like "č, š, ý ..." because the server expects UTF-8 encoded parameters. In that case the special characters are replaced with question marks on the website.
I tried setting [Console]::OutputEncoding and $OutputEncoding to UTF8, but it didn't solve the problem. When I run the command on Linux (Ubuntu) everything is fine, because it uses UTF-8 as the default encoding, so I rewrote the script in Bash to get the job done. But I'm still wondering whether it's possible in PowerShell somehow. Any suggestions appreciated.
Setting [Console]::OutputEncoding works for me. Are you sure you're setting it correctly?
C:\PS> [console]::OutputEncoding = [text.encoding]::UTF8
C:\PS> echoargs "title=č,š,ý"
Arg 0 is <title=č,š,ý>
Command line:
"C:\Program Files (x86)\PowerShell Community Extensions\Pscx3\Pscx\Apps\EchoArgs.exe" title=č,š,ý
EchoArgs is a tool from the PowerShell Community Extensions (PSCX).
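Applied to the question's scenario, a minimal sketch (assuming $command already holds the curl command from the question):
# Make the console use UTF-8 before running the stored curl command
[Console]::OutputEncoding = [System.Text.Encoding]::UTF8
Invoke-Expression $command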