My command line is this (powershell):
$7z ="`"c:\Program Files\7-Zip\7z.exe`""
&$7z a -r -ttar -bd -so . | &$7z a -r -txz -bd $archive -si
The produced archive file indeed contains a tar file, but that tar file is corrupt.
Note that breaking the pipe into two commands works correctly:
&$7z a -r -ttar -bd ${archive}.tmp .
&$7z a -r -txz -bd $archive ${archive}.tmp
The produced archive is perfectly valid.
So, what is wrong with my pipeline?
(I am using PowerShell)
Nothing is wrong with your pipeline; it is the way the pipeline works that's causing the error.
The PowerShell pipeline works asynchronously, meaning that the output of the first command is available to the second command immediately, one object at a time, even if the first command has not finished executing. See here.
Both Unix and PowerShell pipes operate in the same way in that respect. The difference you are seeing between Unix and PowerShell comes from how they pass the data along.
Unix passes raw text (byte streams) between commands, whereas a PowerShell pipe passes full-fledged .NET objects between commands. This difference in what gets passed between the commands is why it works on Unix and not in PowerShell: since 7z.exe cannot handle these .NET objects correctly, the resulting files become corrupt. See here.
Try adding | %{ "$_" } in between the pipes, like this:
&$7z a -r -ttar -bd -so . | %{ "$_" } | &$7z a -r -txz -bd $archive -si
The point is that the second call to 7z expects unmodified data on STDIN, but PowerShell is converting the output from the first call to 7z to (multiple) (string) objects. % is an alias for foreach-object, so what the additional command does is to loop over each object and convert it to a plain string before passing it on to the second call to 7z.
Edit: Reading through PowerShell’s Object Pipeline Corrupts Piped Binary Data it looks to me now as if my suggestion would not work, and there's also no way to fix it. Well, other than wrapping the whole pipeline into a cmd /c "..." call to make cmd and not PowerShell handle the pipeline.
Edit2: I also tried this solution from the PowerShell Cookbook, but it was very slow.
In the end, I created a .cmd script with the 7z pipes that I'm calling from my PowerShell script.
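For reference, a minimal sketch of what such a .cmd wrapper might look like (the file name tar-then-xz.cmd and the use of %1 for the output archive path are my own assumptions, not part of the original script; the 7z arguments are the ones from the question):
@echo off
rem Hypothetical wrapper: %1 is the output .tar.xz path; run it from the folder you want to pack.
"c:\Program Files\7-Zip\7z.exe" a -r -ttar -bd -so . | "c:\Program Files\7-Zip\7z.exe" a -r -txz -bd %1 -si
It can then be called from PowerShell as & .\tar-then-xz.cmd $archive, so that cmd.exe rather than PowerShell handles the binary stream between the two 7z processes.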
I'm using Kalles Fraktaler on Windows 10 to render images of the Mandelbrot set. Bundled with KF is a program that takes a single parameter file and breaks it into multiple tiles for easier rendering.
The output of the tiling program is multiple files with the following naming scheme: name-0000-0000.kfr, name-0000-0000.kfs, where the name can be anything and the numbers increment as needed.
The .kfr files are the parameter files.
The .kfs files are the settings files.
After I have these generated parameter and setting files, I can execute KF on the command line with the following arguments:
kf.exe -s name-0000-0000.kfs -l name-0000-0000.kfr -p name-0000-0000.png
Doing this for every pair of parameter and setting files works perfectly fine, taking the input files and saving the render to name-0000-0000.png
I asked the developer for an example PowerShell script to automate the process for when there are dozens or more of the files that need to be rendered, and this is what he gave me. The script needs to be run from the same directory as the files are stored.
Get-ChildItem "." -Filter *.kfr |
Foreach-Object {
$kfr = $_.FullName
$kfs = $kfr.replace("kfr", "kfs")
$png = $kfr.replace("kfr", "png")
C:/path/to/kf.exe -s $kfs -l $kfr -p $png
}
Unfortunately, I've tried every variation of this script that I could think of, and nothing gives me any results. I have already allowed unsigned scripts to be run on my computer. I would greatly appreciate some help on this.
(PowerShell is nice and flexible - but only when you use it to invoke PowerShell commands rather than running native executables. For example, to run a program in the current directory you need to prefix the program's name with ./ - ostensibly this is done for safety, and I assume for similarity to Unix shells, but it's the first in a long list of gotchas for anyone wanting to use PowerShell for tasks that would be trivial in old-school batch files.)
Anyway, you need to use Invoke-Command or Start-Process.
I've changed your script from using a piped expression into an easier-to-digest loop (and invoking .NET's Path.ChangeExtension directly because PowerShell's built-in string match-and-replace syntax is too arcane for me):
$kfrFiles = Get-ChildItem "." -Filter "*.kfr"
foreach ( $kfrFile in $kfrFiles ) {
    $kfr = $kfrFile.Name
    $kfs = [System.IO.Path]::ChangeExtension( $kfrFile.Name, "kfs" )
    $png = [System.IO.Path]::ChangeExtension( $kfrFile.Name, "png" )
    Start-Process -FilePath "C:\path\to\kf.exe" -ArgumentList "-s $kfs", "-l $kfr", "-p $png" -Wait
}
The -Wait option will wait for the kf.exe program to finish before starting the next instance - otherwise if you have hundreds of .kfr files then you'll end up with hundreds of kf.exe processes running concurrently.
I don't know off the top of my head how to allow concurrent processes while imposing a limit on the maximum number of concurrent processes in PowerShell. It is possible, just complicated.
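One crude approach (a sketch only, assuming the renderer is still kf.exe and therefore has the process name "kf") is to poll the number of running instances and wait before launching another:
$maxConcurrent = 4
$kfrFiles = Get-ChildItem "." -Filter "*.kfr"
foreach ( $kfrFile in $kfrFiles ) {
    # Wait until fewer than $maxConcurrent kf.exe processes are running
    while ( @(Get-Process -Name "kf" -ErrorAction SilentlyContinue).Count -ge $maxConcurrent ) {
        Start-Sleep -Seconds 2
    }
    $kfr = $kfrFile.Name
    $kfs = [System.IO.Path]::ChangeExtension( $kfrFile.Name, "kfs" )
    $png = [System.IO.Path]::ChangeExtension( $kfrFile.Name, "png" )
    Start-Process -FilePath "C:\path\to\kf.exe" -ArgumentList "-s $kfs", "-l $kfr", "-p $png"
}
# Optionally wait for the remaining instances to finish
Get-Process -Name "kf" -ErrorAction SilentlyContinue | Wait-Process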
I have a robust script which gets, parses, and uses some data from a .csv file. To run the script I can use
.\script.ps1 -d data_file.csv
The thing is, I cannot modify the script itself, which is why I need to create some kind of wrapper that builds a new CSV file and runs script.ps1 against that newly made file. I am wondering if it is possible to create the CSV file as an object that is passed directly to the command, like this
.\script.ps1 -d csv_file_as_object.csv
without creating a file in some directory on disk.
What you'd need in this case is the equivalent of Bash's process substitution (<(...)), which, in a nutshell, would allow you to present a command's output as the content of a temporary file whose path is output:
.\script.ps1 -d <(... | ConvertTo-Csv) # !! does NOT work in PowerShell
Note: ... | ConvertTo-Csv stands for whatever command is needed to transform the original CSV in memory.
No such feature exists in PowerShell as of Windows PowerShell v5.1 / PowerShell Core v6.1, but it has been proposed.
If .\script.ps1 happens to also accept stdin input (via the pseudo-path -, indicating stdin input), you could try:
... | ConvertTo-Csv | .\script.ps1 -d -
Otherwise, your only option is to:
save your modified CSV data to a temporary file
pass that temporary file's path to .\script.ps1
remove the temporary file afterwards (a minimal sketch follows).
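A minimal sketch of that temp-file wrapper, where the Import-Csv / Where-Object step is only a placeholder for whatever in-memory transformation you actually need (an assumption on my part):
$tmp = [System.IO.Path]::GetTempFileName()
try {
    Import-Csv data_file.csv |
        Where-Object { $_ } |               # placeholder for the real transformation
        Export-Csv $tmp -NoTypeInformation
    .\script.ps1 -d $tmp
}
finally {
    Remove-Item $tmp -ErrorAction SilentlyContinue
}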
I want to pass arbitrary scripts to PowerShell via stdin.
(In practice, I'd like to avoid having to put the script into a temporary file, but for the purposes of this question I will pipe the contents of a file to powershell.)
So I'm doing something like so (in this example, from a Windows cmd shell):
type myfile.txt | powershell -
It works if myfile.txt contains something like this:
1..3 | % { $_ *2 }
echo done
(It outputs 2\n4\n6\ndone.)
However, if I split this first statement across multiple lines like so, then PowerShell simply exits without generating any output at all:
1..3 |
% { $_ *2 }
echo done
This seems to fail for any multiline statement. For example, this also fails to produce output:
1..3 | % {
$_ *2 }
echo done
I'm surprised by this, since each is a legal PowerShell script that would work normally if placed into a .ps1 file and run as usual.
I've tried various things including escaping the EOL using line continuation chars, to no avail. The same effect occurs if the parent shell is PowerShell, or even Python (using subprocess.Popen with stdin=PIPE). In each case, PowerShell exits without any error, and the exit code is 0.
Interestingly, if I run the following, only "before.txt" gets created.
"before" | out-file before.txt
1..3 |
% { $_ *2 }
"after" | out-file after.txt
echo done
Any ideas why PowerShell would have trouble reading a multi-line command, if read from stdin?
I'm going to consider this answered by How to end a multi-line command in PowerShell, since it shows that an extra newline is required.
However, I'm going to raise this to MS as a bug since this should not be required when reading from a non-tty, or when -NonInteractive switch is specified.
Please vote on my bug report to the PowerShell team.
This is not a complete answer, but from what I can tell, the problem has to do with the input being sent in line by line.
To demonstrate the line-by-line issue, I invoke powershell this way:
powershell.exe -command "gc myfile.txt" | powershell.exe -
vs
powershell.exe -command "gc myfile.txt -raw" | powershell.exe -
The first example replicates what you see with type, the second reads the entire contents of the file, and it works as expected.
It also works from within PowerShell if you put the script contents in a string and pipe it into powershell.exe -.
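For instance, something along these lines (a minimal illustration using the same snippet from the question; per the behavior described above, the here-string reaches powershell.exe as a single block rather than line by line):
@'
1..3 |
% { $_ *2 }
echo done
'@ | powershell.exe -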
I had a theory that it had to do with the line-by-line input lacking line breaks, but it's not so clear cut. If that were the case, why would the first example work but not the second? (Removing the line break that splits the single pipeline should have no effect, while removing the line break between the pipeline and the echo should make it fail.) Maybe there's something unclear about the way PowerShell is handling the input with or without line breaks.
I'm a newbie to PowerShell. What's wrong with my script below? It's not wanting to emit the value of $config. However, when I wrap that command in double quotes, everything looks okay.
param($config, $logfolder)
# Must run log analysis in chronological order.
ls $logfolder | Sort-Object LastWriteTime | % {
perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile="$($_.FullName)" -config=$config update
}
# Execute with - .\regen-logs.ps1 webgenesis "C:\inetpub\logs\LogFiles\W3SVC5"
# Returns for each file - Error: Couldn't open config file "awstats.config.conf" nor "awstats.conf" after searching in path "D:\Websites\_awstats\wwwroot\cgi-bin,/etc/awstats,/usr/local/etc/awstats,/etc,/etc/opt/awstats": No such file or directory
As-is, what gets emitted and executed seems to have "-config=$config" passed as an argument. At least, that's my best guess. I don't know if $_ is working correctly either.
If I put quotes around the perl command like so, I get the command I do want to execute.
ls $logfolder | Sort-Object LastWriteTime | % {
"perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile=`"$($_.FullName)`" -config=$config update"
}
# Outputs for each log file something like - perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile="C:\inetpub\logs\LogFiles\W3SVC5\u_ex110602.log" -config=webgenesis update
If putting quotes around it produces the correct commandline, one way to execute the contents of a string is with Invoke-Expression (alias iex):
$v = "myexe -myarg1 -myarg2=$someVar"
iex $v
Put double quotes around "-config=$config". Without this, PowerShell will interpret -config=$config as one string argument that just happens to contain a $ sign in it.
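That is, inside the loop the call would look something like this (the same command as in the question, with only the -config argument quoted):
perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile="$($_.FullName)" "-config=$config" update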
I think you need to start your perl command out with & so that PowerShell interprets things as a command and not a string.
& perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile=`"$($_.FullName)`" -config=$config update
Also, see: Run a program in a foreach
I would like to translate the following Unix one-liner to PowerShell.
Synopsis of the command:
This command will search recursively from the PWD (present working directory) for any file with the extension .jsp, and look inside the file for a simple string match of 'logoutButtonForm'. If it finds a match, it will print the file name and the text that it matched.
find . -name "*.jsp" -exec grep -aH "logoutButtonForm" {} \;
I am new to PowerShell and have done some googling/binging but have not found a good answer yet.
ls . -r *.jsp | Select-String logoutButtonForm -case
I tend to prefer -Filter over -Include. Guess I never trusted the -Exclude/-Include parameters after observing buggy behavior in PowerShell 1.0. Also, -Filter is significantly faster than using -Include.
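For comparison, the same pipeline spelled out with full parameter names instead of aliases:
Get-ChildItem -Path . -Recurse -Filter *.jsp |
    Select-String -Pattern "logoutButtonForm" -CaseSensitive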