While working through this tutorial on the command line, I executed the following commands in PowerShell, and the last one goes into an infinite loop.
echo "I am a new file." > ex15.txt
cat ex15.txt > another.txt
cat *.txt > bigfile.txt
After running the last command, the execution never ends; it goes into an infinite loop. But this works fine in the Command Prompt using the type command:
type *.txt > bigfile.txt
This command doesn't go into an infinite loop; it works perfectly. Why isn't this working in PowerShell?
cat (an alias for Get-Content) in PowerShell works differently from the type command in cmd.
type will not see bigfile.txt while it is being written, but Get-Content will, so you end up reading bigfile.txt and writing to the same file, and the command gets stuck in a loop.
To prevent the loop, you can force cat to finish reading all the files before it writes by wrapping the cat expression in parens:
(cat *.txt) > bigfile.txt
As mjolinor explained, Get-Content is reading the data you're adding to bigfile.txt just as fast as it is being written, resulting in a command that never ends.
The easiest solution is to put bigfile.txt into a different directory, so it isn't one of the files you are reading, e.g. the parent directory:
cat *.txt > ..\bigfile.txt
Another variant would be to exclude 'bigfile.txt' from the files in the current path that you want to read:
cat (ls *.txt -Exclude bigfile.txt) > bigfile.txt
I want to pass arbitrary scripts to PowerShell via stdin.
(In practice, I'd like to avoid having to put the script into a temporary file, but for the purposes of this question I will pipe the contents of a file to powershell.)
So I'm doing something like so (in this example, from a Windows cmd shell):
type myfile.txt | powershell -
It works if myfile.txt contains something like this:
1..3 | % { $_ *2 }
echo done
(It outputs 2\n4\n6\ndone.)
However, if I split this first statement across multiple lines like so, then PowerShell simply exits without generating any output at all:
1..3 |
% { $_ *2 }
echo done
This seems to fail for any multiline statement. For example, this also fails to produce output:
1..3 | % {
$_ *2 }
echo done
I'm surprised by this since each of these is a legal PowerShell script that would work normally if placed into a .ps1 file and run as usual.
I've tried various things including escaping the EOL using line continuation chars, to no avail. The same effect occurs if the parent shell is Powershell, or even Python (using subprocess.Popen with stdin=PIPE). In each case, Powershell exits without any error, and the exit code is 0.
Interestingly, if I run the following, only "before.txt" gets created.
"before" | out-file before.txt
1..3 |
% { $_ *2 }
"after" | out-file after.txt
echo done
Any ideas why Powershell would have trouble reading a multi-line command, if read from stdin?
I'm going to consider this answered by How to end a multi-line command in PowerShell, since it shows that an extra newline is required.
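For illustration, here is a minimal sketch of that workaround (myfile2.txt is just an illustrative name): a blank line after the multi-line pipeline lets the stdin parser treat the statement as complete before echo done arrives.
# A sketch of the workaround: write the script with an extra blank line
# after the multi-line pipeline, then feed it to powershell.exe line by
# line, which is effectively what 'type myfile.txt | powershell -' does.
$script = @'
1..3 |
% { $_ * 2 }

echo done
'@
$script | Set-Content myfile2.txt
Get-Content myfile2.txt | powershell -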
However, I'm going to raise this with MS as a bug, since the extra newline should not be required when reading from a non-tty, or when the -NonInteractive switch is specified.
Please vote on my bug report to the PowerShell team.
This is not a complete answer, but from what I can tell, the problem has to do with the input being sent line by line.
To demonstrate the line-by-line issue, I invoke powershell this way:
powershell.exe -command "gc myfile.txt" | powershell.exe -
vs
powershell.exe -command "gc myfile.txt -raw" | powershell.exe -
The first example replicates what you see with type, the second reads the entire contents of the file, and it works as expected.
It also works from within PowerShell if you put the script contents in a string and pipe it into powershell.exe -.
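For instance, something like this (the here-string stands in for the contents of myfile.txt):
# A sketch: the whole script travels as a single string object, so
# powershell.exe receives it on stdin in one piece rather than line by line.
$script = @'
1..3 |
% { $_ * 2 }
echo done
'@
$script | powershell.exe -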
I had a theory that it had to do with the line-by-line input lacking line breaks, but it's not so clear-cut. If that were the case, why would the first version work but not the second? Removing the line break that splits the single pipeline should have no effect, while removing the line break between the pipeline and the echo should make it fail. Maybe there is something unclear about the way PowerShell handles the input with or without line breaks.
My command line is this (powershell):
$7z ="`"c:\Program Files\7-Zip\7z.exe`""
&$7z a -r -ttar -bd -so . | &$7z a -r -txz -bd $archive -si
The produced archive file indeed contains a tar file, but that tar file is corrupt.
Note, that breaking the pipe into two commands works correctly:
&$7z a -r -ttar -bd ${archive}.tmp .
&$7z a -r -txz -bd $archive ${archive}.tmp
The produced archive is perfectly valid.
So, what is wrong with my pipeline?
(I am using Powershell)
Nothing is wrong with your pipeline; it is the way that the pipeline works that's causing the problem.
The PowerShell pipe works asynchronously, meaning that the output of the first command is available to the second command immediately, one object at a time, even if the first command has not finished executing. See here.
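For example, a quick sketch that shows the one-object-at-a-time behavior:
# Each number reaches the second script block as soon as the first one
# emits it (one line of output per second), not after all three are done.
1..3 |
    ForEach-Object { Start-Sleep -Seconds 1; $_ } |
    ForEach-Object { "received $_ at $(Get-Date -Format HH:mm:ss)" }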
Unix and PowerShell pipes operate in the same basic way; the difference you are seeing comes from how they go about it.
Unix pipes pass raw bytes between commands, whereas a PowerShell pipe passes full-fledged .NET objects between commands. This difference in what is passed between commands is why it works on Unix and not in PowerShell. If the binary output of 7z.exe does not survive being converted to and from .NET (string) objects, the resulting files become corrupt. See here.
Try adding | %{ "$_" } in between the pipes like
&$7z a -r -ttar -bd -so . | %{ "$_" } | &$7z a -r -txz -bd $archive -si
The point is that the second call to 7z expects unmodified data on STDIN, but PowerShell is converting the output from the first call to 7z to (multiple) (string) objects. % is an alias for foreach-object, so what the additional command does is to loop over each object and convert it to a plain string before passing it on to the second call to 7z.
Edit: Reading through PowerShell’s Object Pipeline Corrupts Piped Binary Data it looks to me now as if my suggestion would not work, and there's also no way to fix it. Well, other than wrapping the whole pipeline into a cmd /c "..." call to make cmd and not PowerShell handle the pipeline.
Edit2: I also was trying this solution from the PowerShell Cookbook, but it was very slow.
In the end, I created a .cmd script with the 7z pipes that I'm calling from my PowerShell script.
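For reference, a sketch of that wrapper approach (make-archive.cmd is an illustrative name):
# cmd.exe handles the binary pipe, so PowerShell never converts the stream.
# make-archive.cmd would contain the two 7z calls joined by a pipe, e.g.:
#   "c:\Program Files\7-Zip\7z.exe" a -r -ttar -bd -so . | "c:\Program Files\7-Zip\7z.exe" a -r -txz -bd %1 -si
& .\make-archive.cmd $archive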
I have a bunch of .txt files that need to be made into one big file that can be read by programs such as Microsoft Excel.
The problem is that the files currently do not have a break at the end of them, so they end up in one long line.
Here's an example of what I have (the numbers represent the line number):
1. | first line of txt file
2. | second line
Here's what I want to turn that into:
1. | first line of txt file
2. | second line
3. |
I have around 3000 of these files in a folder, all in the same format. Is there any way to take these files and add a blank line to the end of them all? I'd like to do this without the need for complicated code (PHP, etc.). I know there are similar things you can do from the terminal (I'm on CentOS), but if something does specifically what I require, I'm missing it.
The simplest way to achieve this is with a bash for-loop:
for file in *.txt; do
echo >> "$file"
done
This iterates over all .txt files in the current directory and appends a newline to each file. It can be written on one line; you only need to add a ; before the done.
Note that $file is quoted to handle files with spaces and other funny characters in their names.
If the files are spread across many directories and not all in the same one, you can replace *.txt with **/*.txt (after enabling globstar with shopt -s globstar) to iterate over all .txt files in all subdirectories of the current folder.
An alternative way is to use sed:
sed -i "$ s:$:\n:" *.txt
The -i flag tells sed to edit the files in-place. $ matches the last line, and then the s command substitutes the end of the line (again $) with a new line (\n), thus appending a line to the end of the file.
Try this snippet:
for f in *; do (cat "$f" && echo "") > "$f.tmp"; done && rename -f 's/\.tmp$//' *.tmp
This basically takes any file in the folder (for f in *; do).
Outputs the file on STDOUT (cat $f) followed by a newline (echo "")
and redirects the output into filename.tmp (> $f.tmp)
and then moves the *.tmp files to the original files (rename -f 's/\.tmp$//' *.tmp).
Edit:
Or even simpler:
for f in *; do echo "" >> "$f"; done
This basically takes any file in the folder (for f in *; do).
Outputs a newline (echo "")
and appends it to the file (>> $f)
I'm a newbie to PowerShell. What's wrong with my script below? It's not emitting the value of $config. However, when I wrap that command in double quotes, everything looks okay.
param($config, $logfolder)
# Must run log analysis in chronological order.
ls $logfolder | Sort-Object LastWriteTime | % {
perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile="$($_.FullName)" -config=$config update
}
# Execute with - .\regen-logs.ps1 webgenesis "C:\inetpub\logs\LogFiles\W3SVC5"
# Returns for each file - Error: Couldn't open config file "awstats.config.conf" nor "awstats.conf" after searching in path "D:\Websites\_awstats\wwwroot\cgi-bin,/etc/awstats,/usr/local/etc/awstats,/etc,/etc/opt/awstats": No such file or directory
As-is, what gets emitted and executed seems to have "-config=$config" passed as an argument. At least, that's my best guess. I don't know if $_ is working correctly either.
If I put quotes around the perl command like so, I get the command I do want to execute.
ls $logfolder | Sort-Object LastWriteTime | % {
"perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile=`"$($_.FullName)`" -config=$config update"
}
# Outputs for each log file something like - perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile="C:\inetpub\logs\LogFiles\W3SVC5\u_ex110602.log" -config=webgenesis update
If putting quotes around it produces the correct commandline, one way to execute the contents of a string is with Invoke-Expression (alias iex):
$v = "myexe -myarg1 -myarg2=$someVar"
iex $v
Put double quotes around "-config=$config". Without this, PowerShell will interpret -config=$config as one string argument that just happens to contain a $ sign in it.
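Applied to the original loop, that looks something like this:
# A sketch of the suggested fix: the quotes make -config=$config a single
# string argument in which $config still gets expanded.
ls $logfolder | Sort-Object LastWriteTime | % {
    perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile="$($_.FullName)" "-config=$config" update
}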
I think you need to start your perl command out with & so that PowerShell interprets things as a command and not a string.
& perl D:\Websites\_awstats\wwwroot\cgi-bin\awstats.pl -LogFile=`"$($_.FullName)`" -config=$config update
Also, see: Run a program in a foreach
I would like to translate the following Unix one-liner to PowerShell.
Synopsis of the command:
This command will search recursively from the PWD (present working directory) for any file with the extension .jsp and look inside the file for a simple string match of 'logoutButtonForm'. If it finds a match, it will print the file name and the text that it matched.
find . -name "*.jsp" -exec grep -aH "logoutButtonForm" {} \;
I am new to PowerShell and have done some googling/binging, but have not found a good answer yet.
ls . -r *.jsp | Select-String logoutButtonForm -case
I tend to prefer -Filter over -Include. Guess I never trusted the -Exclude/-Include parameters after observing buggy behavior in PowerShell 1.0. Also, -Filter is significantly faster than using -Include.
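Spelled out with full parameter names, the same search looks something like this:
# The same search with explicit parameters; -Filter applies *.jsp at the
# provider level, which is faster than post-filtering with -Include.
Get-ChildItem -Path . -Filter *.jsp -Recurse |
    Select-String -Pattern 'logoutButtonForm' -CaseSensitive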