Greater than in command line

I'm trying to understand this SCons command:
env.Command('foo.out', 'foo.in', "sed 's/x/y/' < $SOURCE > $TARGET")
What do the < and > mean in sed 's/x/y/' < $SOURCE > $TARGET?

It means that sed's input will come from the file $SOURCE and its output will be saved to $TARGET.

I'm not sure what SCons is, but < redirects the given file to the input stream of the given command (in your case, it feeds the file to sed's input), and > redirects the output stream of the command to another file.
So, basically, you run sed on $SOURCE file and redirect the results to $TARGET file.
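To see the same redirection outside of SCons, here is a minimal shell sketch (the file names and contents are only examples):
printf 'axb\n' > foo.in          # create a sample input file
sed 's/x/y/' < foo.in > foo.out  # read foo.in on stdin, write the result to foo.out
cat foo.out                      # prints: ayb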

Related

How to prevent this infinite loop in PowerShell?

While working through this command-line tutorial, I executed the following commands in PowerShell and ended up in an infinite loop.
echo "I am a new file." > ex15.txt
cat ex15.txt > another.txt
cat *.txt > bigfile.txt
After running the last command, the execution never ends; it goes into an infinite loop. But this works fine in Command Prompt using the type command.
type *.txt > bigfile.txt
This command doesn't go into an infinite loop; it works perfectly. Why isn't this working in PowerShell?
cat (an alias for Get-Content) in PowerShell works differently from the type command in cmd.
type will not see bigfile.txt while it is being written, but Get-Content will, so you end up reading bigfile.txt and writing back out to the same file, and it gets stuck in a loop.
To prevent the loop, you can force cat to finish reading all the files before it writes by wrapping the cat expression in parens:
(cat *.txt) > bigfile.txt
As @mjolinor explained, Get-Content is reading the data you're adding to bigfile.txt just as fast as it is written, resulting in a command that never ends.
The easiest solution is to put bigfile.txt in a different directory, so it isn't one of the files you are reading, e.g. the parent directory:
cat *.txt > ..\bigfile.txt
Another variant would be to exclude 'bigfile.txt' from the files in the current path that you want to read:
cat (ls *.txt -Exclude bigfile.txt) > bigfile.txt

Concatenate txt file contents and/or add break to all

I have a bunch of .txt files that need to be made into one big file that can be read by programs such as Microsoft Excel.
The problem is that the files currently do not have a line break at the end, so when they are concatenated everything ends up on one long line.
Here's an example of what I have (the numbers represent the line number):
1. | first line of txt file
2. | second line
Here's what I want to turn that into:
1. | first line of txt file
2. | second line
3. |
I have around 3000 of these files in a folder, all in the same format. Is there any way to take these files and add a blank line to the end of them all? I'd like to do this without complicated code, e.g. PHP. I know there are similar things you can do from the terminal (I'm on CentOS), but if something does exactly what I need, I've missed it.
The simplest way to achieve this is with a bash for-loop:
for file in *.txt; do
echo >> "$file"
done
This iterates over all .txt files in the current directory and appends a newline to each file. It can be written on one line; you only need to add a ; before the done (see the sketch below).
Note that $file is quoted to handle files with spaces and other funny characters in their names.
If the files are spread across many directories and not all in the same one, you can replace *.txt with **/*.txt to iterate over all .txt files in all subdirectories of the current folder.
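For reference, here is the same loop written as a one-liner, plus the recursive variant; note that ** only recurses into subdirectories when bash's globstar option is enabled (a detail worth checking in your shell):
for file in *.txt; do echo >> "$file"; done
shopt -s globstar
for file in **/*.txt; do echo >> "$file"; done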
An alternative way is to use sed:
sed -i "$ s:$:\n:" *.txt
The -i flag tells sed to edit the files in place. The first $ is an address matching the last line, and the s command then substitutes the end of that line (again $) with a newline (\n), thus appending a line break to the end of the file.
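Note that this relies on GNU sed behaviour (in-place -i with no suffix, and \n in the replacement); BSD sed differs. Either way, you can check whether a given file already ends with a line break with a quick sketch like this (the file name is just an example):
tail -c 1 somefile.txt | od -c   # shows \n as the last byte if the file ends with a newline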
Try this snippet:
for f in *; do (cat "$f" && echo "") > "$f.tmp"; done && rename -f 's/\.tmp$//' *.tmp
This basically takes every file in the folder (for f in *; do),
outputs the file on stdout (cat "$f") followed by a newline (echo ""),
redirects that output into filename.tmp (> "$f.tmp"),
and then renames the *.tmp files back to the original names (rename -f 's/\.tmp$//' *.tmp).
Edit:
Or even simpler:
for f in *; do echo "" >> "$f"; done
This basically takes every file in the folder (for f in *; do),
outputs a newline (echo ""),
and appends it to the file (>> "$f").

How can I redirect the output of a command (running in a batch loop) to a file?

I have a Windows batch file which iterates over files in a folder and runs a command on each file. Specifically, I am running xmllint to validate some files:
for %%i in (c:\temp\*.xml) do (
C:\XMLLINT\xmllint -noout -schema "C:\schemas\schema.xsd" "%%~dpnxi" >> c:\output.txt
)
It currently shows the output on screen. I want the output of all these commands to go into an output file. How can I achieve this? Using the append operator (>>), nothing is accomplished except that a blank file is created.
Is it because of xmllint?
If you're trying to capture error output from the program, it might be writing to stderr, which >> alone does not redirect. You can try redirecting it with:
for %%i in (c:\temp\*.xml) do (
C:\XMLLINT\xmllint -noout -schema "C:\schemas\schema.xsd" "%%~dpnxi" >> c:\output.txt 2>&1
)
Basically, the 2>&1 at the end means redirect anything written to stderr (file descriptor 2) to stdout (file descriptor 1). Since stdout is redirected to a file, you should now see the stderr stream in the file as well. Hope this works for you!
I've never used it, but if its documentation is here, have you tried just removing your "-noout" option, or adding "-output c:\output.txt"?

redirect input in command line

I don't really get input redirection in DOS mode.
I know the working example: sort < list.txt
which sorts the content of my list.txt
but why doesn't this work:
dir < arguments.txt
the content of my arguments.txt file is for instance just: /D
I would expect the command
dir < arguments.txt
be equal to
dir /D
Why isn't this working?
Thanks,
Juergen
The < operator redirects console input to come from a file. The corresponding > operator redirects console output to a file.
The sort command reads the console until it reaches end of file (Ctrl-Z) and then produces the sorted result.
The dir command does not accept console input, only arguments on the command line, so the file containing /D is never read.

Send the result of multiple commands to one text file?

This is what I have so far: my Dropbox public-URL creation script for a directory of public files (getdropbox.com - GPL, I think). My LIST file was created using ls in the following fashion:
ls -d ~/Dropbox/Public/PUBLICFILES/* > LIST
dropboxpuburl.sh:
for PATH in `cat LIST`
do
echo $PATH
dropbox puburl $PATH > ~/URLLIST/$PATH
done
Now this creates a whole series of files - each with the dropbox puburl in them.
The question is: How can I cause this script to redirect all the public links into one text file, each on a new line - perhaps with the name PUBLIC-DIRECTORY-LIST?
Is this what you are trying to achieve?
for PATH in `cat LIST`
do
echo $PATH
dropbox puburl $PATH >> filename
done
OK, I've got it working using the suggestions given to me here:
for PATH in `cat LIST`
do
echo $PATH
dropbox puburl $PATH
done > PUBLIC-DIRECTORY-LIST
It creates a list of the directories, each followed by its public link. Now it is time to prune the directory lines for a clean text file of links.
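One way to do that pruning is to keep only the lines that look like links (a sketch, assuming each puburl line starts with http; the output file name is made up):
grep '^http' PUBLIC-DIRECTORY-LIST > LINKS-ONLY.txt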
The > creates the file and writes the first line. >> appends to it on a new line.
echo txt > PUBLIC-DIRECTORY-LIST.txt
echo another text >> PUBLIC-DIRECTORY-LIST.txt
You should use while read with input redirection instead of for with cat filename. Also, in order to avoid variable name conflicts, I changed your path variable to lowercase, since the shell already uses the all-caps PATH. Overwriting it won't affect your interactive shell, but it could break something in your script.
Also, assuming that you want the lines from your input file displayed on the screen as a progress indicator, but not captured in your output file, the echo sends them to stderr.
while read -r path
do
echo "$path" >&2
dropbox puburl "$path"
done < LIST > PUBLIC-DIRECTORY-LIST