Compile text using the command-line compiler, not a file

I'm using luac -p file.lua to parse files to check for syntax errors. Is it possible to do something like this:
luac -p | [a bunch of text]
Someone mentioned something about 'piping' but I couldn't figure out how that would help.
What I want to do is take text from a program I am writing and feed it into the compiler with -p so it just parses that text. Basically, I want to check the syntax of what's in my program's textarea without having to write it to a file first.

In bash you can do
luac -p - << EOF
Then type your text. To indicate the end, just type
EOF
on a new line and press Enter.
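If the text is coming from another program rather than being typed interactively, you can also pipe it into luac's standard input; the trailing - tells luac to read from stdin. A minimal sketch (the Lua snippet and the variable name are just illustrative):

# pipe a literal chunk of Lua through the parser
printf 'print("hello")\n' | luac -p -

# or feed the contents of a shell variable holding your textarea's text
code='if x then print(x) end'
printf '%s\n' "$code" | luac -p -

With -p, luac prints nothing on success and reports a syntax error otherwise, so the exit status tells you whether the text parsed.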

Related

zsh: command not found: foo

I'm learning the power of Vagrant, and I have stumbled upon this problem. I am trying to create a text file named foo.txt with the text "foo" inside of it.
What I type into the terminal is this:
user@User-MBP data % "foo" > foo.txt
Terminal says:
zsh: command not found: foo
Has anyone encountered this? Thank you in advance for your help!
You can't just throw a string at the file like that. You need to use a program like echo to throw it for you.
echo "foo" > foo.txt
To be a little more clear about why, run man echo.
The echo program "writes arguments to the standard output". Whatever argument you give it (i.e. "foo") it will write to standard output.
You follow that with the > operator, which "redirects standard output".
You then specify a file to "catch" the redirected standard output (i.e. foo.txt) which you already did just fine.
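Putting the pieces together (file name from the question):

echo "foo" > foo.txt   # echo writes foo to standard output, > redirects that into foo.txt
cat foo.txt            # prints: foo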

Exiftool: Want to output to one text file using -w command

I'm currently trying to use exiftool on the Windows command prompt to read metadata from multiple files, then output to a single text file.
The exact command I last tried looked like this:
exiftool.exe -FileName -GPSPosition -CreateDate -d "%m:%d:%Y %H:%M:%S" -c "%d° %d' %.2f"\" -charset UTF-8 -ext jpg -w _Coordinate_Date.txt S:\Nick\Test\
When I run this, I get 7 individual text files, each containing the output for one corresponding file. However, I simply want to output all of it to one single text file. Any help is greatly appreciated.
The -w (textout) option can only be used to write multiple files. It is not meant to be used to output to a single file. As per the docs on -w:
It is not possible to specify a simple filename as an argument -- creating a single output file from multiple source files is typically done by shell redirection
Which is what you're doing with the >> ./output.txt part of your command. The -w _Coordinate_Date.txt isn't doing anything there, and I would expect it to throw an Invalid TAG name: "w _Coordinate_Date.txt" error when quoted together like that, because the whole thing gets treated as a single argument. The -w option takes two parts: the -w itself and either an extension or a format string.
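In other words, drop -w entirely and let the shell collect everything into one file via redirection. A sketch based on the command from the question (same tags and path; output.txt is just an example name):

exiftool.exe -FileName -GPSPosition -CreateDate -d "%m:%d:%Y %H:%M:%S" -c "%d° %d' %.2f"\" -charset UTF-8 -ext jpg S:\Nick\Test\ > output.txt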
I actually figured it out: if you wrap the entire -w _Coordinate_Date.txt option in quotation marks and append it to a file, you can send all of the output to one text file.
i.e. "-w _Coordinate_Date.txt >> ./output.txt"

"log=..." command-line parameter to send script output to STDOUT? [duplicate]

I'm working with a command line utility that requires passing the name of a file to write output to, e.g.
foo -o output.txt
The only thing it writes to stdout is a message that indicates that it ran successfully. I'd like to be able to pipe everything that is written to output.txt to another command line utility. My motivation is that output.txt will end up being a 40 GB file that I don't need to keep, and I'd rather pipe the streams than work on massive files in a stepwise manner.
Is there any way in this scenario to pipe the real output (i.e. output.txt) to another command? Can I somehow magically pass stdout as the file argument?
Solution 1: Using process substitution
The most convenient way of doing this is by using process substitution. In bash the syntax looks as follows:
foo -o >(other_command)
(Note that this is a bashism. There are similar solutions for other shells, but the bottom line is that it's not portable.)
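For example, if the goal is to compress foo's 40 GB of output on the fly instead of keeping the raw file, a sketch (gzip and the archive name are only for illustration) could look like:

# bash substitutes >(...) with a path such as /dev/fd/63 that foo opens and writes to
foo -o >(gzip -c > output.txt.gz)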
Solution 2: Using named pipes explicitly
You can do the above explicitly / manually as follows:
Create a named pipe using the mkfifo command.
mkfifo my_buf
Launch your other command with that file as input
other_command < my_buf
Execute foo and let it write its output to my_buf
foo -o my_buf
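Putting those three steps together in one go, with the reader started in the background and the pipe removed afterwards (foo and other_command are the placeholders from above):

mkfifo my_buf
other_command < my_buf &   # start the reader first so the pipe has a consumer
foo -o my_buf              # foo's open blocks until the reader is there, then it streams into the pipe
wait                       # wait for the background reader to finish
rm my_buf                  # a FIFO is a real directory entry, so clean it up when done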
Solution 3: Using /dev/stdout
You can also use the device file /dev/stdout as follows:
foo -o /dev/stdout | other_command
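As a concrete stand-in for foo, sort has the same shape of interface: its -o flag names an output file, so pointing it at /dev/stdout lets you pipe the result onward (file names are just illustrative):

sort -o /dev/stdout unsorted.txt | head -n 5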
Named pipes work fine, but you have a nicer, more direct syntax available via bash process substitution that has the added benefit of not using a permanent named pipe that must later be deleted (process substitution uses temporary named pipes behind the scenes):
foo -o >(other command)
Also, should you want to pipe the output to your command and also save the output to a file, you can do this:
foo -o >(tee output.txt) | other command
The shortest option is simply to pass /dev/stdout as the output file name:
foo -o /dev/stdout
You could use the magic of UNIX and create a named pipe :)
Create the pipe
$ mknod mypipe p
Start the process that reads from the pipe
$ second-process < mypipe
Start the process that writes into the pipe
$ foo -o mypipe
foo -o >(cat)
if for some reason you don't have permission to write to /dev/stdout
I use /dev/tty as the output filename, the same way you would use /dev/null when you want no output at all. Then add the | and you are done.

How to run ngspice from command line & suppress output to terminal?

I'm learning ngspice and would like to suppress the standard output on my terminal. I didn't see a corresponding option in the user manual and am wondering if I simply overlooked it.
It's not the most elegant solution but will do for now:
I simply redirect the output into a temporary text file.
ngspice -b -o <logfile> <circuitfile> > temp.txt
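If the point is only to silence the terminal rather than to keep the text, the same redirection can point at /dev/null instead of a scratch file. A sketch with placeholder file names (the 2>&1 also hides error messages, so leave it off if you still want to see those):

ngspice -b -o sim.log circuit.cir > /dev/null 2>&1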

Single line create file with content

The OS is Ubuntu. I want to create file.txt in /home/z/Desktop where the content of the file is some text here.
The first and usual way is to run nano /home/z/Desktop/file.txt and type some text here. After that, press Ctrl+X, press Y, then Enter.
The second way is to run cat > /home/z/Desktop/file.txt, type some text here, and press Enter followed by Ctrl+C.
I hope I can run a single line of command to make it faster. I thought xdotool would work with cat (the second way), but no, it does not work.
You can use "echo" in bash. e.g.:
echo "some text here" > file.txt
If you don't want a newline character at the end of the file, use the -n argument:
echo -n "some text here" > file.txt