Redirecting jq output to a file

In a terminal this works fine:
mosquitto_sub -h 192.168.178.20 -t tele/POW/SENSOR/# | jq '.ENERGY|.Power'
Every 10 seconds there is output on screen, because the device POW publishes its sensor data every 10 seconds. The output of mosquitto_sub (a JSON string) is piped to jq, and jq shows only the value of the key 'Power'. Now I am trying to store the jq output (only the value) in a file 'output.log'.
mosquitto_sub -h 192.168.178.20 -t tele/POW/SENSOR/# | jq '.ENERGY|.Power' > output.log
is not working. What is going wrong?

From the jq manual:
--unbuffered
Flush the output after each JSON object is printed
(useful if you’re piping a slow data source into
jq and piping jq’s output elsewhere).
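So the manual excerpt points at the fix: jq buffers its output when writing to a file, and --unbuffered flushes each value as it arrives. The corrected command would be mosquitto_sub -h 192.168.178.20 -t tele/POW/SENSOR/# | jq --unbuffered '.ENERGY.Power' > output.log. A minimal sketch, simulating the slow JSON source with printf instead of mosquitto_sub:

```shell
# Simulate a slow JSON source; with --unbuffered, jq flushes each
# extracted value to output.log as soon as it is printed, instead of
# waiting for its stdout buffer to fill.
printf '{"ENERGY":{"Power":40}}\n{"ENERGY":{"Power":41}}\n' \
  | jq --unbuffered '.ENERGY.Power' > output.log
cat output.log
```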

How to get the current Nodes where the Job is running

I'm developing a job where the user can choose which nodes it runs on, so the Node Filter is left open to the user's convenience.
When the job starts I need to do a calculation based on the number of nodes chosen by the user. Is there a way to get this number?
Regards,
Alejandro L
By design, that information is only available after the execution. So a good approach is to call the job via the API (in a script step); with the execution ID number (available in the API call output) you can list and count the nodes, e.g.:
#!/bin/bash
nodes=$(curl -s -X GET "http://localhost:4440/api/41/execution/16" \
  --header "Accept: application/json" \
  --header "X-Rundeck-Auth-Token: your_user_token" \
  | jq -r '.successfulNodes[]')
number_of_nodes=$(echo "$nodes" | wc -w)
echo "Number of nodes: $number_of_nodes"
This example needs jq to extract the nodes from the API response.
Anyway, your request sounds good for an enhancement request, please suggest that here.
A workaround would be to use the job.filter variable:
if you use #job.filter#
it returns a string with the list of nodes, like us-east-1-0,us-east-1-1,us-east-1-2.
If you save it as a string and then split the string on ',', you get an array of nodes:
IFS=',' read -r -a array <<< "$string"
and then you can get the number of nodes with
echo ${#array[@]}
Note: as @MegaDrive68k mentioned, this won't work if you select all nodes with the .* filter.
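Putting the pieces together, a small sketch (the #job.filter# value below is a made-up example; Rundeck substitutes the real node list at run time):

```shell
# Hypothetical value of #job.filter#: a comma-separated node list.
string="us-east-1-0,us-east-1-1,us-east-1-2"

# Split on ',' into a bash array, then count its elements.
IFS=',' read -r -a array <<< "$string"
echo "Number of nodes: ${#array[@]}"
```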

Finding when a file was introduced on GitHub

This curl command works as expected and shows when the repository was first created.
curl https://api.github.com/repos/RaRe-Technologies/gensim | grep created
"created_at": "2011-02-10T07:43:04Z",
But this does not show when a file in that repo was created.
curl https://api.github.com/repos/RaRe-Technologies/gensim/blob/develop/gensim/scripts/make_wikicorpus.py | grep created
Is there any way to find the date on which the file was introduced?
You can use https://api.github.com/repos/OWNER/REPO/commits?path=<path/to/file>, as described here.
The result of this request can then be parsed by jq with the filter .[-1].commit.author.date.
This tells jq to get the last item of the array ([-1]) and then read the value of commit, then author, and then date, which is the date of the commit.
So, using the following command
curl "https://api.github.com/repos/RaRe-Technologies/gensim/commits?path=gensim/scripts/make_wikicorpus.py" | jq -r ".[-1].commit.author.date"
will result in
2012-07-21T20:00:29Z
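The jq filter can be tried locally against a stub of the API response (the two-element array below is made up; only the shape matters):

```shell
# .[-1] selects the last array element; the commits API returns
# newest-first, so the last element is the oldest (introducing) commit.
echo '[{"commit":{"author":{"date":"2020-01-01T00:00:00Z"}}},
       {"commit":{"author":{"date":"2012-07-21T20:00:29Z"}}}]' \
  | jq -r '.[-1].commit.author.date'
# → 2012-07-21T20:00:29Z
```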
The alternative, using GitHub CLI gh, and its gh api command:
# Windows
gh api -X GET repos/RaRe-Technologies/gensim/commits \
-f path="gensim/scripts/make_wikicorpus.py" \
--jq ".[-1].commit.author.date"
# Linux
gh api -X GET repos/RaRe-Technologies/gensim/commits \
-f path='gensim/scripts/make_wikicorpus.py' \
--jq '.[-1].commit.author.date'
2012-07-21T20:00:29Z
You don't even need jq if you have gh installed.

Powershell: loop for equivalent

Currently, I'm performing this command on linux shell:
for binary in $(curl -s -X GET "${FHIR_SERVER}/\$export-poll-status?_jobId=${JOB_ID}" -H "Authorization: Bearer ${ACCESS_TOKEN}" | jq -r ".output[].url"); \
do wget --header="Authorization: Bearer ${ACCESS_TOKEN}" ${binary} -O ->>patients-pre.json;
done
Is there any way to get this on powershell?
While not knowing exactly the format of your incoming JSON, I'd use something like this in PowerShell / pwsh:
curl ... | % { wget (ConvertFrom-Json $_).output.url } >> patients-pre.json
The % is an alias for ForEach-Object, which iterates over objects (or lines of text) sent from the left side of the pipe. PowerShell is interesting here because the pipe symbol gives you an implicit for / do / done loop.
curl is on all modern versions of windows, so you don't need to change that. wget isn't, but you could install it, or use curl again.
If you want to go full PowerShell, look at Invoke-RestMethod ( https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/invoke-restmethod?view=powershell-7.2 ). It can do the jobs of both curl and wget in your example, and also automatically handles JSON responses to give you a structured object instead of text (which is what ConvertFrom-Json is doing above).

Raspivid save to disk and stream concurrently

I am trying to run a home security camera using a Raspberry Pi Model B.
I want to save the stream to a file locally (USB if possible) and also stream it so I can pick it up on my network.
The command I have is not working for both - any suggestions?
raspivid -o security.h264 -t 0 -n -w 600 -h 400 -fps 12 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264
Try this command:
raspivid -o - -t 0 -n -w 600 -h 400 -fps 12 | tee security.h264 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264
The tee command reads its input and writes it both to standard output and to the specified files, so the stream is saved to security.h264 while still being piped on to cvlc.
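A tiny illustration of how tee duplicates a stream (the file name and sample data are made up; the downstream command still receives everything):

```shell
# tee copies stdin to copy.h264 AND to stdout, so wc -l still sees
# both lines even though they were also written to the file.
printf 'frame1\nframe2\n' | tee copy.h264 | wc -l
```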

How to enforce docopt to parse one option only?

folks!
I am trying out docopt (the C++ variant). I have tried this usage description:
Usage:
prog [-o | --out-file=<out-file>] <in-file>
prog -h | --help
prog --version
Options:
-h --help Show this screen.
--version Show version.
-o, --out-file=<out-file> Output file name [default: stdout].
<in-file> Input file.
I expected docopt to accept zero or one out-file option and give me a string as a result, but it accepts two or more of these options and gives me a string-list value.
Is this right?
I found that it works as expected when I corrected the command-line description like this:
Usage:
prog [-o<out-file>|--out-file=<out-file>] <in-file>
prog -h | --help
prog --version
Options:
-h --help Show this screen.
--version Show version.
-o, --out-file=<out-file> Output file name [default: stdout].
<in-file> Input file.
or even like this:
Usage:
prog [-o<out-file>] <in-file>
prog -h | --help
prog --version
Options:
-h --help Show this screen.
--version Show version.
-o, --out-file=<out-file> Output file name [default: stdout].
<in-file> Input file.