Is it possible to get around zsh autocorrect for one specific argument? - autocomplete

So, I'm testing something out and I have to run the same command repeatedly until it works. I'm doing something like this:
curl -X POST -d #filename.xml https://host.name
When I do that, zsh always replies with
zsh: correct '#filename.xml' to 'filename.xml' [nyae]? y
I want zsh to stop trying to autocorrect this one argument of this one command. I eventually just made an alias in my .zshrc file, and that solves the problem for me.
I'm just wondering if there is a better way to do this.

Prefix the word with \ to avoid spelling correction:
curl -X POST -d \#filename.xml https://host.name
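An alternative, assuming the same zsh setup: quoting the word also bypasses spelling correction, since zsh only offers corrections for unquoted words. The escaped and the quoted form hand the command an identical argument:

```shell
# Both spellings deliver the same literal word to the command;
# printf stands in for curl here just to show the argument received:
printf '%s\n' \#filename.xml '#filename.xml'
```

And if your setup enables correction of arguments via `setopt correct_all`, `unsetopt correct_all` in ~/.zshrc silences it for every command.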

Related

How to retrieve external files if wget erobots=off is not working?

I would like to download all pdf files linked on a website using wget on Mac OS (zsh).
I have tried:
wget -r -p -k --random-wait --limit-rate=50k -A .pdf -erobots=off https://unfccc.int/process-and-meetings/bodies/constituted-bodies/executive-committee-of-the-warsaw-international-mechanism-for-loss-and-damage-wim-excom/task-force-on-displacement/implementation-updates-task-force-on-displacement\#eq-1
and I have also added the following options to no avail:
--span-hosts
--no-check-certificate
--no-cookies
-H
The error is always the same:
no-follow attribute found in unfccc.int/process-and-meetings/bodies/constituted-bodies/executive-committee-of-the-warsaw-international-mechanism-for-loss-and-damage-wim-excom/task-force-on-displacement/implementation-updates-task-force-on-displacement. Will not follow any links on this page
First, make sure you have permission to crawl the pages; I'm not going to be responsible for anything bad that happens to you or anyone else after bypassing the robots nofollow attribute!
The trick to ignore all restrictions and be a bad crawler bot is simply to include -e robots=off in your command.
Yes, you included it in your command, but you made a typo!
Just like x00 says, it needs to be -e robots=off (with a space) instead of -erobots=off.
I don't know why x00 didn't post that comment as an answer :/
Note:
x00, if you want me to delete my answer because it is nearly identical to your comment, just comment under my answer and I will delete it anytime!

Filtering on labels in Docker API not working (possible bug?)

I'm using the Docker API to get info on containers in JSON format. Basically, I want to do a filter based on label values, but it is not working (just returns all containers). This filter query DOES work if you just use the command line docker, i.e.:
docker ps -a -f label=owner=fred -f label=speccont=true
However, if I try to do the equivalent filter query using the API, it just returns ALL containers (no filtering done), i.e.:
curl -s --unix-socket /var/run/docker.sock http:/containers/json?all=true&filters={"label":["speccont=true","owner=fred"]}
Note that I do uri escape the filters param when I execute it, but am just showing it here unescaped for readability.
Am I doing something wrong here? Or does this seem to be a bug in the Docker API? Thanks for any help you can give!
The correct syntax for filtering containers by label, as of Docker API v1.41, is:
curl -s -G --unix-socket /var/run/docker.sock http://localhost/containers/json \
--data 'all=true' \
--data-urlencode 'filters={"label":["speccont=true","owner=fred"]}'
Note the automatic URL encoding, as mentioned in this Stack Exchange post.
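If you'd rather see what --data-urlencode is doing, here is a sketch of the percent-encoding step (python3 is used here only as the encoder; the labels are the ones from the question):

```shell
# Percent-encode the filters JSON, the same transformation --data-urlencode applies:
filters='{"label":["speccont=true","owner=fred"]}'
python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1]))' "$filters"
```

The encoded string can then be appended by hand as ?all=true&filters=<encoded>, with the whole URL quoted so the shell does not interpret the &.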
I felt there was a bug in the API too, but it turns out there is none. I am on API version 1.30.
I get the desired results with this call:
curl -sS localhost:4243/containers/json?filters=%7B%22ancestor%22%3A%20%5B%222bab985010c3%22%5D%7D
I got the URL-escaped string used above with:
python3 -c 'import urllib.parse; print(urllib.parse.quote("""{"ancestor": ["2bab985010c3"]}"""))'

What command should I use with curl in terminal if I want to download an image from a website and put it in a certain folder?

I have seen
curl -o project/folder/image.png -OL example.com/image.png
Would this command work?
Edit: I figured it out
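For anyone landing here: mixing -o and -O is the problem, since -o names the output file while -O tells curl to reuse the remote file name, and with a single URL only one output option applies. A minimal sketch, keeping the paths from the question (the folder must exist before curl can write into it):

```shell
# Create the destination folder, then download with -o naming the file;
# -L follows any redirects:
mkdir -p project/folder
curl -L -o project/folder/image.png https://example.com/image.png
```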

Putting curl into postman trouble

I was converting a cURL request from a command-line command to the Postman format and found something odd.
curl -u testclient:testpass http://localhost.com/ -d 'grant_type=client_credentials'
The above cURL command works in the terminal, but not in the Postman import section. I thought it wasn't a big deal and that I would do it manually, but I can't seem to figure out what to do with the "-u testclient:testpass" portion of it.
-u testclient:testpass
Could someone please explain to me what this formatting means?
As the documentation says,
-u, --user <user:password>
stands for user and password. So you could select the Authorization tab, choose Basic Auth as the type, and then add the user and password.
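Under the hood, -u just adds an HTTP Basic Authorization header, which is the base64 of user:password; a small sketch with the credentials from the question:

```shell
# -u testclient:testpass is equivalent to sending this header by hand:
token=$(printf '%s' 'testclient:testpass' | base64)
echo "Authorization: Basic $token"
```

Postman's Basic Auth option computes the same header for you from the user and password fields.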

Applescript + REST?

Does AppleScript have a way of interacting with REST APIs?
I realize I can
do shell script curl
I would use curl in
do shell script "#curl script here"
If you need help getting the correct curl statement, I recommend Postman; it really helps me generate the right code. But remember: if you have double quotes (") in the command, put an escape character in front of each: \".
So for example if i want to do a POST request to
https://api.widerstandsberechner.ch/api.php?firstcolor=red&secondcolor=orange&thirdcolor=yellow&fourthcolor=silver&hasFiveRings=0&resultInText=0
the AppleScript would look like this:
do shell script "curl -X POST -H \"Cache-Control: no-cache\" \"https://api.widerstandsberechner.ch/api.php?firstcolor=red&secondcolor=orange&thirdcolor=yellow&fourthcolor=silver&hasFiveRings=0&resultInText=0\""
And of course you can easily assign the result to a variable:
set res to do shell script "curl -X POST -H \"Cache-Control: no-cache\" \"https://api.widerstandsberechner.ch/api.php?firstcolor=red&secondcolor=orange&thirdcolor=yellow&fourthcolor=silver&hasFiveRings=0&resultInText=0\""
Hope this helps!