(Postman) newman outputs unicode even with the --disable-unicode parameter

I want to disable unicode for the newman output. I have read newman's documentation and it says to use --disable-unicode
Newman version: 3.9.4
--disable-unicode
Specify whether or not to force the unicode disable option. When supplied, all symbols in the output will be replaced by their plain text equivalents.
command:
newman run "ACSF_PE.postman_collection.json" --global-var "HEROKU_APP_NAME=myapp" --global-var "HEROKU_API_TOKEN=********" -r cli,junit,text --disable-unicode --reporter-junit-export Test-Results.xml --reporter-text-export Test-Results.txt
At first I ran the command without the --disable-unicode parameter; then I added the parameter, but the output does not change at all. It's exactly the same...
output:
...
2018-05-30T10:32:19.4940350Z [0mnewman[0m
2018-05-30T10:32:19.4940350Z
2018-05-30T10:32:19.4940350Z [0mAccenture Cloud TPM[0m
2018-05-30T10:32:19.5252847Z
2018-05-30T10:32:19.5252847Z Ôûí [0mPre-collection Requests[0m
2018-05-30T10:32:19.5252847Z Ôöö [0mRetrieve Credentials[0m
2018-05-30T10:32:20.1443464Z [90mGET[39m [90mhttps://api.heroku.com/apps/cas-dev-pipe-app/config-vars[39m [90m[200 OK, 2.55KB, 570ms][39m
2018-05-30T10:32:20.1933674Z [32m  [39m [90mStatus code is 200[39m
2018-05-30T10:32:20.1933674Z [32m  [39m [90mRequired Config Vars[39m
2018-05-30T10:32:20.2089943Z [32m  [39m [90mRequired credentials exist[39m
2018-05-30T10:32:20.2402436Z
2018-05-30T10:32:20.2402436Z Ôöö [0mAuthorize: Login & Get JWT (HS256)[0m
2018-05-30T10:32:20.4485595Z [90mPOST[39m [90mhttps://cas-dev-pipe-app.herokuapp.com/auth/login[39m [90m[404 Not Found, 431B, 160ms][39m
2018-05-30T10:32:20.4495595Z {"cursor":{"ref":"eeb9be61-2bea-4b89-9045-7218f94bdd80","length":45,"cycles":1,"position":1,"iteration":0,"httpRequestId":"8f6b275f-da45-4965-82cc-1534b21f70dd"},"response":{"id":"5a4def44-9f82-41f3-b192-aafa1f6a8f6c","status":"Not Found","code":404,"header":[{"key":"Server","value":"Cowboy"},{"key":"Connection","value":"keep-alive"},{"key":"Content-Security-Policy","value":"default-src 'self'"},{"key":"X-Content-Type-Options","value":"nosniff"},{"key":"Content-Type","value":"text/html; charset=utf-8"},{"key":"Content-Length","value":"150"},{"key":"Vary","value":"Accept-Encoding"},{"key":"Date","value":"Wed, 30 May 2018 10:32:20 GMT"},{"key":"Via","value":"1.1 vegur"}],"stream":{"type":"Buffer","data":[60,33,68,79,67,84,89,80,69,32,104,116,109,108,62,10,60,104,116,109,108,32,108,97,110,103,61,34,101,110,34,62,10,60,104,101,97,100,62,10,60,109,101,116,97,32,99,104,97,114,115,101,116,61,34,117,116,102,45,56,34,62,10,60,116,105,116,108,101,62,69,114,114,111,114,60,47,116,105,116,108,101,62,10,60,47,104,101,97,100,62,10,60,98,111,100,121,62,10,60,112,114,101,62,67,97,110,110,111,116,32,80,79,83,84,32,47,97,117,116,104,47,108,111,103,105,110,60,47,112,114,101,62,10,60,47,98,111,100,121,62,10,60,47,104,116,109,108,62,10]},"cookie":[],"responseTime":160,"responseSize":150},"request":{"description":{"content":"","type":"text/plain"},"url":{"protocol":"https","path":["auth","login"],"host":["cas-dev-pipe-app","herokuapp","com"],"query":[],"variable":[]},"header":[{"key":"Content-Type","value":"application/json"},{"key":"User-Agent","value":"PostmanRuntime/7.1.6"},{"key":"Accept","value":"*/*"},{"key":"Host","value":"cas-dev-pipe-app.herokuapp.com"},{"key":"accept-encoding","value":"gzip, deflate"},{"key":"content-length","value":117}],"method":"POST","body":{"mode":"raw","raw":"{\n\t\"username\" : \"ub1igm55lrs849\",\n\t\"password\" : \"p220b577c3cd5908fa061b6469520c391b70e11bf0d56d0096d29e291b4e1681d\"\n}"},"auth":{"type":"noauth","noauth":[]}},"item":{"id":"5027788f-fe2a-4c67-b44d-aad0d95710ac","name":"Authorize: Login & Get JWT (HS256)","request":{"description":{"content":"","type":"text/plain"},"url":{"path":["auth","login"],"host":["{{ACSF_PE_URL}}"],"query":[],"variable":[]},"header":[{"key":"Content-Type","value":"application/json"}],"method":"POST","body":{"mode":"raw","raw":"{\n\t\"username\" : \"{{DB_username}}\",\n\t\"password\" : \"{{DB_password}}\"\n}"},"auth":{"type":"noauth","noauth":[]}},"response":[],"event":[{"listen":"test","script":{"id":"b1c3e65e-0032-484e-b129-1d82b9cc51a7","type":"text/javascript","exec":["","//Code","pm.test(\"Status code is 200\", function () {"," pm.response.to.have.status(200);","});","","//Body","pm.test(\"Body has required attributes\", function () {"," pm.response.to.have.jsonBody('token');"," pm.response.to.have.jsonBody('user.role', pm.variables.get('DB_username'));","});","","","//Post scripts","// Save the token to a Postman environment variable","postman.setEnvironmentVariable(\"JWT_POSTGRAPHILE\", pm.response.json().token);","",""]}},{"listen":"prerequest","script":{"id":"f77d57be-6e67-47d9-a1d3-d0f468638bb7","type":"text/javascript","exec":[""],"_lastExecutionId":"1ce3fc83-4fa7-4c7d-94e4-9b15be2423a9"}}]},"cookies":[]}
2018-05-30T10:32:20.4627101Z [31m[1m 1.[22m[39m [31m[1mStatus code is 200[22m[39m
2018-05-30T10:32:20.4627101Z [31m[1m 2.[22m[39m [31m[1mBody has required attributes[22m[39m
2018-05-30T10:32:20.4783328Z [31m[1m 3Ôáä JSONError in test-script[22m[39m
2018-05-30T10:32:20.5095846Z
2018-05-30T10:32:20.5095846Z Ôöö [0mSTD: Populate Data-Variables[0m
2018-05-30T10:32:22.0723737Z [90mPOST[39m [90mhttp://localhost:5000/graphql[39m {"code":"ECONNREFUSED","errno":"ECONNREFUSED","syscall":"connect","address":"127.0.0.1","port":5000}
2018-05-30T10:32:22.0723737Z {"cursor":{"ref":"a5fc877c-781e-42fd-823b-955fb1a3964a","length":45,"cycles":1,"position":2,"iteration":0,"httpRequestId":"087f0c67-215d-4de4-92d7-dfcb50f4d2ad"},"request":{"description":{"content":"","type":"text/plain"},"url":{"protocol":"http","port":"5000","path":["graphql"],"host":["localhost"],"query":[],"variable":[]},"header":[{"key":"Content-Type","value":"application/json"},{"key":"Authorization","value":"Bearer {{JWT_POSTGRAPHILE}}","system":true}],"method":"POST","body":{"mode":"raw","raw":"{\r\n\t\"query\": \"{ allProductCs{nodes{externalidC sfid}} allAccountExtensionCs{nodes{externalidC sfid}} allKpiSetCs{nodes{name sfid}} allSalesOrganizationCs{nodes{name sfid}}}\"\r\n}"}},"item":{"id":"8d856d77-3204-4b22-9000-702ecc0ac059","name":"STD: Populate Data-Variables","request":{"description":{"content":"","type":"text/plain"},"url":{"protocol":"http","port":"5000","path":["graphql"],"host":["localhost"],"query":[],"variable":[]},"header":[{"key":"Content-Type","value":"application/json"}],"method":"POST","body":{"mode":"raw","raw":"{\r\n\t\"query\": \"{ allProductCs{nodes{externalidC sfid}} allAccountExtensionCs{nodes{externalidC sfid}} allKpiSetCs{nodes{name sfid}} allSalesOrganizationCs{nodes{name sfid}}}\"\r\n}"}},"response":[],"event":[{"listen":"test","script":{"id":"d5af99b0-e755-41d4-aa9d-f0c8bdfb7beb","type":"text/javascript","exec":["","pm.test(\"Status code is 200\", function () {"," pm.response.to.have.status(200);","});","","var response = pm.response.json();","","//Body","pm.test(\"Body has required attributes\", function () {"," pm.response.to.have.jsonBody('data.allProductCs.nodes');"," pm.response.to.have.jsonBody('data.allAccountExtensionCs.nodes');"," pm.response.to.have.jsonBody('data.allKpiSetCs.nodes');"," pm.response.to.have.jsonBody('data.allSalesOrganizationCs.nodes');","});","","// product__c","Object.getOwnPropertyNames(pm.environment.toObject()).filter("," function(d){"," return /product__c./.test(d);"," }).forEach(function(key) {"," pm.environment.unset(key);"," });","","response.data.allProductCs.nodes.forEach(function(node) {"," pm.environment.set('product__c.' + node.externalidC, node.sfid);","});","","","// account_extension__c","Object.getOwnPropertyNames(pm.environment.toObject()).filter("," function(d){"," return /account_extension__c./.test(d);"," }).forEach(function(key) {"," pm.environment.unset(key);"," });","response.data.allAccountExtensionCs.nodes.forEach(function(node) {"," pm.environment.set('account_extension__c.' + node.externalidC, node.sfid);","});","","// kpi_set__c","Object.getOwnPropertyNames(pm.environment.toObject()).filter("," function(d){"," return /kpi_set__c./.test(d);"," }).forEach(function(key) {"," pm.environment.unset(key);"," });","response.data.allKpiSetCs.nodes.forEach(function(node) {"," pm.environment.set('kpi_set__c.' + node.externalidC, node.sfid);","});","","// sales_organization__c","Object.getOwnPropertyNames(pm.environment.toObject()).filter("," function(d){"," return /sales_organization__c./.test(d);"," }).forEach(function(key) {"," pm.environment.unset(key);"," });","response.data.allSalesOrganizationCs.nodes.forEach(function(node) {"," pm.environment.set('sales_organization__c.' + node.externalidC, node.sfid);","});","","postman.setNextRequest(null);"]}}]}}
2018-05-30T10:32:22.0879992Z [31m[1m 5.[22m[39m [31m[1mStatus code is 200[22m[39m
2018-05-30T10:32:22.0879992Z [31m[1m 6Ôáä JSONError in test-script[22m[39m
...
NOTE: output unicode chars are not shown when I save this post

Solution:
Add the --no-color parameter.

Setting --color off worked in my case.
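For reference, a sketch of the command from the question with the flag added (API token redacted as in the original); --no-color suppresses the ANSI colour escapes that show up as [0m / [39m in the log above:
newman run "ACSF_PE.postman_collection.json" --global-var "HEROKU_APP_NAME=myapp" --global-var "HEROKU_API_TOKEN=********" -r cli,junit,text --disable-unicode --no-color --reporter-junit-export Test-Results.xml --reporter-text-export Test-Results.txt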


Varnish - how to check JWT signature using digest Vmod?

I have a Dockerfile based on Varnish 7.0 Alpine, with a custom VCL file to handle JWT authentication. We pass the JWT as a Bearer token in the header.
I based my code on this example: https://feryn.eu/blog/validating-json-web-tokens-in-varnish/
set req.http.tmpPayload = regsub(req.http.x-token,"[^\.]+\.([^\.]+)\.[^\.]+$","\1");
set req.http.tmpHeader = regsub(req.http.x-token,"([^\.]+)\.[^\.]+\.[^\.]+","\1");
set req.http.tmpRequestSig = regsub(req.http.x-token,"^[^\.]+\.[^\.]+\.([^\.]+)$","\1");
set req.http.tmpCorrectSig = digest.base64url_nopad_hex(digest.hmac_sha256(std.fileread("/jwt/privateKey.pem"), req.http.tmpHeader + "." + req.http.tmpPayload));
std.log("req sign " + req.http.tmpRequestSig);
std.log("calc sign " + req.http.tmpCorrectSig);
if(req.http.tmpRequestSig != req.http.tmpCorrectSig) {
std.log("invalid signature match");
return(synth(403, "Invalid JWT signature"));
}
My problem is that tmpCorrectSig is empty. I don't know if I can load the key from a file, since my file contains newlines and other characters?
For information, this vmod does what I want: https://code.uplex.de/uplex-varnish/libvmod-crypto, but I can't install it on my ARM M1 Pro architecture; I have spent so much time trying...
Can I achieve what I want?
I have a working solution that leverages libvmod-crypto. The VCL supports both HS256 and RS256.
These are the commands I used to generate the RSA key pair:
cd /etc/varnish
ssh-keygen -t rsa -b 4096 -m PEM -f jwtRS256.key
openssl rsa -in jwtRS256.key -pubout -outform PEM -out jwtRS256.key.pub
I use https://jwt.io/ to generate a token and paste in the values from my keys to sign it.
The VCL code
This is the VCL code that will extract the JWT from the token cookie:
vcl 4.1;
import blob;
import digest;
import crypto;
import std;
sub vcl_init {
new v = crypto.verifier(sha256,std.fileread("/etc/varnish/jwtRS256.key.pub"));
}
sub vcl_recv {
call jwt;
}
sub jwt {
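# Only validate when the Cookie header carries a token cookie that looks like a JWT (three dot-separated base64url sections)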
if(req.http.cookie ~ "^([^;]+;[ ]*)*token=[^\.]+\.[^\.]+\.[^\.]+([ ]*;[^;]+)*$") {
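# Strip every other cookie so that x-token ends up holding just token=<header>.<payload>.<signature>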
set req.http.x-token = ";" + req.http.Cookie;
set req.http.x-token = regsuball(req.http.x-token, "; +", ";");
set req.http.x-token = regsuball(req.http.x-token, ";(token)=","; \1=");
set req.http.x-token = regsuball(req.http.x-token, ";[^ ][^;]*", "");
set req.http.x-token = regsuball(req.http.x-token, "^[; ]+|[; ]+$", "");
set req.http.tmpHeader = regsub(req.http.x-token,"token=([^\.]+)\.[^\.]+\.[^\.]+","\1");
set req.http.tmpTyp = regsub(digest.base64url_decode(req.http.tmpHeader),{"^.*?"typ"\s*:\s*"(\w+)".*?$"},"\1");
set req.http.tmpAlg = regsub(digest.base64url_decode(req.http.tmpHeader),{"^.*?"alg"\s*:\s*"(\w+)".*?$"},"\1");
if(req.http.tmpTyp != "JWT") {
return(synth(400, "Token is not a JWT: " + req.http.tmpHeader));
}
if(req.http.tmpAlg != "HS256" && req.http.tmpAlg != "RS256") {
return(synth(400, "Token does not use a HS256 or RS256 algorithm"));
}
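# Split out the payload and the signature that was sent with the token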
set req.http.tmpPayload = regsub(req.http.x-token,"token=[^\.]+\.([^\.]+)\.[^\.]+$","\1");
set req.http.tmpRequestSig = regsub(req.http.x-token,"^[^\.]+\.[^\.]+\.([^\.]+)$","\1");
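# HS256: recompute the HMAC with the shared secret and compare it to the signature from the token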
if(req.http.tmpAlg == "HS256") {
set req.http.tmpCorrectSig = digest.base64url_nopad_hex(digest.hmac_sha256("SlowWebSitesSuck",req.http.tmpHeader + "." + req.http.tmpPayload));
if(req.http.tmpRequestSig != req.http.tmpCorrectSig) {
return(synth(403, "Invalid HS256 JWT signature"));
}
} else {
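# RS256: verify the signature against the RSA public key loaded in vcl_init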
if (! v.update(req.http.tmpHeader + "." + req.http.tmpPayload)) {
return (synth(500, "vmod_crypto error"));
}
if (! v.valid(blob.decode(decoding=BASE64URLNOPAD, encoded=req.http.tmpRequestSig))) {
return(synth(403, "Invalid RS256 JWT signature"));
}
}
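# Signature checks out: decode the payload and expose selected claims as request headers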
set req.http.tmpPayload = digest.base64url_decode(req.http.tmpPayload);
set req.http.X-Login = regsub(req.http.tmpPayload,{"^.*?"login"\s*:\s*(\w+).*?$"},"\1");
set req.http.X-Username = regsub(req.http.tmpPayload,{"^.*?"sub"\s*:\s*"(\w+)".*?$"},"\1");
unset req.http.tmpHeader;
unset req.http.tmpTyp;
unset req.http.tmpAlg;
unset req.http.tmpPayload;
unset req.http.tmpRequestSig;
unset req.http.tmpCorrectSig;
unset req.http.tmpPayload;
}
}
Installing libvmod-crypto
libvmod-crypto is required to use RS256, which is not supported by libvmod-digest.
Unfortunately I'm getting an error when running the ./configure script:
./configure: line 12829: syntax error: unexpected newline (expecting ")")
I'll talk to the maintainer of the VMOD and see if we can figure out some way to fix this. If this is an urgent matter, I suggest you use a non-Alpine Docker container for the time being.
Firstly, the configure error was caused by a missing -dev package; see the GitLab issue (the reference is in a comment, but I think it should be more prominent).
The main issue in the original question is that digest.hmac_sha256() cannot be used to verify RS256 signatures. A JWT RS256 signature is a SHA256 hash of the signing input signed with an RSA private key, which is then verified with the corresponding RSA public key. This is what crypto.verifier(sha256, ...) does.
In this regard, Thijs' previous answer is already correct.
Yet the code which is circulating and has been referenced here is nothing I would endorse. Among other issues, a fundamental problem is that regular expressions are used to (pretend to) parse JSON, which is simply not correct.
I have been using a better implementation for a long time, but just did not get around to publishing it. So now is the time, I guess.
I have just added VCL snippets from production code for JWT parsing and validation.
The example is used like so with the jwt directory in vcl_path:
include "jwt/jwt.vcl";
include "jwt/rsa_keys.vcl";
sub vcl_recv {
jwt.set(YOUR_JWT); # replace YOUR_JWT with an actual variable/header/function
call recv_jwt_validate;
# do things with jwt_payload.extract(".scope")
}
Here, the scope claim contains the data that we are actually interested in for further processing. If you want to use other claims, just rename .scope, or add another jwt_payload.expect(CLAIM, ...) and then use jwt_payload.extract(CLAIM).
This example uses some vmods, which we developed and maintain in particular with JWT in mind, though not exclusively:
crypto (use gitlab mirror for issues) for RS signatures (mostly RS256)
frozen (use gitlab mirror for issues) for JSON parsing
Additionally, we use
re2 (use gitlab mirror for issues) to efficiently split the JWT into the three parts (header, payload, signature)
and taskvar from objvar (gitlab) for proper variables.
One could do without these two vmods (re2 could be replaced by the re vmod or even regsub and taskvar with headers), but they make the code more efficient and cleaner.
blobdigest (gitlab) is not contained in the example, but can be used to validate HS signatures (e.g. HS256).

boto3 cloudformation Parameter validation failed

I have created the below parameters, which are supposed to be passed to the CloudFormation client when calling the create-stack command for an SNS stack.
pubSNSCFParameters = []
pubSNSCFParameters.append("{'ParameterKey': 'Environment','ParameterValue':'" + Constants.Env + "'}")
pubSNSCFParameters.append("{'ParameterKey':'pDisplayName','ParameterValue':'" + SNSStackName + "'}")
pubSNSCFParameters.append("{'ParameterKey':'pTopicName','ParameterValue':'" + SNSStackName + "'}")
which gives output like the below:
["{'ParameterKey': 'Environment', 'ParameterValue': 'dev'}", u"{'ParameterKey': 'pDisplayName', 'ParameterValue': 'some-big-value'}", u"{'ParameterKey': 'pTopicName', 'ParameterValue': 'asome-big-value'}"]
Now when I run my boto3 client to create the SNS stack, I'm getting:
botocore.exceptions.ParamValidationError: Parameter validation failed:
Invalid type for parameter Parameters[0], value: {'ParameterKey': 'Environment', 'ParameterValue': 'dev'}, type: <type 'str'>, valid types: <type 'dict'>
code snippet:
with open(templatelocation + 'CFT_SNS.json', 'r') as f:
    client.create_stack(StackName=stackName,
                        TemplateBody=f.read(),
                        Parameters=pubSNSCFParameters,
                        Capabilities=['CAPABILITY_NAMED_IAM'],
                        Tags=[
                            {
                                'Key': 'CreatorName',
                                'Value': 'some#email.com'
                            },
                        ]
                        )
I would imagine this has to do with the data types of the parameters, so how can I fix it?
Your parameters are strings:
"{'ParameterKey': 'Environment', 'ParameterValue': 'dev'}" <-- note quotations at the beginning and end.
This is because you are appending strings to pubSNSCFParameters:
pubSNSCFParameters.append("{'ParameterKey': 'Environment','ParameterValue':'" + Constants.Env + "'}")
It should be dict:
pubSNSCFParameters.append({'ParameterKey': 'Environment','ParameterValue': Constants.Env})
Assuming Constants.Env is a string.
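Putting it together, a minimal sketch of the fixed parameter list, reusing the variable names from the question (SNSStackName and Constants.Env assumed to be plain strings):
pubSNSCFParameters = [
    {'ParameterKey': 'Environment', 'ParameterValue': Constants.Env},
    {'ParameterKey': 'pDisplayName', 'ParameterValue': SNSStackName},
    {'ParameterKey': 'pTopicName', 'ParameterValue': SNSStackName},
]
# Each element is now a dict, which is what create_stack's Parameters argument expects,
# so Parameters=pubSNSCFParameters will pass boto3's parameter validation.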

A positional parameter cannot be found that accepts argument '+' error using SqlClient in Powershell

I am getting an error in my PowerShell script for a runbook on Azure:
Write-Error : A positional parameter cannot be found that accepts argument '+'.
At Test-Update-IndexesForallShardsFromShardManagerRunbook:61 char:61
+
+ CategoryInfo : InvalidArgument: (:) [Write-Error], ParameterBindingException
+ FullyQualifiedErrorId : PositionalParameterNotFound,Microsoft.PowerShell.Commands.WriteErrorCommand
Based on the logs I see in my Azure Automation account from a job that ran, I pinpointed the origin of the error to somewhere in the following code:
$updateStatisticSql = "UPDATE STATISTICS [$Using:tableName] ( [$Using:statName] );"
$CmdUpdateStats=new-object system.Data.SqlClient.SqlCommand($updateStatisticSql, $Conn1)
$CmdUpdateStats.CommandTimeout=1500
Try
{
$Ds=New-Object system.Data.DataSet
$Da=New-Object system.Data.SqlClient.SqlDataAdapter($CmdUpdateStats)
[void]$Da.fill($Ds)
}
Catch
{
# Will catch the exception here so other statistics can be processed.
Write-Error "Statistic " + $tableName + "(" + $statName + ") could not be updated. Investigate the statistic."
}
After adding logging after each line, it seems that nothing is logged after the "fill" call, so I assume something is going wrong there. But I am not seeing the relation between the error and this function. It also doesn't seem to be a script-breaking error, since it never goes into the catch and the rest of the script runs fine. I also validated that the statistics are updated, despite the error I am getting.
The error you are seeing is because you are trying to build the string using concatenation, which means you have spaces, and spaces are used to delimit parameters when calling cmdlets. Put all the concatenation inside parentheses:
Write-Error ("Statistic " + $tableName + "(" + $statName + ") could not be updated. Investigate the statistic.")

Apiary - How to change the format of the cURL call

In Apiary, the cURL call to production by default is:
https://example.com/v1/findBrandCat?matchstring=&interestType=
I have to make a call with the following structure:
https://example.com/v1/findBrandCat/matchstringVALUE/interestTypeVALUE
How can I do that?
A URI template for the API resource can be defined as follows:
# GET /v1/findBrandCat/{matchstringValue}/{interestTypeValue}
+ Parameters
    + matchstringValue: (required, string)
    + interestTypeValue: (required, string)

+ Response 200 (application/json)

flickr.auth.getFrob by tcl (REST)

I'm trying to get a response from Flickr using the Flickr API, but I have no idea how, and there are no examples in Tcl for that.
I wrote the following code:
#!/usr/bin/tclsh
package require rest
set flickr(auth.getFrob) {
url http://api.flickr.com/services/rest/
req_args { api_key: }
}
rest::create_interface flickr
puts [flickr::auth::getFrob -api_key ea4a4134e2821898e5e31713d2ad74fd ]
When I execute it I get this error:
invalid command name "flickr::auth::getFrob"
while executing
"flickr::auth::getFrob -api_key ea4a4134e2821898e5e31713d2ad74fd "
invoked from within
"puts [flickr::auth::getFrob -api_key ea4a4134e2821898e5e31713d2ad74fd ]"
(file "./flickr.tcl" line 17)
=====================================
I've updated the last line of the code as proposed by Johannes to:
puts [flickr::auth.getFrob -api_key ea4a4134e2821898e5e31713d2ad74fd ]
but still got the strange response:
rsp {stat fail} {{err {code 112 msg {Method "unknown" not found}} {}}}
when I expected something like:
<frob>746563215463214621</frob>
as described in Flickr API help:
auth.getFrob
It looks like the command name is
::flickr::auth.getFrob
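As for the follow-up error (code 112, Method "unknown" not found): Flickr's REST endpoint expects a method query parameter naming the API method, and the interface definition above never sends one. A hedged sketch, assuming the tcllib rest package's static_args option (arguments that are always sent with the request):
#!/usr/bin/tclsh
package require rest
set flickr(auth.getFrob) {
    url http://api.flickr.com/services/rest/
    req_args { api_key: }
    static_args { method flickr.auth.getFrob }
}
rest::create_interface flickr
# method=flickr.auth.getFrob is now appended to every call, so Flickr can resolve it
puts [flickr::auth.getFrob -api_key ea4a4134e2821898e5e31713d2ad74fd]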