Apiary Error when invoking API - apiary.io

I get the following error on Apiary: "We are sorry, but the API call failed."
My host is defined as
FORMAT: 1A
HOST: https://test.mynetwork.com/
The GET call is defined as:
data urn [models/v2/files/{urn}/data{?guid}]
GET [GET]
Parameters
urn (required, string, ttt)...design urn.
guid (optional, string,067e6162-3b6f-4ae2-a171-2470b63dfe02)...filter by guid.
Response 200 (application/vnd.api+json)
Body
data
{
"version": "1.0",
}
When I invoke this, I get the error. Any inputs?

I have edited your API Blueprint as follows:
FORMAT: 1A
HOST: https://test.mynetwork.com

# Test API
This is a test API

## data urn [/models/v2/files/{urn}/data{?guid}]
+ Parameters
    + urn: `ttt` (required, string) - design urn
    + guid: `067e6162-3b6f-4ae2-a171-2470b63dfe02` (optional, string) - filter by guid

### test [GET]
+ Response 200 (application/vnd.api+json)

        {
            "version": "1.0"
        }
You can find tutorials here: https://help.apiary.io/
Also, I'm not sure what you mean by "invoke" the API - are you calling the mock server, or are you hitting your own server through Apiary?
Happy to help further via the support option in Apiary itself or email us on support [at] apiary.io.

Related

AWS HTTP API Integration with AWS Step Functions -> Sending Multiple Values in the Input

I have a Type: AWS::Serverless::HttpApi which I am trying to connect to a Type: AWS::Serverless::StateMachine as a trigger, meaning the HTTP API would trigger the Step Functions state machine.
I can get it working by specifying only a single input. For example, the DefinitionBody, when it works, looks like this:
DefinitionBody:
  info:
    version: '1.0'
    title:
      Ref: AWS::StackName
  paths:
    "/github/secret":
      post:
        responses:
          default:
            description: "Default response for POST /"
        x-amazon-apigateway-integration:
          integrationSubtype: "StepFunctions-StartExecution"
          credentials:
            Fn::GetAtt: [StepFunctionsApiRole, Arn]
          requestParameters:
            Input: $request.body
            StateMachineArn: !Ref SecretScannerStateMachine
          payloadFormatVersion: "1.0"
          type: "aws_proxy"
          connectionType: "INTERNET"
          timeoutInMillis: 30000
  openapi: 3.0.1
  x-amazon-apigateway-importexport-version: "1.0"
Take note of the following line: Input: $request.body. I am only specifying the $request.body.
However, I need to be able to send the $request.body and $request.header.X-Hub-Signature-256. I need to send BOTH these values to my state machine as an input.
I have tried so many different ways. For example:
Input: " { body: $request.body, header: $request.header.X-Hub-Signature-256 }"
and
$request.body
$request.header.X-Hub-Signature-256
and
Input: $request
I get different errors each time, but this is the main one:
Warnings found during import: Unable to create integration for resource at path 'POST /github/secret': Invalid selection expression specified: Validation Result: warnings : [], errors : [Invalid source: $request specified for destination: Input].
Any help on how to pass multiple values would be so appreciated.

How to access Namespaced JWT Claims in AWS HTTP API Gateway Request Mapping

I'm trying to set up an HTTP integration in AWS API Gateway v2 (aka HTTP API). In my config, I have a native JWT authorizer and want to append a namespaced JWT access_token claim to the HTTP request headers.
As long as claims have a simple name such as sub or iss, this works fine with the following mapping syntax:
append:header.simple = append:$context.authorizer.claims.simple
However, some of my claims are namespaced with an https://namespace/ prefix (this is a requirement from Auth0 and cannot be changed). This is where the mapping syntax is falling short for me.
Say my input JWT is like this:
{
  "aud": "my.dev.api",
  "azp": "CCCC",
  "exp": "1610606942",
  "https://my.ns/account_no": "100368421",
  "iat": "1610598342",
  "iss": "https://mytenant.auth0.com/",
  "scope": "openid profile email account:admin",
  "sub": "auth0|user-id"
}
How can I map namespaced claim https://my.ns/account_no?
I tried $context.authorizer.claims['https://my.ns/account_no'] with no luck. Here is the terraform setup I use:
resource "aws_apigatewayv2_integration" "root" {
api_id = aws_apigatewayv2_api.api.id
integration_type = "HTTP_PROXY"
connection_type = "INTERNET"
description = "This is our GET / integration"
integration_method = "GET"
integration_uri = "http://${aws_lb.ecs_lb.dns_name}"
passthrough_behavior = "WHEN_NO_MATCH"
request_parameters = {
"append:header.account_no" = "$context.authorizer.claims['https://my.ns/account_no']" <-- FAILING HERE
}
}
The error I'm getting in Terraform and in the dashboard is the same:
Invalid mapping expression specified: Validation Result: warnings : [], errors : [Invalid mapping expression specified: $context.authorizer.claims["https://my.ns/account_no"]]
Thanks for your assistance.

gRPC repeated field does not transcode to an array as body parameter in REST API

I'm having little luck trying to send a PUT request with JSON containing an array of objects to my gRPC server using REST. Using gRPC, however, it accepts an array as expected. This is what I have defined in my proto file:
message UpdateRequest {
  repeated Data data = 1;
  int32 Id = 2;
}

message UpdateResponse {
}

message Data {
  int32 id = 1;
  string name = 2;
}

rpc Update(UpdateRequest) returns (UpdateResponse) {
  option (google.api.http) = {
    put: "/v1/data/{Id}"
    body: "*"
  };
}
This deploys successfully to GCP Endpoints, but according to the GCP Endpoints portal the request body is supposed to contain only a single object, like:
{
  "data": {
  }
}
instead of an array of objects, as expected:
{
  "data": [
    {},
    {}
  ]
}
I've tried replacing the "*" in the body with "data":
rpc Update(UpdateRequest) returns (UpdateResponse) {
  option (google.api.http) = {
    put: "/v1/data/{Id}"
    body: "data"
  };
}
This also compiles, but it fails when trying to deploy to GCP Endpoints with the following message:
kind: ERROR
message: "http: body field path 'data' must be a non-repeated message."
Any suggestions as to how I should go about solving this would be greatly appreciated.
Update:
Here's the contents of my .yaml file:
type: google.api.Service
config_version: 3
name: xxx.xxx-xxx.dev
title: xxxx
apis:
  - name: x.x
  - name: x.y
backend:
  rules:
    - selector: "*"
      address: grpcs://xxx-xxx-app-xxxx-lz.a.run.app
This is a known issue, according to GCP support.
Here is the google issuetracker link: https://issuetracker.google.com/issues/178486575
It seems that this is a bug in the GCP Endpoints portal. I'm now successfully sending update requests with arrays containing objects through cURL and my frontend application, although this does not work through the Endpoints portal.

Openapi3 and CSV response (for Dredd)

I test my API with Dredd against its specification (written in OpenAPI 3, taking the painful limitations of Dredd's support into account). Now I have one endpoint which produces CSV data if the Accept header is set accordingly.
'/my-endpoint':
  summary: ...
  description: ...
  get:
    # parameters:
    #   -
    #     in: header
    #     name: Accept
    #     description: "Response format: application/json or text/csv"
    #     example: "text/csv"
    responses:
      '200':
        description: ...
        content:
          text/csv:
            schema:
              type: string
            example:
              summary: 'csv table'
              value: 'cell1, cell2'
When I run the test with Dredd, the test fails with:
expected:
    headers:
    body:
        [
          {
            "key": "summary",
            "value": "csv table"
          },
          {
            "key": "value",
            "value": "cell1, cell2"
          }
        ]
    statusCode: 200
Clearly something is misunderstood here, and Dredd still expects JSON. Also, the API is not told to produce the CSV version. If I comment in the Accept header in the code above, I get the very same result - the expected result above and, as the actual result, the JSON version of the my-endpoint data - plus a warning:
warn: API description parser warning in .../tmp/transformed.specs.yml: 'Parameter Object' 'name' in location 'header' should not be 'Accept', 'Content-Type' or 'Authorization'
I read here and here: "Header parameters named Accept, Content-Type and Authorization are not allowed. To describe these headers, use the corresponding OpenAPI keywords" - but what are they? According to this and this page, it seems to be enough to specify a response of the given type, but that is clearly not enough to tell Dredd to produce such a header.
You got an error because the value of the example key is meant to be a literal example value. So in your case it's treated as an object with the summary and value properties.
Change your definition to:
content:
  text/csv:
    schema:
      type: string
    example: 'cell1, cell2'
Or if you want to provide a summary/description for an example, use examples instead:
content:
  text/csv:
    schema:
      type: string
    examples:
      csv table:
        summary: A CSV table with 2 cells
        value: 'cell1, cell2'
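As for the side question of what "the corresponding OpenAPI keywords" are: in OpenAPI 3, the Accept and Content-Type headers are not described as parameters at all. The media types a response can be returned in are listed as keys of that response's content map, and the media types a request body accepts are listed under requestBody.content. A minimal sketch along those lines (the JSON variant is only an assumed second media type, not something from your spec):
'/my-endpoint':
  get:
    responses:
      '200':
        description: The data, as JSON or CSV depending on the Accept header
        content:
          application/json:   # stands in for "Accept: application/json"
            schema:
              type: object
          text/csv:           # stands in for "Accept: text/csv"
            schema:
              type: string
            example: 'cell1, cell2'
Note that this only documents the available media types; as observed above, it does not by itself make Dredd send an Accept: text/csv request.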

Can we connect to a public API endpoint instead of localhost using the Dredd tool?

I tried to use a public endpoint (e.g. api.openweathermap.org/data/2.5/weather?lat=35&lon=139) instead of the local host while configuring Dredd, and ran the command to run the tool. But I am not able to connect to the endpoint through Dredd. It is throwing Error: getaddrinfo EAI_AGAIN.
But when I tried to connect to the endpoint using Postman, I was able to connect successfully.
There is no difference in calling a local or remote endpoint.
Some remote endpoints have some sort of authorization requirements.
This is an example of Dredd calling an external endpoint:
dredd.yml configuration file fragment
...
blueprint: doc/api.md
# endpoint: 'http://api-srv:5000'
endpoint: https://private-da275-notes69.apiary-mock.com
As you can see, the only change is the endpoint in the Dredd configuration file (created using dredd init).
But, as I mentioned, sometimes you'll need to provide authorization through a header or a query string parameter.
Dredd has hooks that allow you to change things before each transaction. For instance, say you'd like to add an api-key parameter to each URL before executing the request. This code can handle that:
hook.js
// Writing Dredd Hooks In Node.js
// Ref: http://dredd.org/en/latest/hooks-nodejs.html
var hooks = require('hooks');

hooks.beforeEach(function(transaction) {
    hooks.log('before each');
    // add query parameter to each transaction here
    let paramToAdd = 'api-key=23456';
    if (transaction.fullPath.indexOf('?') > -1)
        transaction.fullPath += '&' + paramToAdd;
    else
        transaction.fullPath += '?' + paramToAdd;
    hooks.log('before each fullpath: ' + transaction.fullPath);
});
Code at Github gist
Save this hook file anywhere in your project and then run Dredd passing the hook file:
dredd --hookfiles=./hook.js
That's it. After execution, the log will show the actual URL used in the request:
info: Configuration './dredd.yml' found, ignoring other arguments.
2018-06-25T16:57:13.243Z - info: Beginning Dredd testing...
2018-06-25T16:57:13.249Z - info: Found Hookfiles: 0=/api/scripts/dredd-hoock.js
2018-06-25T16:57:13.263Z - hook: before each
2018-06-25T16:57:13.264Z - hook: before each fullpath: /notes?api-key=23456
"/notes?api-key=23456"
2018-06-25T16:57:16.095Z - pass: GET (200) /notes duration: 2829ms
2018-06-25T16:57:16.096Z - hook: before each
2018-06-25T16:57:16.096Z - hook: before each fullpath: /notes?api-key=23456
"/notes?api-key=23456"
2018-06-25T16:57:16.788Z - pass: POST (201) /notes duration: 691ms
2018-06-25T16:57:16.788Z - hook: before each
2018-06-25T16:57:16.789Z - hook: before each fullpath: /notes/abcd1234?api-key=23456
"/notes/abcd1234?api-key=23456"
2018-06-25T16:57:17.113Z - pass: GET (200) /notes/abcd1234 duration: 323ms
2018-06-25T16:57:17.114Z - hook: before each
2018-06-25T16:57:17.114Z - hook: before each fullpath: /notes/abcd1234?api-key=23456
"/notes/abcd1234?api-key=23456"
2018-06-25T16:57:17.353Z - pass: DELETE (204) /notes/abcd1234 duration: 238ms
2018-06-25T16:57:17.354Z - hook: before each
2018-06-25T16:57:17.354Z - hook: before each fullpath: /notes/abcd1234?api-key=23456
"/notes/abcd1234?api-key=23456"
2018-06-25T16:57:17.614Z - pass: PUT (200) /notes/abcd1234 duration: 259ms
2018-06-25T16:57:17.615Z - complete: 5 passing, 0 failing, 0 errors, 0 skipped, 5 total
2018-06-25T16:57:17.616Z - complete: Tests took 4372ms
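If the remote endpoint expects credentials in a header rather than in the query string, Dredd can also send a fixed header with every request via its header option, either on the command line (dredd --header "Authorization: Bearer <token>") or in dredd.yml. A minimal sketch with placeholder values (the header name and key below are not from the question):
# dredd.yml fragment - header name and key are placeholders
blueprint: doc/api.md
endpoint: https://api.example.com
header:
  - 'x-api-key: YOUR_KEY_HERE'
For anything that has to be computed per request, the beforeEach hook shown above remains the more flexible approach.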