How do you send gRPC metadata through HTTP REST when transcoding is used?

I have a gRPC API running in Google Cloud. I'm using Google's Extensible Service Proxy to connect it to a Google Endpoints Service. I then enabled transcoding in the ESP so that a REST API is offered alongside the gRPC one. One thing that is important in my API is that each request is user-authenticated. With plain gRPC, I send the user token in the metadata of each request, along with the API key.
My question is: how does this work with the transcoded REST API? How can I send the user token with each request?
I see that the API key, which is processed by the ESP, gets added to the request URL as a query parameter, but what about my custom metadata? How does that get through?

I've figured it out. I just need to put the metadata in the request headers.
curl -H "authorization: Bearer token-goes-here" https:api.domain/path?key=api-key

Related

Keycloak with API gateway: Invalid bearer token

I am trying to use Keycloak behind an API gateway (Apache APISIX).
I use minikube to run Keycloak and my API gateway.
The gateway works, and so does Keycloak:
With Keycloak, I can use the various endpoints: the discovery endpoint (http://127.0.0.1:7070/auth/realms/myrealm/.well-known/uma2-configuration), requesting an access token, and verifying it.
With APISIX and a simple route, I can reach a backend microservice on my minikube.
(Typically, http://127.0.0.1:80/greeting is served by the gateway, which routes the request to the right backend microservice.)
The problem occurs when I try to use the two tools together. I used the Keycloak integration to require a valid token whenever a user accesses a route served by the gateway.
In this case, when I use a valid bearer token (obtained and verified with the Keycloak endpoints) to request the backend via the API gateway, I systematically get an "Invalid bearer token" error:
{"error":"invalid_grant","error_description":"Invalid bearer token"}
I think the integration is configured correctly, because I am sure the gateway calls Keycloak to verify the token.
Here are the Keycloak endpoints I used to get and verify the token:
Get token : http://127.0.0.1:7070/auth/realms/myrealm/protocol/openid-connect/token
Verify : http://127.0.0.1:7070/auth/realms/myrealm/protocol/openid-connect/token/introspect
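For reference, my token request looks something like this (the client and user credentials are placeholders):
curl -s -X POST \
  -d "grant_type=password" \
  -d "client_id=myclient" \
  -d "client_secret=mysecret" \
  -d "username=myuser" \
  -d "password=mypassword" \
  http://127.0.0.1:7070/auth/realms/myrealm/protocol/openid-connect/token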
I have seen some posts about problems when Keycloak is behind a reverse proxy, but I haven't found a clear solution for my case.
Thanks for any help.
Regards
CG
There are a few ways you can debug this.
First, check the log of Apache APISIX.
Second, check the log of Keycloak.
Third, use tcpdump or Wireshark to capture the request that Apache APISIX sends to Keycloak, and diff it against the same request made with curl (see the example below).
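For the curl side of that diff, you can call the introspection endpoint from the question directly; the client credentials here are placeholders for your confidential client.
curl -s -X POST \
  -d "client_id=myclient" \
  -d "client_secret=mysecret" \
  -d "token=the-access-token" \
  http://127.0.0.1:7070/auth/realms/myrealm/protocol/openid-connect/token/introspect
If this succeeds but the gateway still reports "Invalid bearer token", compare the Host and X-Forwarded-* headers in the two captures; Keycloak behind a proxy is sensitive to issuer/hostname mismatches.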
Looking forward to your reply.

Server to Server API Authentication + Authorization

I'm designing a Web API that will be consumed by an external web server.
Only the external web server must be authorized to access the internal API.
The end user will be authenticated against the external web server, but the username must be forwarded to the internal API when requesting data, because some data filtering is based on the username.
What authentication mechanism should the internal web API server use?
I started with an X-API-Key header, but then how should I provide the username? I would like to avoid passing the username in query strings.
I was thinking about basic authentication, where the password would be the X-API-Key value.
A bearer token could theoretically work as well, but bearer tokens are usually issued by an authorization server, which is not an option in this case.
EDIT:
Note that the end user does not make any API calls. They simply access a website built with some CMS, and the CMS internally fetches the data and generates the HTML response. The two options I'm considering are sketched below.
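Both sketches use hypothetical hostnames and values: the end user's name travels either as the basic-auth username (with the shared API key as the password) or in a custom header alongside X-API-Key.
# Option 1: basic auth, username = end user, password = shared API key.
curl -u "alice:the-shared-api-key" https://internal-api.example/data
# Option 2: X-API-Key plus a custom header carrying the end user's name.
curl -H "X-API-Key: the-shared-api-key" -H "X-Username: alice" https://internal-api.example/data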

How does one use Kafka with OpenID-Connect?

I'm starting out with Kafka.
I see that I'm able to pass headers when producing messages.
Traditionally one would have a web client (single-page app) where the user logs in via some remote OIDC IdP and receives a token. That token is then sent via an Authorization: Bearer token-here header to some RESTful backend, where the token is checked for validity and the payload is processed, saved to a database or elsewhere, and something is returned or not.
Now there's Apache Kafka. It has a REST proxy. I can pass headers to the REST proxy and produce messages, or consume them, but I'm interested in the "secure my RESTful JSON API" part.
Currently, without Kafka, I have either an OIDC proxy (using Keycloak, that's keycloak-gatekeeper) that filters which requests make it to the backend, or an OIDC client that validates tokens in a middleware function inside the backend. In either case, invalid requests don't get "logged" the way they would be in Kafka, I assume.
Where do OIDC token validation and request filtering fit in the Kafka/Confluent ecosystem?
Assume we have a SPA that talks to the Confluent REST Proxy. A logged-in user should be able to post messages, and a non-logged-in user should not.
How do Kafka and/or its tools deal with that scenario?
Kafka commonly uses SASL and other authorization plugins to control access.
Certificates would be distributed amongst clients (here, that is the REST Proxy). You would need other proxies or plugins around that to prevent further access or to audit the requests, as with any other web server.
HTTPS certificates would be used to secure traffic to the REST Proxy, but it seems you're asking about something more specific.
There is no reference to OpenID in the documentation, only LDAP RBAC, as a commercial offering.
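As a sketch of the SASL side, the client properties below enable SASL/OAUTHBEARER (available since Kafka 2.0). This is an assumption about your setup, and the stock login module performs no real validation; production OIDC checking requires a custom callback handler wired to your IdP.
# Write a minimal client config enabling SASL/OAUTHBEARER.
cat > client.properties <<'EOF'
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;
EOF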

Accessing IBM API Connect endpoint through Postman

I just created a REST API in API Connect, and the endpoint works when I test it in the APIC Assemble tab. It requires a client id and client secret. When I send a request through Postman, I get a "Could not get any response" message, whether I add them as header values or via OAuth authorization. I'm using the request endpoint that's displayed when I hit the debug button from the successful response on the Assemble tab. Is this the correct endpoint to use? How do I properly include the client id and client secret in a Postman request?
If you get a "Could not get any response in Postman", that means that Postman can't reach the destination of the request.
There are several reasons for that:
Is it an intranet or internet endpoint?
Are you using a proxy? (check proxy config)
Is the hostname resolvable? (try ip)
If it is an https
endpoint, with a self signed certificate, check if you have SSL
Certificate verification enabled (Settings-> general)
On the other hand, to send the client-id and client-secret headers, just click on Headers tab and add both (see the following picture)
Please check the following things to get access to API Connect published services:
The service needs to allow invocation from Postman (the system you are invoking from).
Check whether the web-api MPGW service in the DataPower default domain (created when you configure API Connect with DataPower) has an access control list on its front-side handler.
Try disabling SSL verification in Postman; this can sometimes cause a problem (since the service exposed from API Connect will use SSL).
From the error you are getting, I suspect there is no connection, or only one-way traffic is enabled, which means the response is being blocked. If there were an issue with the request parameters you are sending, the error would be different, e.g. a wrong client id or client secret.
Testing an API on-boarded from API Connect is straightforward, the same as invoking any other REST service.
Thanks, Srikanth
I needed to include the client id and client secret in the headers using the correct names for them, which are specified when creating/editing the API under the 'Security Definitions' category as 'Parameter Name'.
I was also hitting the wrong endpoint. To find the correct endpoint, click the hamburger icon in the upper left of the API Connect website, select Dashboard, click the environment you want (such as Sandbox or Dev), click Settings, then Gateway; there you'll see the endpoint.
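For reference, the equivalent raw request looks something like this. X-IBM-Client-Id and X-IBM-Client-Secret are the API Connect default header names; substitute whatever 'Parameter Name' your security definition uses, and your own gateway endpoint.
curl -H "X-IBM-Client-Id: your-client-id" \
  -H "X-IBM-Client-Secret: your-client-secret" \
  https://gateway.example/org/sandbox/myapi/resource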

Using Kong API Gateway as a proxy for Cisco UCCX

I am running Cisco UCCX 11.0, a Contact Center server based on a Java scripting engine. Scripts are built using the 'Script Editor' software, where you drag elements (Java Beans) to define the script logic. One of the steps in the script performs a REST call. Unfortunately this step does not support adding custom headers such as Authorization headers, and is thus limited to Basic Authentication only.
I would like the script to make a REST Call to an external API that uses a static Bearer Token. Am I correct in saying I could use Kong Gateway for this? Here is my idea of the flow:
UCCX makes a REST call to Kong with Basic Authentication ---> Kong Gateway receives the request ---> Kong Gateway makes its request to the external API with the static Bearer Token ---> the external API responds back to Kong ---> Kong forwards the response back to UCCX
Is this type of flow possible/easy to deploy?
This can easily be managed by assigning the Request Transformer plugin to the Kong API exposing the upstream service.
Example:
Let's assume you have an API endpoint on Kong called /myapi that forwards to your upstream service.
You then assign the Request Transformer plugin to the /myapi API.
For your case, you will most likely want to use the config.add.headers option when configuring the Request Transformer plugin; the authentication header you add there will then be attached to all upstream requests, as sketched below.
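A sketch of that configuration via the Kong admin API; the API name and token are placeholders, and the /apis admin route shown matches the classic Kong 0.x versions from the era of the Gitter thread below.
curl -X POST http://localhost:8001/apis/myapi/plugins \
  --data "name=request-transformer" \
  --data-urlencode "config.add.headers=Authorization:Bearer static-token-goes-here"
With this in place, UCCX only ever sends Basic Authentication to Kong, and Kong injects the Bearer header before proxying to the external API.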
Relevant Gitter Conversation:
https://gitter.im/Mashape/kong?at=587c3a9c074f7be763d686db