Multiple Authorization types with AWS AppSync

It seems as though an AppSync project can only be configured with one Authorization type (API_KEY, AWS_IAM, etc.). I'm using AMAZON_COGNITO_USER_POOLS as my primary type, but I also have a (Node.js) client that I want to provision with API_KEY access.
Is this possible?
If not, can you suggest any alternatives?

The answer by Rohan works provided you don't have subscriptions. If you do have a subscription on one AppSync endpoint and mutate data through another AppSync endpoint, the data behind the scenes is updated but the subscription won't fire (which makes sense, since a subscription is attached as a listener within a single AppSync endpoint). Until AppSync supports multiple auth methods you might want to give IAM a try; there are some details here on how to get it to work with Cognito in-app plus a Lambda. The example there is in Python, but for Node.js you would generate signatures with something like https://www.npmjs.com/package/aws4 . The same method works if your Node.js client runs elsewhere, provided you generate IAM access keys for it.

There are two approaches to solve for your use case.
You can provision a separate AppSync endpoint (you can create up to 25 per region within an AWS account) with the same schema and configure it with a different authorization scheme. Use this approach only if you need hard isolation between the endpoints.
As of May 2019, AWS AppSync supports multiple authorization schemes for a GraphQL API. You can enable AMAZON_COGNITO_USER_POOLS as the default auth scheme and API_KEY as the additional auth scheme. This is the recommended approach and also works with subscriptions, which addresses Matthew’s concern in another answer.

As of May 2019, AWS AppSync announced the support for multiple auth types in the same API. https://aws.amazon.com/blogs/mobile/using-multiple-authorization-types-with-aws-appsync-graphql-apis/
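With that feature, adding API_KEY alongside a Cognito default can be done through the UpdateGraphqlApi call. A hedged boto3 sketch follows; the API id, API name, and user pool id are placeholders, and the real call at the end is left as a comment:

```python
def multi_auth_settings(user_pool_id, aws_region):
    """Build update_graphql_api arguments that keep AMAZON_COGNITO_USER_POOLS
    as the default auth mode and add API_KEY as an additional one.
    The apiId and name are hypothetical."""
    return {
        "apiId": "abc123",
        "name": "my-api",
        "authenticationType": "AMAZON_COGNITO_USER_POOLS",
        "userPoolConfig": {
            "userPoolId": user_pool_id,
            "awsRegion": aws_region,
            "defaultAction": "ALLOW",
        },
        "additionalAuthenticationProviders": [
            {"authenticationType": "API_KEY"},
        ],
    }

settings = multi_auth_settings("us-east-1_Example", "us-east-1")
# boto3.client("appsync").update_graphql_api(**settings)  # performs the real update
```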


Pubsub HTTP POST?

I'm working with a service that will forward data to a URL of your choosing via HTTP POST requests.
Is there a simple way to publish to a Pubsub topic with a POST? The service I'm using (Hologram.io's Advanced Webhook Builder) can't store any files, so I can't upload a Google Cloud service account JSON key file.
Thanks,
Ryan
You have two challenges in your use case:
Format
Authentication
Format
You need to customize the webhook payload to comply with the Pub/Sub format. Some webhooks are customizable enough for that, but not all of them. If you can't shape the webhook call the way Pub/Sub expects, you need an intermediary layer (Cloud Functions or Cloud Run, for example).
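Whatever the intermediary, the payload ultimately has to be wrapped in the envelope the Pub/Sub REST publish method expects. A small sketch (the project and topic in the URL are placeholders):

```python
import base64
import json

# Hypothetical project and topic -- substitute your own.
PUBLISH_URL = ("https://pubsub.googleapis.com/v1/projects/"
               "my-project/topics/my-topic:publish")

def to_pubsub_body(raw_payload: bytes) -> str:
    """Wrap an arbitrary webhook payload in the envelope the Pub/Sub REST
    publish method expects: {"messages": [{"data": "<base64>"}]}."""
    encoded = base64.b64encode(raw_payload).decode("ascii")
    return json.dumps({"messages": [{"data": encoded}]})

body = to_pubsub_body(b'{"device": 42, "temp": 21.5}')
```

An intermediary Cloud Function would build this body from the incoming request and POST it to `PUBLISH_URL` with proper authentication.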
Authentication
Directly to Pub/Sub or with an intermediary layer, the situation is the same: the requester (the webhook) needs to be authenticated and authorized to access the Google Cloud service.
One bad, though possible, practice is to grant allUsers access to your resources (for example, on a Pub/Sub topic).
Don't do that. Even if you harden your own processing by defining a schema (and thus rejecting all messages that don't comply with it), leaving a resource publicly accessible on the open internet without authentication is reckless!
In the webhook context (I ran into this case previously at my company) I recommend static authentication: a long-lived authentication header, such as an API key, rather than a short-lived (1 h) Google OAuth2 token. It's not perfect, because if the API key leaks, bad actors can exploit the breach for a long time (rotate your API keys as often as you can!), but it's safer than nothing.
I wrote a fairly old article on this use case (with ESPv2 and Cloud Run), but the principle, and the configuration, is almost the same on API Gateway, a Google Cloud managed service. In the article I create a proxy for Cloud Run, Cloud Functions, and App Engine, but you can do the same thing with Pub/Sub by setting the correct target URL.
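A minimal sketch of the static-key check an intermediary layer would perform, assuming a hypothetical `X-Api-Key` header and a `WEBHOOK_API_KEY` environment variable for the secret:

```python
import hmac
import os

# Hypothetical env var; defaults only so the sketch runs standalone.
EXPECTED_KEY = os.environ.get("WEBHOOK_API_KEY", "change-me")

def is_authorized(headers: dict) -> bool:
    """Check the static API key header with a constant-time comparison,
    so the check doesn't leak timing information about the secret."""
    return hmac.compare_digest(headers.get("X-Api-Key", ""), EXPECTED_KEY)
```

The intermediary rejects the request (403) when this returns False, and only then wraps and publishes the payload to Pub/Sub.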

Better key management for API keys with AWS API gateway

I am working on an API project and it is going to be exposed to members of our system. We need to secure our API using API key.
When the API is invoked, we need to know which member is invoking the API.
So we need to map the key to the member somehow.
I am wondering what is the best way to manage the keys. Following are the options we are aware of
For each member, keep the API keys in our backend system, manage them there, and once a key changes, manually update it in API Gateway
Whenever a key changes in the backend system, invoke the AWS API Gateway REST API to update the key in API Gateway
Any other options (not looking for OAuth2-style complex key management solutions)
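The second option can be sketched with boto3's `apigateway` client. The usage-plan wiring and the naming scheme below are assumptions, and the client is passed in so the sequence can be exercised without AWS:

```python
def rotate_member_key(apigw, member_id, usage_plan_id, old_key_id, new_value):
    """Rotate a member's key via the API Gateway control plane: create the
    replacement key, attach it to the member's usage plan, then delete the
    old key. `apigw` is a boto3 "apigateway" client (or a test double)."""
    created = apigw.create_api_key(name=f"member-{member_id}",
                                   value=new_value, enabled=True)
    apigw.create_usage_plan_key(usagePlanId=usage_plan_id,
                                keyId=created["id"], keyType="API_KEY")
    apigw.delete_api_key(apiKey=old_key_id)
    return created["id"]
```

Note that API Gateway imposes a minimum length on self-supplied key values (around 20 characters, if memory serves), so the backend's keys must satisfy that.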
API Gateway API keys are not used for authentication but for identification. Read the last paragraph here. Judging from your question, since you need to know which member invokes the API Gateway, you most likely need to implement an authentication system, and the best candidate here is AWS Cognito. Use the right tool for the job and don't try to reinvent the wheel.

How might I apply multiple security mechanisms to a Swagger-generated REST service?

I have generated JAX-RS stubs for a REST service using Swagger and want to set up the security.
The security side is very new to me and I would like to use standards as far as possible. (In the past, for other J2EE applications, I have used Filters to handle Authentication which put User objects into a Session. As I understand it, Sessions should be avoided for REST.)
There are four types of user who will access the services:
Customers and business partners (authentication via OAuth or similar)
Employees (Authentication via NTLM & LDAP)
Developers (Mock authentication/authorisation of some kind)
Integration test (JUnit with pre-defined users and roles)
Is it possible to define a security mechanism which would handle all of these users?
How would I use the Swagger security directives?
Am I making this more complicated than it needs to be?
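On the Swagger side, a spec can declare several security schemes and accept any of them for an operation. A hedged Swagger 2.0 sketch (the scheme names and URLs are hypothetical; NTLM has no standard Swagger type, so Basic stands in for the employee path, and the developer/test users are usually handled in your filters rather than in the spec):

```yaml
securityDefinitions:
  partnerOAuth:            # customers / business partners
    type: oauth2
    flow: accessCode
    authorizationUrl: https://auth.example.com/authorize   # hypothetical
    tokenUrl: https://auth.example.com/token               # hypothetical
    scopes:
      read: Read access
  employeeBasic:           # employees, validated against LDAP behind the scenes
    type: basic

security:                  # satisfying either scheme grants access by default
  - partnerOAuth: [read]
  - employeeBasic: []
```

Each entry in the top-level `security` list is an alternative; listing both means a request passes if it satisfies either one.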
You could use an open-source API gateway like Tyk. Here's a link to some handy info on API security in the Tyk docs.
And here is a blog post that describes taking a layered approach to API Security that goes beyond the gateway.
Disclosure: I work for Tyk!

Creating an API Layer on top of Firebase Real-Time Database

I have some data stored in my Firebase Realtime Database. I want to expose some of this data via a REST API to my B2B customers.
I know that Firebase itself exposes a REST API, but its authentication mechanisms don't fit my needs. I want my customers to access the API with a simple API key passed in the HTTP request headers.
To summarize, I need an API layer sitting on top of my Firebase real-time database with the following properties:
Basic Authentication via an API key passed in the HTTP request headers
Some custom logic that makes sure customers respect the API limits (maximum requests per day for example)
The only thing I can think of is implementing this layer in AWS Lambda, but that also sounds a bit off. From the Lambda, I would have to access my Firebase database and serve that data. That seems like too many network hops; something native to Firebase would be great.
Thanks,
Guven.
Why not have a simple API which provides them an OAuth token for the original Firebase REST API if they present the correct API key?
It'll be more secure, as only you can mint the tokens, since only you hold the service account private key. It also saves you the headache of building a whole REST API. And the OAuth tokens expire relatively quickly, so they're less of a risk than a long-lived key that you hand out.
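A sketch of that key-for-token exchange. The key-to-uid mapping here is hypothetical (in production it would live in a datastore), and the actual minting needs `firebase_admin` plus a service-account key, so it is only indicated in a comment:

```python
# Hypothetical key -> customer mapping; in production this lives in a datastore.
API_KEYS = {"k-1234": "customerA", "k-5678": "customerB"}

def exchange_key_for_uid(api_key: str):
    """Step 1 of the token endpoint: map the presented API key to a
    Firebase uid, or None if the key is unknown."""
    return API_KEYS.get(api_key)

# Step 2 requires service-account credentials, so it is shown as a comment:
#   from firebase_admin import auth
#   token = auth.create_custom_token(uid)   # short-lived, signed with your key
```

The customer then signs in with that custom token and calls the normal Firebase REST API until it expires.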
I personally have created my own servlets where a user posts their data if they are authenticated using an ID/password combo.
In the servlets I use the default REST API provided by Firebase with the OAuth token generated in my servlet. This way, I can have the DB security rules set to false for all writes from any client API, since the REST API and the Admin SDK on my server ignore the security rules by default.
After some research, I decided that AWS is the best platform for such API-related features.
API Gateway lets you set up your API interface in a matter of seconds
DynamoDB stores your API data; you can easily populate the data here
AWS Lambda lets you write the integration code between API Gateway and DynamoDB
On top of these, the platform offers these features out of the box:
Creation, handling, and verification of API keys for authentication
Usage plans to make sure that API consumers don't exceed your API usage limits
Most of what I was looking for is offered in these AWS services.
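The Lambda glue between API Gateway and DynamoDB might look like this sketch. The event shape (a REST proxy integration that exposes the caller's API key) and the table's attribute names are assumptions, and the table is injected so the handler can be exercised with a fake:

```python
import json

def make_handler(table):
    """Build a Lambda handler with the DynamoDB table injected. The
    "apiKey" attribute and the event shape are hypothetical."""
    def handler(event, context=None):
        api_key = (event.get("requestContext", {})
                        .get("identity", {})
                        .get("apiKey", ""))
        # Look up the consumer's record by their API key.
        item = table.get_item(Key={"apiKey": api_key}).get("Item")
        if item is None:
            return {"statusCode": 403,
                    "body": json.dumps({"error": "unknown API key"})}
        return {"statusCode": 200, "body": json.dumps(item)}
    return handler
```

In real use, `table` would be `boto3.resource("dynamodb").Table("consumers")` and API Gateway would enforce the usage plan before the Lambda even runs.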

Connecting to external database from Backand

I am currently working with a feature in our Ionic/Backand app that needs to access data from an external database (RedShift Cluster). I am thinking of the best way possible to accomplish this, and since we are using Backand as our backend, I thought if it was possible to make an action that could access RedShift Cluster and make SQL queries to it. Any help would be appreciated. Thanks!
Here is one option:
Add a generic Lambda function to access AWS Redshift.
Use AWS API Gateway to expose this Lambda to Backand (it is as secure as you want it to be, utilizing an API key and the secret and access keys of an IAM user).
Then calling this API in a Backand server-side JS action using the $http object is a breeze.
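A sketch of the first step, with the Redshift connection factory injected so the handler can be exercised without a cluster. `connect` would normally be `psycopg2.connect` bound to your cluster's host, port, dbname, user, and password (all omitted here), and the table and columns are hypothetical:

```python
def make_redshift_handler(connect):
    """Build a Lambda handler around any callable that returns a DB-API
    connection to Redshift. Query, table, and columns are placeholders."""
    def handler(event, context=None):
        limit = int(event.get("limit", 10))
        conn = connect()
        try:
            cur = conn.cursor()
            # Parameterized query -- never interpolate event data into SQL.
            cur.execute("SELECT id, name FROM customers LIMIT %s", (limit,))
            rows = cur.fetchall()
        finally:
            conn.close()
        return {"rows": rows}
    return handler
```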