How to add AppSync backend to AWS MobileHub project via console? - aws-mobilehub

Although awsmobile-cli has a feature for enabling and configuring an AppSync backend like:
awsmobile appsync enable
awsmobile appsync configure
It is prone to end up with a totally irrelevant configuration: it creates DynamoDB tables in us-west-2 (Oregon) even though my project is located in eu-central-1 (Frankfurt), and it does so using its default "events" GraphQL schema. On top of that, the result does not appear on the Mobile Hub project console as a backend feature.
Now, what I want to do is add an AppSync backend to an AWS Mobile Hub project via the console. Then I can pull the changes from the CLI once I am done, i.e. have modified my GraphQL schema, attached the resolvers, and wired up the data sources.
Is it possible as of now?

Unfortunately right now this is not possible via the Mobile Hub console. It is in the CLI roadmap to support importing existing AppSync resources.

As this is not possible in Mobile Hub right now, you could try to use the Serverless Framework together with the serverless-appsync-plugin. It allows you to write your infrastructure as code and deploy it to AWS via the CLI.
While Mobile Hub is somewhat limiting, you can design a more complex backend for your app with the Serverless Framework. You can even set up Lambda data sources for AppSync. Here you can find some examples of different GraphQL API setups: https://github.com/serverless/serverless-graphql
If you have a more or less complex schema, deploying it from the CLI is the right approach, as the AppSync console starts to lag with big schemas.
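As a rough sketch of what this looks like (all names, the table, and the region below are placeholders, and the plugin's configuration keys may differ between plugin versions, so check its README), a serverless.yml for the plugin could be shaped like this:

```yaml
service: my-appsync-backend      # hypothetical service name

provider:
  name: aws
  region: eu-central-1           # deploy where your project lives, not the CLI's default region

plugins:
  - serverless-appsync-plugin

custom:
  appSync:
    name: my-api
    authenticationType: API_KEY
    schema: schema.graphql       # your own schema instead of the default "events" one
    dataSources:
      - type: AMAZON_DYNAMODB
        name: Posts              # placeholder data source
        config:
          tableName: posts-table
```

The point is that the schema, resolvers, and data sources live in version control and deploy with `serverless deploy`, rather than being clicked together in the console.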

Related

Can I get the Swagger interface to appear on a deployed Azure web API?

When one creates an ASP.NET Core Web API in Visual Studio 2022, and tests it locally, one gets a convenient Swagger page built upon an OpenAPI definition, to test all HTTP endpoints.
However, when deployed, trying to access {path-to-api}/swagger returns a 404 Not Found error. It works on localhost, where both the API and the database sit on my own machine. It even works if the database is in the Azure cloud, for that matter, provided I put the Azure SQL Database connection string into appsettings.json.
So is there a way to achieve this, preferably without too much hassle? Or am I wrong in wanting this? Do developers mostly test their APIs locally? I want the Swagger UI online only for testing.
The problem is getting the Swagger functionality working in the cloud. Is it possible, and is it good practice?
If you look at the startup code, you will notice that Swagger is only loaded during development via an if check. Removing that check, or expanding it based on environment, will allow a published version to generate the page on the target host.
// Swagger is enabled only in the Development environment here;
// expand the check (e.g. add || app.Environment.IsStaging())
// to expose it on other target hosts.
if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}
I generally do that for first publishes, or for Dev/Test environments, to see it running. Once it is no longer needed, I put the check back in.
It may also be viable to leave it turned on in a Dev or UAT environment, because one may be publishing the OpenAPI definition to APIM (Azure API Management), which takes the API and generates its own developer portal, separate from the initial publish.
Also, once published, it is not the default page; one still has to browse to it explicitly, e.g. .../swagger/index.html.
I'm abandoning this mission to deploy the Swagger interface to Azure along with my API. It's bad security practice to expose the HTTP request methods so visibly to everyone. So the answer to my question, "do developers mostly test their APIs locally?", is apparently yes.
I wondered if I should remove the question, but I would like to let it stand, in case anyone else is contemplating doing the same thing: exposing an API online with the Swagger UI.

Eclipse Google Cloud Tools: Where is Datastore running?

When running a Java Google App Engine through Eclipse (with Google Cloud Tools) I can inspect my Datastore through the admin dashboard (localhost:8080/_ah/admin/datastore).
Is it possible to access the Datastore Rest API? Where would I be able to do that? Is it running on the same port under a different path?
It looks like Eclipse is starting up dev_appserver.py, and with this you can't use the Datastore API.
I've never used the Datastore emulator, but that might allow you to use the API.
Another option is to use the live Datastore API for a test GAE project.
The port on which the Datastore emulator listens for calls is 8081 by default; it can be changed with gcloud beta emulators datastore start --host-port=localhost:8081
Alternatively, if you want to manually access the Datastore API of your GCP project you can:
Use the Datastore Dashboard in the Cloud Console
Manually make use of the Datastore API, e.g. via the "Try this API" feature.
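As a minimal sketch of pointing a client library at the local emulator (the project id below is a placeholder, and the google-cloud-datastore calls are commented out since they assume the library is installed and the emulator is running; DATASTORE_EMULATOR_HOST is the environment variable the Google Cloud client libraries check):

```python
import os

# When this environment variable is set, the Google Cloud client
# libraries route Datastore traffic to the emulator instead of
# the live service.
os.environ["DATASTORE_EMULATOR_HOST"] = "localhost:8081"
os.environ["DATASTORE_PROJECT_ID"] = "my-test-project"  # placeholder project id

# With the emulator running, a client would then talk to localhost:8081:
# from google.cloud import datastore
# client = datastore.Client()
# client.put(...)

print(os.environ["DATASTORE_EMULATOR_HOST"])
```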

Can AWS Toolkit in Eclipse be used with localstack?

For local development, I was hoping to set up a localhost profile for AWS Toolkit that I could then use in Eclipse to interact with resources on localstack, but I'm at a loss as to how to set this up. There is a "local (localhost)" option in AWS Toolkit, but I don't see how it would know which endpoints to access for the various services in localstack.
It seems like a relatively logical thing to want to do. Or do I have to do all my interaction with the aws (or awslocal) CLI?
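For comparison, this is roughly what the endpoint override looks like when talking to localstack from an AWS SDK rather than the Toolkit (a sketch only: the helper name is made up, the boto3 call is commented out since it assumes boto3 is installed and localstack is running, and 4566 is localstack's usual default edge port):

```python
# Hypothetical helper: build client kwargs that point an AWS SDK
# at a local localstack instance instead of the real AWS endpoints.
LOCALSTACK_ENDPOINT = "http://localhost:4566"  # localstack's default edge port

def localstack_client_kwargs(service: str) -> dict:
    return {
        "service_name": service,
        "endpoint_url": LOCALSTACK_ENDPOINT,
        "region_name": "us-east-1",          # localstack accepts any region
        "aws_access_key_id": "test",         # dummy credentials
        "aws_secret_access_key": "test",
    }

# With boto3 installed and localstack running:
# import boto3
# s3 = boto3.client(**localstack_client_kwargs("s3"))
# print(s3.list_buckets())

print(localstack_client_kwargs("s3")["endpoint_url"])
```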

REST endpoint registration and bootstrap (creating range index) using UDeploy

I have my code in a Git repository. I am using UDeploy to deploy my code into a MarkLogic environment. I am able to move all my modules successfully, but I am facing two problems:
1. Creating new indexes
2. REST endpoint creation
Please let me know if there is any way to implement these two.
For creating indexes, I have tried to do it using Admin API functions (admin:database-range-element-index()) and I was successful in that part. But is there any way to do it from UDeploy or a DevOps pipeline?
For registering a REST endpoint, I couldn't find any way to try.
Have you looked at MarkLogic's REST Management API? See https://docs.marklogic.com/REST/management. In particular, see if https://docs.marklogic.com/REST/POST/manage/v2/databases will help you create indexes via the Management API.
The most common way to deploy MarkLogic code and configuration is ml-gradle, a plugin for the widely used Gradle build tool. ml-gradle uses MarkLogic's Management API, mentioned by Ganesh, and is scriptable.
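As a sketch of driving the Management API from a deployment script (all names here are placeholders: the host, database, element name, and port 8002 are assumptions, and the actual HTTP call is commented out since it needs a live MarkLogic host and admin credentials; check the Management API docs for the exact property names your version expects):

```python
import json
import urllib.request

# Hypothetical target: database properties endpoint on the default
# Management API port 8002, for a database named "Documents".
MANAGE_URL = "http://localhost:8002/manage/v2/databases/Documents/properties"

# Hypothetical range index definition to merge into the database properties.
payload = {
    "range-element-index": [
        {
            "scalar-type": "string",
            "namespace-uri": "",
            "localname": "myElement",  # placeholder element to index
            "range-value-positions": False,
        }
    ]
}

def build_request() -> urllib.request.Request:
    # The Management API takes a PUT of the database properties as JSON;
    # real use also requires digest authentication as an admin user.
    return urllib.request.Request(
        MANAGE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

# urllib.request.urlopen(build_request())  # only against a live MarkLogic host

print(build_request().get_method())
```

A step like this can be wired into a UDeploy process as a plain script step, which keeps the index definitions in the Git repository alongside the modules.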

Google Cloud SDK from a Dataproc Cluster

What is the right way to use/install Python Google Cloud APIs such as Pub/Sub from a Google Dataproc cluster? For example, if I'm using Zeppelin/PySpark on the cluster and I want to use the Pub/Sub API, how should I prepare it?
It is unclear to me what is and is not installed during default cluster provisioning, and if/how I should try to install Python libraries for the Google Cloud APIs.
I realise additionally there may be scopes/authentication to set up.
To be clear, I can use the APIs locally, but I am not sure what the cleanest way is to make them accessible from the cluster, and I don't want to perform any unnecessary steps.
In general, at the moment, you need to bring your own client libraries for the various Google APIs, unless you are using the Google Cloud Storage connector or the BigQuery connector from Java, or via RDD methods in PySpark, which automatically delegate to the Java implementations.
For authentication, you should simply use --scopes https://www.googleapis.com/auth/pubsub and/or --scopes https://www.googleapis.com/auth/cloud-platform and the service account on the Dataproc cluster's VMs will be able to authenticate to use PubSub via the default installed credentials flow.
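As a rough sketch of what this looks like once the client library is on the cluster (e.g. by pip-installing google-cloud-pubsub in an initialization action; the project and topic names below are placeholders, and the actual publish call is commented out since it needs the library installed and valid credentials):

```python
# Hypothetical project/topic names for illustration.
PROJECT_ID = "my-project"
TOPIC_NAME = "my-topic"

def topic_path(project: str, topic: str) -> str:
    # Same fully-qualified topic form the Pub/Sub client library uses.
    return f"projects/{project}/topics/{topic}"

# On a cluster created with the pubsub scope and the library installed:
# from google.cloud import pubsub_v1
# publisher = pubsub_v1.PublisherClient()
# publisher.publish(topic_path(PROJECT_ID, TOPIC_NAME), b"hello from dataproc")

print(topic_path(PROJECT_ID, TOPIC_NAME))
```

With the --scopes flag from the answer above, the client picks up the VM's service account credentials automatically, so no key files need to be copied to the cluster.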