Real-time data updates with Cloud Firestore via REST API call

I would like to create a REST API for Cloud Firestore while keeping my business rules within Cloud Functions, then consume those endpoints from my Angular application and update the data in real time.
Is it possible to get real-time updates if I create a REST API with Cloud Firestore?

If you're building your REST API on top of Cloud Functions, this will not be possible. Cloud Functions doesn't support HTTP chunked transfer encoding, nor does it support HTTP/2 streaming that would be required to let the client keep a socket open for the duration of the listen. Instead, functions are required to send their entire response to the client at once, with a size less than 10MB, within the timeout of the function (max 9 minutes).

Related

How to call HTTP endpoint with data from Data Fusion / CDAP

I'm trying to call an HTTP endpoint (service/function) from my Data Fusion / CDAP real-time (streaming) pipeline. This HTTP endpoint serves a trained machine learning model (via Google Cloud AI Platform Unified). I need to pass some data from my pipeline to this endpoint and obtain data back (i.e. send a chunk of pre-processed data and get a classification result back to pass further along my Data Fusion / CDAP pipeline). How can I do it?
I've looked into:
HTTP Plug-ins, but they support either Sink or Source, while I need a Transform plugin (i.e. data in -> call http service -> data out);
Wrangler's invoke-http directive (https://cdap.atlassian.net/wiki/spaces/DOCS/pages/382107784/Invoke+HTTP+directive), but it does not support body formatting or nested JSON (Cloud AI Platform serves machine learning models via nested JSON, and the reply is nested JSON too); it is also unclear to me how to debug and handle errors there;
the Python transform plug-in, but it is restrictive in terms of importing modules when run in interpreted mode.
For your use case, I would recommend creating a custom plugin that calls the Google Cloud AI Platform API directly; that gives you more flexibility in formatting the input and output data.
As an example, there is a DLP plugin that communicates with the Google Cloud DLP APIs: https://github.com/data-integrations/dlp-plugins
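At its core, the custom transform is just "data in, call the HTTP service, data out". Below is a minimal sketch of that step in plain Python to illustrate the nested request/response shapes (the endpoint URL, token, and field names are assumptions; a real CDAP plugin would be written in Java against the plugin API):

```python
import json
import urllib.request

def build_request_body(features):
    # AI Platform prediction requests wrap the inputs in a nested
    # "instances" list; `features` is a dict of model inputs.
    return json.dumps({"instances": [features]}).encode("utf-8")

def parse_response_body(raw):
    # Predictions come back nested under a "predictions" key;
    # take the result for the single instance we sent.
    return json.loads(raw)["predictions"][0]

def classify(endpoint_url, features, token):
    # The transform step: data in -> call HTTP service -> data out.
    req = urllib.request.Request(
        endpoint_url,
        data=build_request_body(features),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return parse_response_body(resp.read())
```

Keeping the body-building and response-parsing pure functions separate from the HTTP call also makes the nested-JSON handling easy to unit-test, which addresses the debugging concern raised above.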

How to stream data from Firebase REST API to Beckhoff's PLC using SSE (TwinCAT3)?

I can normally read data from the Firebase Realtime Database via the REST API through GET requests, and the same applies for writing data with PUT requests. But the Firebase Realtime Database REST API documentation specifies that you can also set up an SSE listener (EventSource / Server-Sent Events).
Thus far I have
Set the Accept header to "text/event-stream" as stated in the documentation (with FB_IotHttpHeaderFieldMap and its method AddField).
Set the HTTP security layer SSL (so that PLC would communicate with REST API through HTTPS as needed by the documentation).
But now I can't wrap my head around what I should do next ...
How would you approach this problem?
What is the next step into setting an SSE listener?
And if there is no built-in way to do this, is it possible to code it myself?
Using: TwinCAT XAE (VS 2017) on Windows 10
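For context on coding it by hand: the stream itself is just text/event-stream frames arriving over the open HTTPS response, so after the GET request the next step is to keep reading the response body, buffer the incoming text, and split it into event/data frames. A sketch of that framing logic in Python (a structured-text implementation on the PLC would mirror the same loop):

```python
def parse_sse_stream(text):
    """Parse a text/event-stream payload into (event, data) pairs.

    Firebase's streaming REST API sends frames like:
        event: put
        data: {"path": "/", "data": {...}}
    with a blank line terminating each frame.
    """
    events = []
    event, data_lines = None, []
    for line in text.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "":  # blank line ends the current frame
            if event is not None or data_lines:
                events.append((event, "\n".join(data_lines)))
            event, data_lines = None, []
    return events
```

The Firebase stream uses `put` and `patch` events carrying a JSON body with `path` and `data` fields, plus `keep-alive` events that can be ignored.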

Read server time from AWS DynamoDB, Swift

I need to get the current server time/timestamp from AWS DynamoDB in my iOS Swift application.
In Firebase we can write the current timestamp to the database and then read it back from the app. Any suggestions are appreciated.
DynamoDB does not provide any sort of server time: any timestamps must be added by the client. That being said, you can emulate server-time behavior by setting up a Lambda function or an EC2 instance as a write proxy for DynamoDB and having it add a timestamp to anything written to DynamoDB. But it's actually even easier than that.
AWS allows you to use API Gateway as a proxy to many AWS services. The process is a little long to explain in detail here, but there is an in-depth AWS blog post you can follow for setting up a proxy for DynamoDB. The short version is that you create a REST endpoint, choose "AWS Service Proxy" as the integration type, and apply a transformation to the request that inserts the time of the request (as seen by API Gateway). The exact request mapping you set up will depend on how you want to define the REST resources and on the tables you are writing to. There is a request context variable you can use to get the API Gateway server time: $context.requestTimeEpoch.
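As an illustration of where that variable fits, a request mapping template for a DynamoDB PutItem proxy could look like the following (the table name and attribute names are made up; only $context.requestTimeEpoch and $input.path are real API Gateway constructs, and requestTimeEpoch is in epoch milliseconds):

```json
{
    "TableName": "VendorData",
    "Item": {
        "id": {"S": "$input.path('$.id')"},
        "payload": {"S": "$input.path('$.payload')"},
        "serverTime": {"N": "$context.requestTimeEpoch"}
    }
}
```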

Run a web socket on Cloud Functions for Firebase?

Hello, I have a REST API running on Cloud Functions for Firebase using HTTP requests, but now I need to sync the data in real time by requesting the functions. I have read something about WebSockets.
Is there a way to run a web socket on Cloud Functions for Firebase?
This is not going to be a good fit for Cloud Functions. Websockets rely on long-lived connections to the same server over time; Cloud Functions are ephemeral compute instances that are spun down when there's no traffic. There's no way to force or guarantee that a Cloud Function will keep running or hold a connection open indefinitely.
I would encourage you to investigate using the Firebase Realtime Database as a conduit here instead of trying to add realtime to Cloud Functions.
Theoretically you could use two different layers: one to manage the websocket connections and another layer to handle the data processing.
The WebSocket layer would not be Cloud Functions, but a Docker container running Pushpin on Cloud Run; it would route HTTP calls to your Cloud Functions to do the actual data processing.
This is also possible using an Apigee Java callout, where the Java code (if needed) calls a Cloud Function. See https://cloud.google.com/apigee/docs/api-platform/develop/how-create-java-callout

AWS API Gateway - to use with AWS EC2 Endpoint or AWS Lambda?

I need to create an API where vendors push data to the server using REST calls, and this data then needs to be pushed on to the user on a mobile app (using WebSockets, I'm guessing as of now) to whom the data belongs.
For vendors to use the REST API, I need to check the vendor's credentials and write the data to the DB.
I am keen to know what approach I should use. Should I use AWS API Gateway, which can help with security and scalability?
And while using AWS API Gateway, which would be the better approach: an EC2 endpoint or a Lambda endpoint?
Using EC2 vs Lambda depends on how you want to design your services and on your specific use cases. Going serverless is a trend these days, but you do not need to go serverless just for the sake of it.
For your use case, if the REST API you expose updates a database, let's say RDS, a Lambda function is probably not an ideal choice, as you will need to open a connection every time the function is invoked. Moreover, if you are running the Lambda in a no-VPC config, you will need to publicly expose your RDS port. If it's DynamoDB, it works out well.
But again, you want to push updates out to mobile apps over, say, WebSockets. You definitely need a WebSocket server somewhere, and I guess that's EC2.
You may design your application such that all your business logic resides in the Lambda functions, which update the DB and post a message to an SQS queue. The WebSocket server can then pick up messages from the SQS queue and push the updates out. This decouples your application architecture. This is just one approach, and it won't scale horizontally out of the box.
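The decoupled flow above can be sketched with local stand-ins for the moving parts (a queue.Queue plays the role of SQS, a dict the DB, and a list each WebSocket connection; in production the Lambda would post via boto3 and the EC2 WebSocket server would poll SQS):

```python
import queue

# Stand-in for the SQS queue between the Lambda layer and the WebSocket server.
updates = queue.Queue()

def handle_vendor_post(record, db):
    """Business-logic layer (the Lambda function): write to the DB, enqueue an update."""
    db[record["user_id"]] = record["data"]  # update the database
    updates.put(record)                     # post a message to the queue

def websocket_push_loop(connections):
    """WebSocket-server layer: drain the queue and push each update to its owning user."""
    delivered = []
    while not updates.empty():
        msg = updates.get()
        conn = connections.get(msg["user_id"])
        if conn is not None:
            conn.append(msg["data"])        # stand-in for conn.send(...)
            delivered.append(msg["user_id"])
    return delivered
```

The point of the queue is that the REST/business layer never needs to know which server holds a given user's WebSocket connection; that routing decision is deferred to the consumer side.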
OR - You may choose to put everything in one EC2 instance, expose a REST API that updates the DB and also posts updates to the WebSocket connection.