Google pubsub to Google cloud storage

Is it possible for a bucket in Cloud Storage to receive data/messages from Pub/Sub? If yes, then how?
Currently I am publishing messages to Pub/Sub,
and I want to use the pull delivery type (for that I have to provide an endpoint URL for the bucket, which I couldn't find anywhere).
I found this somewhere in their docs,
but it didn't work.

No, sorry. GCS only accepts uploads of complete files via HTTP. You could build a small app that takes incoming Pub/Sub messages and uploads them to GCS as separate objects, or batches them into groups of messages and uploads those, but there's no such built-in functionality.
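For illustration, a minimal sketch of that "small app" approach in Python, assuming the google-cloud-pubsub and google-cloud-storage client libraries; the project, subscription, and bucket names are placeholders. It pulls messages from a subscription and writes each one to GCS as its own object:

from google.cloud import pubsub_v1, storage

PROJECT_ID = "my-project"            # placeholder
SUBSCRIPTION_ID = "my-subscription"  # placeholder
BUCKET_NAME = "my-bucket"            # placeholder

storage_client = storage.Client()
bucket = storage_client.bucket(BUCKET_NAME)

def callback(message):
    # Use the message ID as the object name so each message
    # becomes a separate, uniquely named GCS object.
    blob = bucket.blob(f"messages/{message.message_id}")
    blob.upload_from_string(message.data)
    message.ack()

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)
streaming_pull = subscriber.subscribe(subscription_path, callback=callback)

# Block the main thread while messages are handled on background threads.
try:
    streaming_pull.result()
except KeyboardInterrupt:
    streaming_pull.cancel()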
Can I ask you more about your use case? What are you trying to do?

Pubsub HTTP POST?

I'm working with a service that will forward data to a URL of your choosing via HTTP POST requests.
Is there a simple way to publish to a Pubsub topic with a POST? The service I'm using (Hologram.io's Advanced Webhook Builder) can't store any files, so I can't upload a Google Cloud service account JSON key file.
Thanks,
Ryan
You have two challenges in your use case:
Format
Authentication
Format
You need to customize the webhook call to comply with the Pub/Sub message format. Some webhooks are customizable enough for that, but not all. If you can't shape the webhook call the way Pub/Sub expects, you need an intermediary layer (Cloud Functions or Cloud Run, for example); see the sketch at the end of this answer.
Authentication
Whether directly to Pub/Sub or through an intermediary layer, the situation is the same: the requester (the webhook) needs to be authenticated and authorized to access the Google Cloud service.
One bad, though possible, practice is to grant allUsers access to your resources. Here is an example with a Pub/Sub topic.
Don't do that. Even if you increase "your" process security by defining a schema (and thus rejecting all messages that don't comply with it), leaving a resource publicly accessible on the wild internet, without authentication, is criminal!
In the webhook context (I had this case previously at my company), I recommend a static authentication mechanism (a long-lived authentication header, not a short-lived one like a 1h Google OAuth2 token); an API key, for example. It's not perfect, because if an API key leaks, bad actors can exploit the breach for a long time (rotate your API keys as often as you can!), but it's safer than nothing!
I wrote a fairly old article on this use case (with ESPv2 and Cloud Run), but the principle and configuration are almost the same with API Gateway, a Google Cloud managed service. In the article I create a proxy for Cloud Run, Cloud Functions, and App Engine, but you can do the same thing with Pub/Sub by setting the correct target URL.
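Putting the two challenges together, here is a minimal sketch of such an intermediary, assuming a Cloud Functions HTTP function on the Python runtime; the X-API-Key header name, project, and topic below are placeholders, not anything Pub/Sub itself fixes. It checks a static API key and republishes the raw webhook body to a topic:

import os
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
TOPIC_PATH = publisher.topic_path("my-project", "my-topic")  # placeholders
API_KEY = os.environ["WEBHOOK_API_KEY"]  # the long-lived shared secret

def relay(request):
    # Reject callers that don't present the expected static key.
    if request.headers.get("X-API-Key") != API_KEY:
        return ("Unauthorized", 401)
    # Pub/Sub message payloads are bytes; forward the POST body as-is.
    future = publisher.publish(TOPIC_PATH, request.get_data())
    future.result()  # raise if the publish failed
    return ("OK", 204)

You would then point the webhook at the function's URL and configure it to send the X-API-Key header.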

Send existing geolocation data from cloud server to Live Objects

I currently have a running instance in Google Cloud which stores geolocation data from an IoT device. I want to connect my Google Cloud server to Live Objects so that this geolocation data can be used. Is there a way to connect Google Cloud and Live Objects together to share data? Or is there a way to send a POST request from an external server script (Node.js) to Live Objects without using MQTT? It would be a normal HTTP POST sent via the REST API.
Thanks in advance.
Yes, you can push data into Live Objects using the HTTPS REST API.
Check https://liveobjects.orange-business.com/swagger-ui/index.html#!/Data_management_data_store/addDataMessageUsingPOST
and an example here: https://liveobjects.orange-business.com/doc/html/lo_manual.html#STREAMS
cheers
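For illustration, a hedged sketch of that call from a server-side Python script. The endpoint path, X-API-KEY header, and payload field names below are assumptions based on the linked Swagger page; verify them there before relying on this:

import requests

API_KEY = "YOUR_LIVE_OBJECTS_API_KEY"   # generated in the Live Objects portal
STREAM_ID = "urn:lo:nsid:gps:device1"   # placeholder stream identifier

payload = {
    # Assumed payload shape; check the addDataMessageUsingPOST schema.
    "value": {"latitude": 48.85, "longitude": 2.35},
    "timestamp": "2020-01-01T12:00:00Z",
}

resp = requests.post(
    f"https://liveobjects.orange-business.com/api/v0/data/streams/{STREAM_ID}",
    json=payload,
    headers={"X-API-KEY": API_KEY},
)
resp.raise_for_status()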

Azure Media Services Encoding Job Callback to URL

Using only the REST API, I am able to upload a file to Azure Media Services from my local machine and start an encoding job. Then I need to poll the job for status to see when it is done. But what I really want is for Azure Media Services to send a request to my callback URL when it is done. Is there a way to do this?
Take a look at our Notifications feature, which supports WebHooks.
https://learn.microsoft.com/en-us/azure/media-services/media-services-dotnet-check-job-progress-with-webhooks
It also integrates well with Azure Functions, if you want to host your callback in Azure Functions and just leverage the WebHook trigger there.
We have some examples of doing that up here:
https://github.com/Azure-Samples/media-services-dotnet-functions-integration/tree/master/101-notify-webhooks
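For illustration only, a hedged sketch of the receiving side as a plain Python (Flask) endpoint. The payload field names (EventType, Properties.NewState) follow my reading of the linked sample, and the sample additionally verifies a message signature, which is omitted here; verify both against the docs before relying on this:

from flask import Flask, request

app = Flask(__name__)

@app.route("/media-job-callback", methods=["POST"])
def media_job_callback():
    notification = request.get_json(force=True)
    # React only when a job state change reaches a terminal state.
    if notification.get("EventType") == "JobStateChange":
        new_state = notification.get("Properties", {}).get("NewState")
        if new_state == "Finished":
            pass  # e.g. fetch the output asset, update your database, ...
    # Return 200 so the notification is considered delivered.
    return ("", 200)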

get metrics for google cloud pubsub

I am using Google Cloud Pub/Sub and I want to know how to get the number of outstanding, delivered, and undelivered messages in Pub/Sub. Is there an API provided by Pub/Sub for this?
You'll want to look at Stackdriver Monitoring. In particular, there are metrics for Google Cloud Pub/Sub, including subscription/num_undelivered_messages and subscription/num_outstanding_messages. You can also access graphs of these properties in Stackdriver.
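For illustration, a minimal sketch of reading subscription/num_undelivered_messages through the Monitoring API with the Python client library; the project and subscription IDs are placeholders:

import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/my-project"  # placeholder

now = time.time()
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": int(now)},
        "start_time": {"seconds": int(now) - 300},  # last 5 minutes
    }
)

results = client.list_time_series(
    request={
        "name": project_name,
        "filter": (
            'metric.type = "pubsub.googleapis.com/subscription/num_undelivered_messages" '
            'AND resource.labels.subscription_id = "my-subscription"'
        ),
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

# Each time series holds gauge points; the latest point is the current backlog.
for series in results:
    for point in series.points:
        print(point.interval.end_time, point.value.int64_value)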

How to upload Files to Cloud Storage?

I have a Google Cloud Endpoints API which uses Cloud SQL to store data. I want to provide file upload for clients, and the files should be stored in Cloud Storage, but I also want to store the file metadata and the file's storage URL in Cloud SQL.
What's the best way to do this?
Can I upload files through Cloud Endpoints, or do I need an extra upload servlet?
How can I update my database entities, which need a reference to the uploaded files?
Any examples on how to combine those 3 technologies?
Assuming your clients are not added to your Google Cloud project (which is typically the case), your users don't have write access to your GCS bucket. You can either submit files to your application and move them to GCS from there (not recommended, as it consumes more network and CPU), or, better, submit to GCS directly.
To let the client write to your GCS bucket directly, you will need to either:
1. put your access key on the client for write access (not recommended; only if the client is used by a limited set of trusted people), or
2. generate a time-bound token and put it on the client as a signed URL to upload directly.
Endpoints APIs themselves cannot do this, but you can generate the signed GCS URL on the server and fetch it on the client via Endpoints. Then set it as the form action (on a web client; other clients have similar ways to do signed uploads) and submit the form to upload the file.
<form action="SIGNED_URL_FROM_ENDPOINTS" method="post" enctype="multipart/form-data">
I don't see open-source code out there doing exactly this, but the closest is this project, which does generate the signed URL with a timeout (the only unintuitive part).
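For illustration, a minimal sketch of generating the time-bound signed URL on the server with the google-cloud-storage Python library; bucket and object names are placeholders. Note this variant signs a PUT upload, whereas the form-based POST above would use a signed POST policy instead:

import datetime
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("uploads/user-file.bin")

signed_url = blob.generate_signed_url(
    version="v4",
    method="PUT",
    expiration=datetime.timedelta(minutes=15),  # time-bound, as described
    content_type="application/octet-stream",
)
# Return signed_url from your Endpoints API; the client then does an
# HTTP PUT of the file bytes to that URL before it expires.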
The best way to update the metadata in your database is to watch the GCS bucket using 'Object Change Notifications'. Another way is to send the metadata to your server from the client itself, which can be an Endpoints call. You can also use a mix of both, where the metadata goes to the server via Endpoints even before the file is uploaded, and the notification updates the record to confirm that the file is available to serve.