How to notify of object change in cloud store using gsutil without ApplicationUrl - google-cloud-storage

I would like to notify my script running on linux of an object change in a bucket.
After reading the documentation, I see that I can notify an application through a URL, but this is not what I am looking for.
Is there any way to listen for an object change through gsutil in my script?

Cloud Pub/Sub is the recommended solution for getting notified of changes to a bucket. With the Cloud Pub/Sub integration, your script can subscribe to the topic that the bucket's change notifications are published to.
If you want to receive the notifications from a command, you can use gcloud pubsub subscriptions pull.
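As a minimal sketch of that flow (the bucket, topic, and subscription names below are placeholders, and the commands assume a configured gcloud project with the right permissions):

```shell
# Publish change notifications for the bucket to a Pub/Sub topic
# (my-bucket, my-bucket-changes, and my-script-sub are placeholder names).
gsutil notification create -t my-bucket-changes -f json gs://my-bucket

# Create a subscription that the script will read from.
gcloud pubsub subscriptions create my-script-sub --topic my-bucket-changes

# In the script: pull and acknowledge pending change events.
gcloud pubsub subscriptions pull my-script-sub --auto-ack --limit=10
```

The last command can be run in a loop (or via cron) so the script polls for changes instead of hosting a notification endpoint.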

Related

GCS bucket Premium File Transfer Task No Pubsub Notification

I am using the KingswaySoft Premium File Transfer task (docs here: https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/premium-file-pack/premium-file-transfer-task) to copy local files to a GCS bucket. It is supposed to trigger a Pub/Sub notification, but no notification happens even though the upload succeeds. If, however, I then go into the GCS console, download the file, and upload it manually, I do get a Pub/Sub notification.
Any idea why this may be?

Possible to integrate github probot with splunk?

I'm building my first GitHub app and ... I'm thinking of a GitHub app that could post to Splunk ... or integrate with the Splunk GitHub app and POST events to the system. I'm wondering whether this functionality is or can be supported somehow.
The real question is: can I send selective information about the repository from a GitHub app to Splunk?
Is this possible?
There is the GitHub Add-on on Splunkbase that can be used to integrate with GitHub.com or an on-premises instance.
Alternatively, you can look at using the Splunk HTTP Event Collector (HEC) to receive formatted webhook calls and ingest data that way.
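As a hedged sketch, an app or webhook handler can forward an event to HEC with a plain HTTP POST; the host, port, token, and event fields below are all placeholder values:

```shell
# POST one event to a Splunk HTTP Event Collector endpoint.
# splunk.example.com:8088 and the token are placeholders; -k skips
# certificate verification and should only be used for local testing.
curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"event": {"repo": "my-repo", "action": "push"}, "sourcetype": "github:webhook"}'
```

This is the simplest way to send selective repository information: the app decides what goes in the JSON body, so only the fields you choose reach Splunk.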

Redirect Cloud Foundry streaming logs to log providers like papertrail or others

I am trying to send logs out to an external logging platform using the following command:
cf cups activity-tracker -l https://HOST:PORT
I am assuming that the activity-tracker service created by the above command will automatically send all activity or access logs to the externally exposed endpoint. So far, however, I am not seeing results appear in Papertrail; I also tried others like Splunk.
What am I doing wrong?
The process for setting up a syslog drain to Papertrail is documented here. This will send application and Cloud Foundry related logs to Papertrail.
ibmcloud cf cups my-logs -l syslog-tls://logsN.papertrailapp.com:XXXXX
ibmcloud cf bind-service <appname> my-logs
ibmcloud cf restart <appname>
There is no mechanism to stream events in real time from Activity Tracker to another endpoint. The closest solution would be downloading events and piping them to a third party using a program. The Downloading Events documentation may help you do so.

Access to audit events of the UAA (User Account and Authentication) server in Swisscom cloud

Is it possible to get access to events generated by the User Account and Authentication (UAA) server in the context of the Swisscom Application Cloud?
It is essential for me to have an audit trail of actions executed by authorised operators through the API (which would include the CLI and portal).
What I am looking for is an alternative to AWS CloudTrail for the IAM module, which you can turn on for specific VPCs/regions there.
I have found this in the CF documentation (https://docs.cloudfoundry.org/loggregator/cc-uaa-logging.html) but that (as far as I understand it) requires infrastructure level access.
Thanks a lot for any hints.
We can't expose UAA logs to individual customers, since they likely contain sensitive information about other users or the platform.
You should be able to retrieve your application's logs from the application log stream (which you can send to a syslog drain, e.g. the ELK/Elasticsearch service).
All API interactions should be covered by this log stream, according to the documentation:
Users make API calls to request changes in app state. Cloud Controller, the Cloud Foundry component responsible for the API, logs the actions that Cloud Controller takes in response.
For example:
2016-06-14T14:10:05.36-0700 [API/0] OUT Updated app with guid cdabc600-0b73-48e1-b7d2-26af2c63f933 ({"name"=>"spring-music", "instances"=>1, "memory"=>512, "environment_json"=>"PRIVATE DATA HIDDEN"})
From https://docs.cloudfoundry.org/devguide/deploy-apps/streaming-logs.html
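As a quick way to verify those API/Cloud Controller lines reach your log stream, the same stream can be tailed from the CLI (appname is a placeholder):

```shell
# Tail the aggregated log stream for the app; API actions taken by
# Cloud Controller show up with the [API/...] source tag.
cf logs <appname>

# Or dump the most recently buffered lines instead of streaming.
cf logs <appname> --recent
```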

using pact-broker webhooks locally?

I have set up the Pact Broker locally and am able to publish pacts, which are also verified successfully by the provider. I am now at the point of using a webhook that kicks off a build of the provider project if the pact content has changed since the previous version. Can I use the webhook concept locally, given that my consumer and provider are not configured in CI?
You'll need to create a local "CI server" on your machine. It doesn't really have to be a proper CI server, but it does have to be able to accept an HTTP request that will kick off a build somehow.
You should be able to create a very simple Ruby/JavaScript/Python HTTP server that runs the provider build in a background process when it receives a request. Or, you could install a copy of something like Jenkins locally.
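As a bare-bones sketch of that idea using netcat rather than a scripting language (the port and the build_provider.sh script name are placeholders, and the -l option syntax differs between netcat variants):

```shell
#!/bin/sh
# Loop forever: accept one HTTP request on port 9090, answer 200 OK,
# then run the (hypothetical) provider build script in the background.
# Point the Pact Broker webhook at http://localhost:9090/.
while true; do
  printf 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n' | nc -l 9090 >/dev/null
  ./build_provider.sh &
done
```

This ignores the webhook payload entirely and rebuilds on every request, which is usually fine for local experimentation; a small Python or Node server would let you inspect the payload first.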