I'm talking about "URL Fetch calls" and "URL Fetch data received" in https://developers.google.com/apps-script/guides/services/quotas
Community Connectors are created in Apps Script, and all Apps Script quotas apply to them. This quota covers your connector's communication with the data source; it does not affect communication between the connector and Data Studio.
I want to use a ready-made Kafka connector for fetching data from a REST API. I found the kafka-connect-http connector on Confluent Hub, but it does not support pre-authentication of the API.
I raised this as an issue in the connector's repository (https://github.com/castorm/kafka-connect-http) and got the response that, unfortunately, this feature is not supported in the existing code of the connector. So if your API can be called without authentication, this is a ready-made solution for you; otherwise you can go for Kafka Streams or a custom implementation.
The author did, however, agree to look into this feature in the future.
Google Apps Script's JDBC service doesn't support a direct connection to PostgreSQL, but Google Data Studio supports connecting to PostgreSQL to pull data and build reports. I've also heard it supports a low-key export-to-.csv option. Is it then possible to exploit the Data Studio service in Google Apps Script to populate Google Sheets with that data, effectively creating a workaround?
All I need is one-way access from PostgreSQL into Google Sheets by means of Google Apps Script; I do NOT expect to import anything back into my database.
Looking at the reference documentation, the built-in Apps Script service for Data Studio does not allow you to pull data from a connected data source. It can be used to create connectors, but it does not allow direct access to connected data sources.
However, you can try creating a custom API or serverless micro-service in a language that supports PostgreSQL, and then expose that service as HTTP endpoints that you can call via UrlFetchApp. You can leverage Google Cloud Functions to do this and write the micro-service in back-end JavaScript (Node.js), Python, or Go. This approach will take you well outside the bounds of a typical Apps Script project, but it is a viable option.
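To make the shape of that workaround concrete, here is a minimal Node.js sketch. The endpoint URL, function name, and sample rows are all hypothetical; the only real APIs assumed are Sheets' `Range.setValues()` (which wants a 2-D array) and `UrlFetchApp.fetch()` on the Apps Script side, shown in comments.

```javascript
// Sketch: reshape the JSON rows a hypothetical Cloud Function endpoint
// (e.g. https://REGION-PROJECT.cloudfunctions.net/pg-export) returns from a
// PostgreSQL query into the 2-D array that Sheets' Range.setValues() expects.
function rowsToSheetValues(rows) {
  if (!rows.length) return [];
  const headers = Object.keys(rows[0]);           // column names from the first row
  const values = rows.map((row) => headers.map((h) => row[h]));
  return [headers, ...values];                    // first row is the header row
}

// Example payload such a micro-service might return for
// SELECT id, name FROM users (illustrative data only):
const sample = [
  { id: 1, name: 'Ada' },
  { id: 2, name: 'Grace' },
];
console.log(rowsToSheetValues(sample));

// On the Apps Script side, the caller would look roughly like:
//   const resp = UrlFetchApp.fetch(ENDPOINT_URL);        // hypothetical URL
//   const values = rowsToSheetValues(JSON.parse(resp.getContentText()));
//   sheet.getRange(1, 1, values.length, values[0].length).setValues(values);
```

Keeping the reshaping logic in a plain function like this means the same code can run in the Cloud Function or be pasted into the Apps Script project unchanged.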
We have applications running on Node.js and want to integrate their logs into our Activity Tracker with LogDNA instance. How can we implement this?
Activity Tracker is where IBM Cloud writes audit records of the actions it performs on your behalf.
From https://cloud.ibm.com/docs/services/Activity-Tracker-with-LogDNA?topic=logdnaat-getting-started#getting-started
IBM Cloud Activity Tracker with LogDNA collects and stores audit records for API calls made to resources that run in the IBM Cloud.
You cannot contribute arbitrary content to Activity Tracker.
You can, however, send your own application logs to LogDNA; see https://cloud.ibm.com/docs/services/Log-Analysis-with-LogDNA?topic=LogDNA-ingest
I am using a JBoss-based vault to secure sensitive data such as database credentials.
I use a Java-based HTTP REST client to create distributed Kafka connectors, but ran into a security concern: a request for a connector's "config" exposes the sensitive credentials in the response.
I referred to the official documentation but could not get much help in the context of the JBoss vault.
Any pointers or references that directly address this specific problem are very much appreciated.
Any references to alternative open-source (and free-to-use) vault-based solutions would also be of great help.
You'd have to write code that implements the ConfigProvider interface of the Connect API, mentioned there.
You can browse the Kafka source code on GitHub to see the existing FileConfigProvider, but that KIP (which references HashiCorp Vault) and the source files are the only such documentation for now.
Connect doesn't use JBoss either, so you'd have to find a way around that.
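For reference, this is roughly how the built-in FileConfigProvider is wired up; the file path and key names here are hypothetical examples, not values from your setup. In the worker config:

```properties
# Worker configuration: register a config provider named "file"
config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider
```

Then the connector config references keys in a local properties file instead of holding the literal secret, e.g. `"connection.password": "${file:/opt/secrets/db-credentials.properties:password}"` (path and key are placeholders). The REST API returns the unresolved `${file:...}` placeholder rather than the plain-text credential, which is what addresses the "config" endpoint exposure; a Vault-backed provider would follow the same pattern with a different provider class.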
Is there a system status page for Google Cloud Platform services? When experiencing issues where can we look for system status information?
For example, I changed the tier of a Google Cloud SQL instance and it was inaccessible for 12 hours. The next day the same operation took a couple of minutes, as expected.
I found https://code.google.com/status/appengine but nothing for other products (e.g. Google Cloud SQL).
The Cloud Status Dashboard is the canonical resource for Google Cloud Platform service status, and it does provide information for Cloud SQL.
There is an experimental status page at [1], but it is provided for test purposes only.
I suggest you subscribe to the Google Groups, as they are updated with issue reports; the Cloud SQL announce group is available at [2].
Regards
Paolo
Links:
[1] - https://status.cloud.google.com/
[2] - https://groups.google.com/forum/#!topic/google-cloud-sql-announce/