Google Cloud Platform Data Fusion instance triggering - google-cloud-data-fusion

I want to trigger a Google Data Fusion pipeline with the following command:
curl -X POST -H "Authorization: Bearer ${AUTH_TOKEN}" "${CDAP_ENDPOINT}/v3/namespaces/namespace-id/apps/pipeline-name/workflows/DataPipelineWorkflow/start"
but I can't figure out one thing: what should the CDAP_ENDPOINT be? Kindly help me out by telling me where I can find the CDAP endpoint.
Thanks

This is explained in the GCP documentation: you can get the Data Fusion API endpoint by running the following commands in Cloud Shell:
export INSTANCE_ID=your-data-fusion-instance-id
export CDAP_ENDPOINT=$(gcloud beta data-fusion instances describe \
--location=us-central1 \
--format="value(apiEndpoint)" \
${INSTANCE_ID})
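
With the endpoint exported, the start call from the question can be issued directly. A minimal sketch, reusing gcloud for the auth token (namespace-id and pipeline-name are the placeholders from the question; substitute your own values):
export AUTH_TOKEN=$(gcloud auth print-access-token)
curl -X POST -H "Authorization: Bearer ${AUTH_TOKEN}" \
  "${CDAP_ENDPOINT}/v3/namespaces/namespace-id/apps/pipeline-name/workflows/DataPipelineWorkflow/start"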

Related

Grafana 9.2.3 alert feature: How to export / import alerts as yml/json?

Is it possible to export all of the Grafana 9.2.3 alerts to a JSON or YAML file, like the other dashboards and datasources (and use it later in the provisioning process)?
I tried using the Grafana API, but it's not returning anything.
Greetings and thanks in advance,
Faizan
I tried curl -s -H "Authorization: Bearer api_token" -X GET "https://URI/api/alerts" and it's returning []
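
A likely reason for the empty response: /api/alerts only covers Grafana's legacy alerting, while in Grafana 9 the unified alert rules are exposed through the alerting provisioning API instead. A hedged sketch of listing them that way (assuming unified alerting is enabled and the token has a sufficiently privileged role):
curl -s -H "Authorization: Bearer api_token" \
  "https://URI/api/v1/provisioning/alert-rules"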

Retrieve private service endpoints using the ibmcloud cli

Most services on the IBM Cloud catalog now support "private endpoints" as described at https://cloud.ibm.com/docs/account?topic=account-service-endpoints-overview
I'm writing some automation for our application and I'd like to obtain the private endpoint for my database from a simple bash script.
I found that I can create a service key and invoke ibmcloud resource service-key (NAME | ID) to list the contents of that key (which includes the service's public endpoint).
Unfortunately, I can't seem to find any command for listing the corresponding private endpoint.
In this particular case I am using the databases-for-postgresql service, but I was hoping for a general way to do this that will work across service types.
I can get the private endpoint information from the UI, so I know the service instance has one.
If I can't get the private endpoint from an existing command, can I piggy-back on the CLI's session to invoke a curl command without messing with IAM?
For example, to invoke the curl command mentioned at the bottom of https://www.ibm.com/cloud/blog/introducing-private-service-endpoints-in-ibm-cloud-databases
curl -sS -XPOST "https://api.us-south.databases.cloud.ibm.com/v4/ibm/deployments/<deployment CRN>/users/admin/connections/private" \
-H "Authorization: Bearer <IBM API TOKEN>"
The IBM Cloud CLI Cloud Databases plug-in provides this capability. Details about the deployment-connections command can be found here: https://cloud.ibm.com/docs/databases-cli-plugin?topic=databases-cli-plugin-cdb-reference#connections
The syntax is roughly: ibmcloud cdb deployment-connections [the-database] -e private
If you have not already installed this plug-in, you can do so with this command: ibmcloud plugin install cdb
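
On the piggy-backing question: the CLI can print the IAM token for its current session, which can then be reused as the bearer token for the blog's curl call. A rough sketch, assuming the default English output of ibmcloud iam oauth-tokens (where the token is the fourth whitespace-separated field):
# Reuse the CLI session's IAM token instead of minting one yourself
export IBM_TOKEN=$(ibmcloud iam oauth-tokens | awk '{print $4}')
curl -sS -X POST \
  "https://api.us-south.databases.cloud.ibm.com/v4/ibm/deployments/<deployment CRN>/users/admin/connections/private" \
  -H "Authorization: Bearer ${IBM_TOKEN}"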

What REST API load_source and params do I use to load CSV data from IBM COS S3 to IBM Db2 on Cloud

I have been unable to use the Db2 on Cloud REST API to load data from a file in IBM Cloud Object Storage (COS). This is preventing a hybrid integration POC.
Another user has reported a similar REST API issue using the SERVER configuration; see the IBM Developer thread at https://developer.ibm.com/answers/questions/526660/how-to-use-db2-on-cloud-rest-api-to-load-data-from.html
I cannot seem to get the parameters right, and I think the docs have errors in them for current Cloud Object Storage with HMAC keys ... such as which endpoint to use, and whether auth_id should be the access_key_id.
I've tried a variety of data load commands, like the following, but none work. Can someone provide an example of a command that works (with any considerations/explanations for values)?
curl -H "x-amz-date: 20200112T120000Z" -H "Content-Type: application/json"
-H "Authorization: Bearer <auth_token>"
-X POST "https://dashdb-xxxx.services.eu-gb.bluemix.net:8443/dbapi/v3/load_jobs"
-d '{"load_source": "SOFTLAYER", "schema": "MDW84075",
"table": "SALES", "file_options":
{"code_page": "1208", "column_delimiter": ",",
"string_delimiter": "", "date_format": "YYYY-MM-DD", "time_format":
"HH:MM:SS", "timestamp_format": "YYYY-MM-DD HH:MM:SS",
"cde_analyze_frequency": 0 }, "cloud_source":
{"endpoint": "https://s3-api.us-geo.objectstorage.softlayer.net/auth/v2.0",
"path": "<bucket>/sales_data_test.csv", "auth_id": "<access_key_id>",
"auth_secret": "<secret_access_key>"} }'
Different attempts with the API call fail with a variety of messages, which usually do not have enough information to debug (and searches in the docs/web do not find the messages); e.g.:
{"trace":"","errors":[{"code":"not_found", "message":"HWCBAS0030E: The
requested resource is not found in service admin.",
"target":{"type":"","name":""},"more_info":""}]}
P.S. I was able to use the Db2 on Cloud UI to load data from the file in COS S3, with the same access key values.
P.P.S. Perhaps "load_source": "SOFTLAYER" is the issue, but it is the only option that might map to IBM Cloud Object Storage. The API docs do not give any other option that might work with IBM COS S3.
If you use Db2 on Cloud with Cloud Object Storage through the REST API, then for LOAD the load_source type should be S3. Both Amazon S3 and IBM COS use the S3 protocol. Softlayer previously had its own SWIFT protocol, but it is no longer available for IBM COS.
Also see here for some docs on loading data using LOAD; the examples use Amazon and IBM COS, both with the S3 protocol.
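
Putting that together, the failing request above would change load_source to S3 and drop the SWIFT-style /auth/v2.0 path from the endpoint. A hedged sketch only, keeping the question's placeholders (whether the endpoint needs the https:// scheme may depend on the service version):
curl -X POST "https://dashdb-xxxx.services.eu-gb.bluemix.net:8443/dbapi/v3/load_jobs" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <auth_token>" \
  -d '{"load_source": "S3", "schema": "MDW84075", "table": "SALES",
       "file_options": {"code_page": "1208", "column_delimiter": ","},
       "cloud_source": {"endpoint": "s3-api.us-geo.objectstorage.softlayer.net",
                        "path": "<bucket>/sales_data_test.csv",
                        "auth_id": "<access_key_id>",
                        "auth_secret": "<secret_access_key>"} }'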

Send REST request to get data obtained by Postman

I'm in the process of getting some data from Salesforce in order to store it in GCP. It seems that there doesn't exist any tool that directly connects both sides, so the way I'm doing it is by using Postman to send a REST request to Salesforce and thereby get the data. Now, my question is how I should proceed in order to store that data in Cloud Storage or BigQuery, as I can't find a way to create a channel between GCP and Postman (if that is the right thing to do). Any advice would be much appreciated.
I think it would be best to at least code a prototype or a Python script for doing this. You could use curl to hit the Salesforce API, push the response to a local file, and then use the cloud tools CLI (see the example from the docs) to send it to Cloud Storage, bearing in mind that the results from the API call to Salesforce would be in raw JSON format. You can combine the different commands into a single bash script to make running it end to end repeatable once you have the individual commands working correctly; a combined sketch follows the two commands below.
curl https://***instance_name***.salesforce.com/services/data/v42.0/ -H "Authorization: Bearer access_token_from_auth_call" > response.txt
gsutil cp ./response.txt gs://your-gs-bucket
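
A minimal combined sketch of that end-to-end flow (the instance name, access token, and bucket are placeholders; set -e is just one sensible way to stop on failures):
#!/usr/bin/env bash
set -euo pipefail  # abort if either step fails

# Pull raw JSON from the Salesforce REST API
curl -s "https://<instance_name>.salesforce.com/services/data/v42.0/" \
  -H "Authorization: Bearer <access_token_from_auth_call>" > response.txt

# Copy the response file into the Cloud Storage bucket
gsutil cp ./response.txt gs://your-gs-bucket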

How to retrieve the APIKEY for a Compose database instance on Bluemix?

The Compose documentation describes how a backup can be initiated:
export APIKEY=your_apikey_here
export DEPLOYMENT=your_deployment_id
curl -X POST -H "Authorization: Bearer $APIKEY" -H "Content-Type: application/json" "https://api.compose.io/2016-07/deployments/${DEPLOYMENT}/backups"
I would like to execute this API call against a 'Standard' Compose database running on Bluemix (not Compose Enterprise).
The DEPLOYMENT_ID is available in the VCAP_SERVICES json, but I can't find the APIKEY - where can I find this?
The docs you refer to are specifically for the native Compose API, not for what is now the IBM Cloud Compose API.
The IBM Cloud Compose API was opened to the public in November 2017 - https://www.ibm.com/blogs/bluemix/2017/11/opening-the-compose-api-on-the-ibm-cloud/.
Consult https://www.compose.com/articles/the-ibm-cloud-compose-api/ for details on how to use it: where to retrieve the appropriate IBM Cloud API keys, which IBM Cloud endpoints to use, and which Compose API calls are available (spoiler: yes, you can trigger on-demand backups from there).
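
For the token half of that flow, an IBM Cloud platform API key is exchanged for a short-lived IAM bearer token, which then goes in the Authorization header of the Compose API calls described in those articles. A sketch of the token exchange only (IBMCLOUD_APIKEY is a placeholder for your platform API key, and jq is assumed to be installed):
# Exchange an IBM Cloud API key for an IAM bearer token
export IAM_TOKEN=$(curl -s -X POST "https://iam.cloud.ibm.com/identity/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=urn:ibm:params:oauth:grant-type:apikey&apikey=${IBMCLOUD_APIKEY}" \
  | jq -r '.access_token')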