Scenario: there are multiple folders and many files stored in a storage bucket related to the DCM API (click, impression, daily aggregate files, etc.).
https://console.cloud.google.com/storage/browser/dcmaccountno
Is it possible to download the files using the REST API? I currently have a service account and a private key.
We don't have much exposure to Google Cloud Storage, so any small help would be really appreciated.
Thank you for any help!
You can make calls to either of the two REST APIs: JSON or XML. In either case, you will need to get an OAuth 2.0 access token as detailed in the documentation and then use cURL with a GET Object request:
JSON API:
curl -X GET \
-H "Authorization: Bearer [OAUTH2_TOKEN]" \
-o "[SAVE_TO_LOCATION]" \
"https://www.googleapis.com/storage/v1/b/[BUCKET_NAME]/o/[OBJECT_NAME]?alt=media"
XML API:
curl -X GET \
-H "Authorization: Bearer [OAUTH2_TOKEN]" \
-o "[SAVE_TO_LOCATION]" \
"https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]"
Note that for multiple files, you will have to program the requests, so if you want to easily download all the objects in a bucket or subdirectory, it's better to use gsutil instead.
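For example, a rough sketch of pulling everything under a bucket or folder with gsutil (the bucket and folder names are placeholders):
gsutil -m cp -r gs://[BUCKET_NAME]/[FOLDER_NAME] [SAVE_TO_LOCATION]
The -m flag parallelizes the copies and -r recurses into subdirectories.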
Using the REST API, you can download/upload files from Google Cloud Storage in the way I already described in the answer linked below:
Reference: https://stackoverflow.com/a/53955058/4345389
Instead of the UploadData method of WebClient, you can use the DownloadData method in the following way:
// Requires: using System.Net; and using System.IO;
// Create a new WebClient instance.
using (WebClient client = new WebClient())
{
    client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
    client.Headers.Add(HttpRequestHeader.ContentType, "application/octet-stream");
    // Download the web resource and save it into a data buffer.
    byte[] bytes = client.DownloadData(body.SourceUrl);
    MemoryStream memoryStream = new MemoryStream(bytes);
    // TODO: write further functionality to consume the MemoryStream (e.g. save it to a file).
}
Tweak it as per your requirements.
Related
I am new to Cloud and I've been practising it for a while. I have a use case.
I want to retrieve the metadata of images in the bucket through a suitable REST API. I searched the API Explorer and found the Cloud Resource Manager API, which seemed like it could retrieve the metadata, but after exploring it I couldn't figure out the right link.
Can someone help me to understand what to put in the parent field?
You are using the wrong REST API: the Cloud Resource Manager API manages projects, not objects in Cloud Storage. Object metadata is exposed by the Cloud Storage JSON API.
The REST API endpoint is:
https://storage.googleapis.com/storage/v1/b/BUCKET_NAME/o/OBJECT_NAME
To view an object's metadata, here is an example using the REST API with curl:
gcloud auth application-default login
BUCKET=BUCKET_NAME
OBJECT=OBJECT_NAME
URL=https://storage.googleapis.com/storage/v1/b/$BUCKET/o/$OBJECT
TOKEN=$(gcloud auth application-default print-access-token)
curl -v -X GET -H "Authorization: Bearer $TOKEN" $URL
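If you want the metadata of every image rather than a single object, a sketch using the objects list method of the same JSON API (the prefix query parameter is optional and images/ is only an assumed folder name):
BUCKET=BUCKET_NAME
TOKEN=$(gcloud auth application-default print-access-token)
curl -X GET -H "Authorization: Bearer $TOKEN" \
"https://storage.googleapis.com/storage/v1/b/$BUCKET/o?prefix=images/"
Each returned item includes fields such as name, size, and contentType.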
I am trying to create an Azure data factory to copy data from an API to blob storage. The problem I'm encountering is getting the authorization to work. The API requires a token whose value we already have. The curl for the API is:
curl -X GET "https://zentracloud.com/api/v3/get_env_model_data/?device_sn=<value>&model_type=<value>&port_num=<value>
&inputs=<value>" -H "accept: application/json" -H "Authorization: Token <token>"
I've tried putting the Authorization into the auth headers of the Linked Service, and in the additional headers of the source of the Copy Data task.
When I click "Preview Data" I get an "invalid credentials" error, which tells me that either I'm not putting the authentication headers in the right place or my format is incorrect. I'm not sure how to get this to work.
I contacted Microsoft and was told that they don't support validation via a token.
Using API Gateway, I created an S3 bucket to copy an image (image/jpg) into. This website describes how I can upload images using Amazon's API Gateway: https://aws.amazon.com/premiumsupport/knowledge-center/api-gateway-upload-image-s3/.
When I enter the URL with the bucket and object name added, I get the following error:
{"message":"Missing Authentication Token"}
I would like to know, first of all, where can the API find my image? How can I verify that the image is a local file, as stated in the introduction? In addition, should I use the curl command to complete the transformation? I am unable to use Postman.
I have included a note from the link: how can I change the type of header in a PUT request?
What is the best way to copy files from the API gateway to S3 without using the Lambda function?
Assuming you set up your API and bucket following the instructions in the link, you upload a local file to your bucket using curl command as below.
curl -X PUT -H "Accept: image/jpeg" --data-binary "@my-local-file.jpg" https://my-api-id.execute-api.us-east-2.amazonaws.com/my-stage-name/my-bucket-name/my-local-file.jpg
Note that the header indicates it will accept jpeg files only. Depending on the file you are uploading to S3, you will change/add this header.
To answer the questions directly:
What is the best way to copy files from the API Gateway to S3 without using the Lambda function? - follow the steps in this link https://aws.amazon.com/premiumsupport/knowledge-center/api-gateway-upload-image-s3/
where can the API find my image? - your image is located on your local host/computer. You use the curl command to upload it via the API that you created
should I use the curl command to complete the transformation? - not sure what you meant by transformation, but you use the curl command to upload the image file to your S3 bucket via API Gateway
how can I change the type of header in a PUT request? - you use -H to add headers in your curl command, as shown in the example below
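For example, a sketch of the same PUT with a Content-Type header instead, for a PNG file (my-api-id, my-stage-name, my-bucket-name, and my-local-file.png are placeholders as before):
curl -X PUT -H "Content-Type: image/png" --data-binary "@my-local-file.png" https://my-api-id.execute-api.us-east-2.amazonaws.com/my-stage-name/my-bucket-name/my-local-file.png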
I want to trigger a Google Data Fusion pipeline with the following command:
curl -X POST -H "Authorization: Bearer ${AUTH_TOKEN}" "${CDAP_ENDPOINT}/v3/namespaces/namespace-id/apps/pipeline-name/workflows/DataPipelineWorkflow/start"
but I can't figure out one thing: what will the CDAP_ENDPOINT be here? Kindly help me out by telling me where I can find the CDAP_ENDPOINT.
Thanks
This is nicely explained in the GCP documentation: you can get the Data Fusion API endpoint with the following commands, invoked in Cloud Shell:
export INSTANCE_ID=your-data-fusion-instance-id
export CDAP_ENDPOINT=$(gcloud beta data-fusion instances describe \
--location=us-central1 \
--format="value(apiEndpoint)" \
${INSTANCE_ID})
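Once CDAP_ENDPOINT is exported, a sketch of the start call from your question (namespace-id and pipeline-name are the placeholders from your own command, and the token comes from gcloud):
AUTH_TOKEN=$(gcloud auth print-access-token)
curl -X POST -H "Authorization: Bearer ${AUTH_TOKEN}" \
"${CDAP_ENDPOINT}/v3/namespaces/namespace-id/apps/pipeline-name/workflows/DataPipelineWorkflow/start"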
Is it possible to wget / curl protected files from Google Cloud Storage without making them public? I don't mind a fixed predefined token. I just want to avoid the case where my public file gets leeched, costing me good dollars.
Another way, if as you say you don't mind getting a token externally, is to use curl to set the 'Authorization' header in your call to GCS like so:
curl -H "Authorization: Bearer 1/fFBGRNJru1FQd44AzqT3Zg" https://www.googleapis.com/storage/v1/b/bucket/o/object?alt=media
The 'alt=media' query string parameter is necessary to download the object data directly instead of receiving a JSON response.
You can obtain such a token by separately authorizing the Cloud Storage JSON API scope in the OAuth 2.0 Playground and copying and pasting it from there.
See also:
https://cloud.google.com/storage/docs/access-control
https://cloud.google.com/storage/docs/json_api/v1/objects/get
You can use Signed URLs. This allows you to create a signed URL that can be used to download an object without additional authentication.
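For example, a sketch with gsutil signurl, assuming a service account key saved as key.json and a 10-minute expiry:
gsutil signurl -d 10m key.json gs://bucket/object
The resulting URL can then be fetched with plain curl or wget until it expires.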
You could also use the CurlWget Chrome extension: whenever you download something in Chrome, it builds the corresponding URL with the headers etc. so that you can wget the file.