Google Cloud Storage - Delete file?

I can see a sample of uploading a file to Google Cloud Storage. However, I can't find a sample of deleting a file from the storage. Does an API for deleting a file exist?

Here's the delete API documentation for the JSON API: https://cloud.google.com/storage/docs/json_api/v1/objects/delete

from google.cloud import storage
import logging

log = logging.getLogger(__name__)

def delete_blob(bucket_name, blob_name):
    """Deletes a blob from the bucket."""
    try:
        storage_client = storage.Client()
        bucket = storage_client.get_bucket(bucket_name)
        blob = bucket.blob(blob_name)
        blob.delete()
        log.info('Blob {} deleted.'.format(blob_name))
    except Exception as e:
        log.error('Could not delete blob {}: {}'.format(blob_name, e))
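Usage is then a single call; the bucket and object names here are placeholders:

delete_blob('my-bucket', 'path/to/file.txt')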

How to create a bucket using the Python SDK?

I'm trying to create a bucket in IBM Cloud Object Storage using Python. I have followed the instructions in the API docs.
This is the code I'm using:
import ibm_boto3
from ibm_botocore.client import Config

COS_ENDPOINT = "https://control.cloud-object-storage.cloud.ibm.com/v2/endpoints"

# Create client
cos = ibm_boto3.client("s3",
    ibm_api_key_id=COS_API_KEY_ID,
    ibm_service_instance_id=COS_INSTANCE_CRN,
    config=Config(signature_version="oauth"),
    endpoint_url=COS_ENDPOINT
)
s3 = ibm_boto3.resource('s3')

def create_bucket(bucket_name):
    print("Creating new bucket: {0}".format(bucket_name))
    s3.Bucket(bucket_name).create()
    return

bucket_name = 'test_bucket_442332'
create_bucket(bucket_name)
I'm getting the error below. I tried setting CreateBucketConfiguration={"LocationConstraint": "us-south"}, but it doesn't seem to work.
"ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to."
Resolved by going to https://cloud.ibm.com/docs/cloud-object-storage?topic=cloud-object-storage-endpoints#endpoints and choosing the endpoint specific to the region I need. The "Endpoint" provided with the credentials is not the actual endpoint.
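For reference, a minimal sketch of the fix, assuming the us-south regional endpoint, reusing the COS_API_KEY_ID and COS_INSTANCE_CRN values from the question, and building the resource with the same credentials as the client (the original code built it with none):

import ibm_boto3
from ibm_botocore.client import Config

# Region-specific data endpoint, not the /v2/endpoints listing URL
COS_ENDPOINT = "https://s3.us-south.cloud-object-storage.appdomain.cloud"

cos = ibm_boto3.resource("s3",
    ibm_api_key_id=COS_API_KEY_ID,
    ibm_service_instance_id=COS_INSTANCE_CRN,
    config=Config(signature_version="oauth"),
    endpoint_url=COS_ENDPOINT
)

# Bucket names with underscores are typically rejected; use hyphens instead
cos.Bucket('test-bucket-442332').create()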

Deleted a file, now can't create a file of the same name

I created a file with the following Node.js code:

const {Storage} = require('@google-cloud/storage')
var gcs = new Storage()
var bucket = gcs.bucket('bucket-name')
const file = bucket.file('filename')
// fileData is a utf8 buffer
file.save(fileData, function(err) {
    console.log('Error:' + err)
})
Then, I went in through the Cloud Console and deleted the file.
I then ran the code above again, but received the error "[service account] does not have storage.objects.delete access to bucket-name/filename." So I went in and added storage.objects.delete access to the service account through IAM, but I continue to get the error.
It seems that the object is still sitting inside the bucket, and it still has the old service account access (without storage.objects.delete), but I don't see the object anywhere. Versioning is suspended on this bucket.
I have since gone through the same steps with the same bucket but using a different filename and don't see the error message. This seems to show that the new service account access is being properly applied to new files, but not to old files. This is surprising, since I'm using "Bucket Policy Only" on this bucket.
Can anyone figure out how to fix this? Thanks!
Using Cloud Storage object metadata:
// 1. Delete the file with the given name.
await bucket
    .file(filePath)
    .delete({ ignoreNotFound: true });

// 2. Save a file with the same name.
const blob = bucket.file(filePath);
await blob.save(fil?.buffer);

// 3. Read the object metadata of the new file.
const [metadata] = await storage
    .bucket(bucketName)
    .file(filePath)
    .getMetadata();
newDocObj.location = metadata.mediaLink;
I used metadata.mediaLink to get the latest download link of the uploaded file from Google Cloud Storage.

Google Cloud authorization keeps failing with Python 3 - Type is None, expected one of ('authorized_user', 'service_account')

I am trying to download a file for the first time from Google Cloud Storage.
I set the path to the googstruct.json service account key file that I downloaded from https://cloud.google.com/storage/docs/reference/libraries#client-libraries-usage-python
Do I need to set up authorization to Google Cloud outside the code somehow? Or is there a better "How to use Google Cloud Storage" guide than the one on the Google site?
It seems like I am passing the wrong type to storage_client = storage.Client(); the exception string is below.
Exception has occurred: google.auth.exceptions.DefaultCredentialsError
The file C:\Users\Cary\Documents\Programming\Python\QGIS\GoogleCloud\googstruct.json does not have a valid type.
Type is None, expected one of ('authorized_user', 'service_account').
My Python 3.7 code:
from google.cloud import storage
import os

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "C:\\GoogleCloud\\googstruct.json"

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    print('Blob {} downloaded to {}.'.format(
        source_blob_name,
        destination_file_name
    ))

# Instantiates a client
storage_client = storage.Client()
bucket_name = 'structure_ssi'
destination_file_name = "C:\\Users\\18809_PIPEM.shp"
source_blob_name = '18809_PIPEM.shp'
download_blob(bucket_name, source_blob_name, destination_file_name)
I did look at this, but I cannot tell if it is my issue; I have tried both.
('Unexpected credentials type', None, 'Expected', 'service_account') with oauth2client (Python)
This error means that the JSON service account credentials that you are trying to use (C:\\GoogleCloud\\googstruct.json) are corrupt or the wrong type.
The first (or second) line in the file googstruct.json should be "type": "service_account".
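A quick way to verify, as a minimal sketch (assuming the file is readable JSON):

import json

# A valid service account key file must contain "type": "service_account"
with open('C:/GoogleCloud/googstruct.json') as f:
    info = json.load(f)
print(info.get('type'))  # prints None or 'authorized_user' if the wrong file was downloaded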
Another few items to improve your code:
You do not need to use \\; just use / to make your code easier and cleaner to read.
Load your credentials directly and do not modify environment variables:
storage_client = storage.Client.from_service_account_json('C:/GoogleCloud/googstruct.json')
Wrap API calls in try / except. Stack traces do not impress customers. It is better to have clear, simple, easy to read error messages.
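For example, a minimal sketch wrapping the download from the question (bucket and file names reused from the question's code):

from google.cloud import storage
from google.api_core import exceptions

try:
    storage_client = storage.Client.from_service_account_json('C:/GoogleCloud/googstruct.json')
    bucket = storage_client.get_bucket('structure_ssi')
    bucket.blob('18809_PIPEM.shp').download_to_filename('C:/Users/18809_PIPEM.shp')
except exceptions.NotFound:
    print('Bucket or object not found.')
except Exception as e:
    print('Download failed: {}'.format(e))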

Metadata on Minio object storage

I want to add metadata to a Minio object while adding the file as an object to Minio object storage using Python. I can find examples of accessing the metadata of an object stored on Minio, but there is no example of adding metadata while adding a file to Minio storage.
Well, there is an example in the python minio client tests:
content_type = 'application/octet-stream'
metadata = {'x-amz-meta-testing': 'value'}
client.put_object(bucket_name,
                  object_name + '-metadata',
                  MB_11_reader,
                  MB_11,
                  content_type,
                  metadata)
The trick is that the metadata dict should have keys in the format 'x-amz-meta-yourkey'.
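To read the metadata back, a minimal sketch assuming the same client and object as above (stat_object is part of the minio client API):

# Custom keys come back prefixed with x-amz-meta-
stat = client.stat_object(bucket_name, object_name + '-metadata')
print(stat.metadata)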
You can use pyminio:
from pyminio import Pyminio

pyminio_client = Pyminio.from_credentials(
    endpoint='<your-minio-endpoint>',  # e.g. "localhost:9000/"
    access_key='<your-minio-access-key>',
    secret_key='<your-minio-secret-key>'
)

metadata = {'Pyminio-is': 'Awesome'}
pyminio_client.put_file(to_path='/foo/bar/baz', file_path='/mnt/some_file', metadata=metadata)
It automatically strips the 'x-amz-meta-' prefix from the key names, so it is easier to use with pyminio_client.get('/foo/bar/baz').

Unable to download Google Cloud Storage file from the trainer application in Cloud ML Engine

I'm trying to download a file from Cloud Storage from my trainer application which runs in Cloud ML engine. However I'm getting the following error when I try to download the file.
I do have access to the Cloud Storage path.
Error:
blob.download_to_filename(destination_file_name)
  File "/root/.local/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 482, in download_to_filename
    self.download_to_file(file_obj, client=client)
  File "/root/.local/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 464, in download_to_file
    self._do_download(transport, file_obj, download_url, headers)
  File "/root/.local/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 418, in _do_download
    download.consume(transport)
  File "/root/.local/lib/python2.7/site-packages/google/resumable_media/requests/download.py", line 101, in consume
    self._write_to_stream(result)
  File "/root/.local/lib/python2.7/site-packages/google/resumable_media/requests/download.py", line 62, in _write_to_stream
    with response:
AttributeError: __exit__
Here is the code for downloading the GCS file:
storage_client = storage.Client()
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob(key)
blob.download_to_filename(destination_file_name)
I'm not providing any GCP credentials to the Client, since the trainer application can already access other files using tf.train.string_input_producer.
Any help would be appreciated.
The trainer application accesses other files via TensorFlow's file_io module. This post has a few tips, but if you want to open a file:
from tensorflow.python.lib.io import file_io

with file_io.FileIO("gs://my_bucket/myfile", mode='r') as f:
    f.read()
There is also a copy(..) function and read_file_to_string(...), if those fit your needs better.
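For instance, a minimal sketch of those two helpers (the gs:// paths are placeholders):

from tensorflow.python.lib.io import file_io

# Read an entire object into memory as a string
contents = file_io.read_file_to_string("gs://my_bucket/myfile")

# Copy an object to a local path (or to another gs:// path)
file_io.copy("gs://my_bucket/myfile", "/tmp/myfile", overwrite=True)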