I generated an access key and secret key and put them into Bucket Explorer just as shown in their screenshots, but I get an error that the keys are not recognized on AWS, and it keeps trying to connect to AWS. I have the latest version for OS X.
Looking at the Bucket Explorer website, it sounds like you need a special version of their software -- however, the download link does not actually seem to have that version available. Have you contacted the developers of Bucket Explorer about it?
Alternatively, you can use the Google Cloud Console to upload and access your data in Google Cloud Storage without depending on a particular client application.
I'm trying to use spanner emulator and I followed the instructions reported here:
https://cloud.google.com/spanner/docs/emulator
to set it up.
I can use gcloud shell commands to create instance, database, tables, etc. and all works correctly.
But I am not able to use the REST API directly to access the emulator, even though the Google documentation says the emulator can be accessed not only through the Google client libraries but also through the REST API.
The first problem is that it is not clear whether I should use the base URL
https://spanner.googleapis.com/
or
http://localhost:9020/
When I try with
http://localhost:9020/v1/parent=projects/local-project/instanceConfigs
it always returns a "Not found" message, which means the REST API services are responding; yet through gcloud commands I can manage that instance and project!
What am I doing wrong?
According to the official documentation, Using the Cloud Spanner Emulator:
"The Cloud SDK provides a local, in-memory emulator, which you can
use to develop and test your applications for free without creating a
GCP Project or a billing account."
Therefore you should use localhost (localhost:9020 for REST requests).
You should use http://localhost:9020 to access the emulator if you want to manually access the REST API, so you were on the right track there.
However, the URL should be http://localhost:9020/v1/projects/test-project/instanceConfigs to list all instance configurations on the emulator, and http://localhost:9020/v1/projects/test-project/instances to list all instances.
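The URL shapes above can be sketched in code. This is a minimal sketch, assuming the emulator's default REST port (9020) and the project id "test-project" used above; the actual request is only shown in a comment so the snippet does not require a running emulator.

```python
# Base URL of the local Spanner emulator's REST endpoint (default port 9020).
BASE = "http://localhost:9020"

def emulator_url(path: str) -> str:
    """Join a v1 API path onto the emulator's base URL."""
    return f"{BASE}/v1/{path.lstrip('/')}"

# List instance configurations / instances for a project:
configs_url = emulator_url("projects/test-project/instanceConfigs")
instances_url = emulator_url("projects/test-project/instances")
print(configs_url)

# With the emulator running, the call could be made with the standard library:
#   import urllib.request
#   body = urllib.request.urlopen(configs_url).read()
```

Note that the path segment is simply `projects/<project-id>/...`, with no `parent=` prefix; that prefix in the original request is what produced the "Not found" response.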
Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
We have a GCS bucket with about 20 config files that we edit for various reasons. We would really just like to load the bucket into VS Code and then browse between updating the files and saving edits.
I have tried the vscode-bucket-explorer extension for VS Code but this just seems to provide viewing capability with no editing/saving capability. Unless I am missing something?
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
Is there a way to directly load / edit / save files to a given bucket
in Google Cloud Storage without having to download the file, edit it, and then upload it again?
No, objects in Google Cloud Storage cannot be edited in place.
As with buckets, existing objects cannot be directly renamed. Instead,
you can copy an object, give the copied version the desired name, and
delete the original version of the object. See Renaming an object for
a step-by-step guide, including instructions for tools like gsutil and
the Google Cloud Console, which handle the renaming process
automatically.
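Because in-place edits are not supported, any "edit" is necessarily a download, modify, re-upload cycle. A minimal sketch of that pattern follows; the download/upload callables and the object name here are in-memory stand-ins, not real API calls -- in practice they would wrap something like the google-cloud-storage client's blob.download_as_bytes() and blob.upload_from_string().

```python
# In-memory stand-in for a bucket, keyed by object name.
fake_bucket = {"app.conf": b"debug=false\n"}

def download(name: str) -> bytes:
    """Stand-in for fetching an object's contents."""
    return fake_bucket[name]

def upload(name: str, data: bytes) -> None:
    """Stand-in for overwriting an object with new contents."""
    fake_bucket[name] = data

def edit_object(name: str, transform) -> None:
    """Download an object, apply a transform, and re-upload the result."""
    upload(name, transform(download(name)))

# Example edit: flip a config flag.
edit_object("app.conf", lambda data: data.replace(b"false", b"true"))
print(fake_bucket["app.conf"])
```

The important point is that the whole object is rewritten on every save, which is why tools that appear to "edit" a bucket file are doing exactly this cycle under the hood.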
Is there a way to mount a bucket as a drive on a Mac? With read/write
ability?
You can use Cloud Storage FUSE, where the mounted bucket behaves similarly to a persistent disk.
Cloud Storage FUSE is an open source FUSE adapter that allows you to
mount Cloud Storage buckets as file systems on Linux or macOS systems.
It also provides a way for applications to upload and download Cloud
Storage objects using standard file system semantics. Cloud Storage
FUSE can be run anywhere with connectivity to Cloud Storage, including
Google Compute Engine VMs or on-premises systems.
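The mount itself is a single command. The sketch below only assembles the command lines (the bucket name and mount point are hypothetical), so it runs without gcsfuse installed; in practice you would pass the lists to subprocess.run.

```python
import subprocess  # not invoked here; shown for the real call at the bottom

def mount_cmd(bucket: str, mountpoint: str) -> list[str]:
    """gcsfuse mounts the bucket read/write at the given directory."""
    return ["gcsfuse", bucket, mountpoint]

def unmount_cmd(mountpoint: str) -> list[str]:
    """macOS unmounts with umount; on Linux it would be fusermount -u."""
    return ["umount", mountpoint]

print(" ".join(mount_cmd("my-config-bucket", "/Users/me/gcs")))

# To actually mount (gcsfuse must be installed and authenticated):
#   subprocess.run(mount_cmd("my-config-bucket", "/Users/me/gcs"), check=True)
```

Once mounted, the VS Code workflow from the question works directly: open the mount point as a folder and save files as usual, with each save re-uploading the object.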
I have a golang program running on my local laptop. I had a previous Google Cloud account that I used to upload images to a bucket (using gcloud local context). It worked.
I created another company branded Google Cloud account and linked our company card to it.
Ever since then I get the error in the title.
I contacted support and got this:
Hi,
Unfortunately we are not able to identify any abuse related actions taken on your project. To resolve this issue, please reach out to the Google Cloud Platform community support.
Sincerely,
Google Cloud Platform/API Trust & Safety Team
Do you guys have any ideas? Any help would be much appreciated.
The problem turned out to be that the JSON key file (JWT) my project used for authentication was out of date.
I had to re-download a new JSON key file from the Cloud Console and put it in the root of my project folder.
I have an old GAE application (in production since 2011) that has used the Cloud Storage service since it became available in beta. I have not touched this app for almost a year now.
I have to do some administrative task and want to create new buckets in Cloud Storage.
I have activated
- the application in Google Cloud Console
- the billing for this application in Google Cloud Console
I see the Google Cloud Storage tab, but when I click on it, it is empty, and when I try to create a new bucket I see the error message:
The account for the specified project has been disabled.
How can I fix that?
Why do I not see my existing bucket (created a long time ago using the old web interface)?
Thanks!
This can happen when the Cloud Storage service isn't turned on for your project. Do the following:
Visit http://cloud.google.com/console
Select your project
Visit the APIs & Auth tab
Find Google Cloud Storage in the list of services
Turn on Google Cloud Storage
I was facing the same issue. While every API was enabled, after digging deeper I realized the problem was related to my billing.
I installed CKAN from source and am trying to activate the Cloud Filestore option, without success.
I double-checked my Google API console and activated interoperable access keys (GOOG...), to no avail. I keep getting "Unable to Upload File" when I try to upload.
I couldn't get it to work, so I just switched to S3, which was more straightforward. The recent S3 price drops helped too :)