'--trigger-resource' description error for Cloud Firestore trigger - google-cloud-firestore

I am trying an example of a Cloud Functions trigger based on Cloud Firestore. While deploying the function using gcloud, I am getting this error:
gcloud functions deploy hello_firestore --runtime python32 --trigger-event providers/cloud.firestore/eventTypes/document.update --trigger-resource projects/patch-us/databases/(default)/documents/books/{booksid}
bash: syntax error near unexpected token `('
Can someone point out what's wrong with the command line?

It was a very stupid mistake.
gcloud functions deploy hello_firestore --runtime python32 --trigger-event providers/cloud.firestore/eventTypes/document.update --trigger-resource "projects/patch-us/databases/(default)/documents/books/{booksid}"
The path needs to be enclosed in quotes; otherwise bash interprets the parentheses in (default) as shell syntax.
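As an alternative to quoting the whole path, escaping just the characters bash treats specially should also work; this is only a sketch assuming a bash shell, and is otherwise identical to the quoted command above:
gcloud functions deploy hello_firestore --runtime python32 --trigger-event providers/cloud.firestore/eventTypes/document.update --trigger-resource projects/patch-us/databases/\(default\)/documents/books/{booksid}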

Related

gcloud logging with regular expression

I'm trying to use gcloud logging with a regular expression. My query works in the console, but I can't get it working via the CLI.
gcloud logging read "resource.type=gce_instance AND protoPayload.authenticationInfo.principalEmail=~'.*#example.com.au'" --limit=10 --format=json
I get the error:
ERROR: (gcloud.logging.read) INVALID_ARGUMENT: Unparseable filter: unrecognized node at token 'MEMBER'
I've tried various quoting styles: single quotes, double quotes, and escaped quotes.
I have the same trouble with timestamp filters as well:
gcloud logging read "resource.type=gce_instance AND timestamp > '2021-06-15T00:00:00.000000Z'"
I get the error:
ERROR: (gcloud.logging.read) INVALID_ARGUMENT: Unparseable filter: syntax error at line 1, column 112, token ':';
Your first gcloud expression should look like this:
gcloud logging read "resource.type=gce_instance AND protoPayload.authenticationInfo.principalEmail:'.*#example.com.au'"
I changed the =~ operator to :.
And the second one like this:
gcloud logging read 'resource.type=gce_instance AND timestamp > "2021-08-15T00:00:00.000000Z"'
I swapped the single and double quotes.
It's best to have a quick look at the gcloud logging read command documentation (that's how I figured out the proper syntax).
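For reference, the two fixes can be combined into one filter; this is only a sketch based on the examples above (the domain and timestamp are placeholder values from the question), and since : is a substring match the bare domain is enough here:
gcloud logging read 'resource.type=gce_instance AND protoPayload.authenticationInfo.principalEmail:"example.com.au" AND timestamp > "2021-08-15T00:00:00.000000Z"' --limit=10 --format=json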

Gcloud dataflow job failed to write to temp location

I am invoking a Dataflow job using the gcloud CLI. My command looks like this:
gcloud dataflow jobs run avrojob4 \
--gcs-location=gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro \
--region=europe-west1 \
--parameters bigtableProjectId="project-id",bigtableInstanceId="instance-id",bigtableTableId="table-id",outputDirectory="gs://avro-data/avrojob4/",filenamePrefix="avrojob4-"
and I get this error:
ERROR: Failed to write a file to temp location 'gs://dataflow-staging-us-central1-473832897378/temp/'. Please make sure that the bucket for this directory exists, and that the project under which the workflow is running has the necessary permissions to write to it.
Can someone help me pass a specific temp location through the above command?
There is no --temp-location flag for this command:
https://cloud.google.com/sdk/gcloud/reference/dataflow/jobs/run
I suspect you're attempting to solve the issue by inventing that flag, but, as you've seen, this does not work.
Does the bucket exist?
Does the Dataflow service account have suitable permissions to write to it?
Can you gsutil ls gs://dataflow-staging-us-central1-473832897378?
If yes, then it's likely that the Dataflow service account does not have permission to write to the bucket. Please review the instructions in the following link for adding the correct permissions for the Dataflow (!) service account:
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_cloud_storage_buckets_across_google_cloud_platform_projects
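If the bucket exists but writes still fail, a minimal sketch of granting the Dataflow service agent write access with gsutil, assuming the project number 473832897378 is the one embedded in the bucket name:
gsutil iam ch serviceAccount:service-473832897378@dataflow-service-producer-prod.iam.gserviceaccount.com:roles/storage.objectAdmin gs://dataflow-staging-us-central1-473832897378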

"flag provided but not defined" when trying to start google cloud sql proxy

./cloud_sql_proxy -instances=my-instance=tcp:3306 -credential-file "cloud-sql-key.json"
Running this command produces the following error:
flag provided but not defined: -credential-file
Not sure why because I do provide the flag.
The flag is -credential_file (underscore, not dash).
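For reference, the corrected invocation looks like this (assuming my-instance stands in for the full PROJECT:REGION:INSTANCE connection name the proxy expects):
./cloud_sql_proxy -instances=my-instance=tcp:3306 -credential_file=cloud-sql-key.json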

gcloud Export to Google Storage Bucket from Cloud SQL instance

Running this command:
gcloud sql instances export myinstance gs://my_bucket_name/filename.csv -d "mydatabase" -t "mytable"
It gives me the following error:
ERROR: (gcloud.sql.instances.import) ERROR_RDBMS
I have manually run uploads to the bucket from the console, and they work fine. I am also able to log in to the SQL instance and run queries, which makes me think there are no permission issues. Has anybody seen this type of error and found a way around it?
Note: I have googled for possible causes, and most of them point to either SQL or bucket permission issues.
Never mind. I figured out that I needed to make an OAuth connection to the instance (using the JSON key generated from the gcloud APIs & credentials section) before interacting with it.
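For anyone hitting the same thing, one way to authenticate gcloud with a service-account JSON key before running the export is sketched below; key.json is a hypothetical path to the key file mentioned above:
gcloud auth activate-service-account --key-file=key.json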

Unexpected value for default_scope

I have set up an entire environment using gcloud compute (with great success) and am now trying to script the updating of an image in an instance template. The first step is to switch off auto-delete on the instance I wish to use as the base. I cannot get it to work without the following error:
$ gcloud compute --project testing-141313 instances set-disk-auto-delete mantle-test-robot-dtrc --zone europe-west1-b --device-name /dev/sda --no-auto-delete
ERROR: (gcloud.compute.instances.set-disk-auto-delete) Unexpected
value for default_scope ScopeEnum.GLOBAL, expected None or ZONE