gcloud logging with regular expression

I'm trying to use gcloud logging along with regular expression. My query works in the console but I can't get it going via the CLI.
gcloud logging read "resource.type=gce_instance AND protoPayload.authenticationInfo.principalEmail=~'.*@example.com.au'" --limit=10 --format=json
I get the error:
ERROR: (gcloud.logging.read) INVALID_ARGUMENT: Unparseable filter: unrecognized node at token 'MEMBER'
I've tried various combinations of quoting and escaping (single quotes, double quotes, escaped quotes), with no luck.
I have the same trouble with timestamp filters:
gcloud logging read "resource.type=gce_instance AND timestamp > '2021-06-15T00:00:00.000000Z'"
I get the error:
ERROR: (gcloud.logging.read) INVALID_ARGUMENT: Unparseable filter: syntax error at line 1, column 112, token ':';

Your first gcloud expression should look like this:
gcloud logging read "resource.type=gce_instance AND protoPayload.authenticationInfo.principalEmail:'.*@example.com.au'"
I changed the =~ operator to :.
And the second one like this:
gcloud logging read 'resource.type=gce_instance AND timestamp > "2021-08-15T00:00:00.000000Z"'
I swapped the single and double quotes (the outer quotes are now single and the timestamp is double-quoted).
It's best to have a quick look at the gcloud logging read command documentation (that's how I figured out the proper syntax).
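If you want an actual regex match rather than the substring match that : performs, the Logging query language also has a =~ operator. A variant that should work, assuming a bash-like shell (single outer quotes so the double-quoted regex reaches the filter intact):
gcloud logging read 'resource.type=gce_instance AND protoPayload.authenticationInfo.principalEmail=~".*@example.com.au"' --limit=10 --format=json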

Related

`--trigger-resource' description error for cloud firestore trigger

I am trying an example of Cloud Functions trigger based on Cloud Firestore. While deploying the function using gcloud, I am getting this error:
gcloud functions deploy hello_firestore --runtime python32 --trigger-event providers/cloud.firestore/eventTypes/document.update --trigger-resource projects/patch-us/databases/(default)/documents/books/{booksid}
bash: syntax error near unexpected token `('
Can someone point out what's wrong with the command line?
It was a very stupid mistake.
gcloud functions deploy hello_firestore --runtime python32 --trigger-event providers/cloud.firestore/eventTypes/document.update --trigger-resource "projects/patch-us/databases/(default)/documents/books/{booksid}"
The path needs to be within inverted commas.
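For the record, an unquoted ( starts a subshell in bash, which is exactly the syntax error reported above; single quotes work just as well as double quotes here:
gcloud functions deploy hello_firestore --runtime python32 --trigger-event providers/cloud.firestore/eventTypes/document.update --trigger-resource 'projects/patch-us/databases/(default)/documents/books/{booksid}'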

"flag provided but not defined" when trying to start google cloud sql proxy

./cloud_sql_proxy -instances=my-instance=tcp:3306 -credential-file "cloud-sql-key.json"
Running this command produces the following error:
flag provided but not defined: -credential-file
Not sure why because I do provide the flag.
The flag is -credential_file (underscore, not dash).
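Go's flag parser treats any unknown flag as fatal, hence the "flag provided but not defined" message. With the corrected spelling the command becomes:
./cloud_sql_proxy -instances=my-instance=tcp:3306 -credential_file "cloud-sql-key.json"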

MongoDB Charts Error while testing connection

I am trying to deploy MongoDB Charts in containers. While testing the connection I hit an error.
I tried searching the web but did not find the right fix.
docker run --rm quay.io/mongodb/charts:19.06.1 charts-cli test-connection mongodb://username:password@mon009.abc.com:7041,mon001.abc.com:7041/pub_mongo?replicaSet=mongo7041
I get the error below. Any clue?
Unable to connect to MongoDB using the specified URI.
The following error was returned while attempting to connect:
MongoParseError: Incomplete key value pair for option
I can't repro the problem with the supplied URI. The only thing I can think of is that the redacted password may contain a special character such as ?, which would prevent the URI from being parsed. Make sure you are URL-encoding any special characters in the password.
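For example, if the password were p?ss@word (a hypothetical value), the ? and @ would need to be percent-encoded as %3F and %40; quoting the whole URI also stops the shell from interpreting the ? character:
docker run --rm quay.io/mongodb/charts:19.06.1 charts-cli test-connection 'mongodb://username:p%3Fss%40word@mon009.abc.com:7041,mon001.abc.com:7041/pub_mongo?replicaSet=mongo7041'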

gcloud Export to Google Storage Bucket from Cloud SQL instance

Running this command:
gcloud sql instances export myinstance gs://my_bucket_name/filename.csv -d "mydatabase" -t "mytable"
It gives me the following error:
ERROR: (gcloud.sql.instances.import) ERROR_RDBMS
I have manually run uploads to the bucket from the console, which go fine, and I am able to log in to the SQL instance and run queries, which makes me think there are no permission issues. Has anybody seen this type of error and knows a way around it?
Note: I have googled for possible causes, and most of them point to either SQL or bucket permission issues.
Never mind. I figured out that I need to make an OAuth connection to the instance (using the JSON token generated from the gcloud APIs/credentials section) before interacting with it.
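A minimal sketch of that flow, assuming a service-account key file named sql-key.json (hypothetical name) with the appropriate Cloud SQL and Storage permissions:
gcloud auth activate-service-account --key-file=sql-key.json
gcloud sql instances export myinstance gs://my_bucket_name/filename.csv -d "mydatabase" -t "mytable"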

Unexpected value for default_scope

I have set up an entire environment using gcloud compute (with great success) and am now trying to script the updating of an image in an instance template. The first step is to switch off auto-delete on the instance I wish to use as the base, but I cannot get it to work without the following error:
$ gcloud compute --project testing-141313 instances set-disk-auto-delete mantle-test-robot-dtrc --zone europe-west1-b --device-name /dev/sda --no-auto-delete
ERROR: (gcloud.compute.instances.set-disk-auto-delete) Unexpected
value for default_scope ScopeEnum.GLOBAL, expected None or ZONE