google-cloud-sql - max_allowed_packet

I was moving a database to Google Cloud SQL; it previously had max_allowed_packet set to 20M.
Currently the Google Cloud SQL default for max_allowed_packet is 1M.
Is there any way to increase this variable to 20M? I have already tried the following:
set max_allowed_packet = 20971520;
Which returns:
Error Code: 1621. SESSION variable 'max_allowed_packet' is read-only. Use SET GLOBAL to assign the value
and then:
set global max_allowed_packet = 20971520;
This returns the error:
Error Code: 1227. Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Thank you in advance for your help!

To change max_allowed_packet on Google Cloud SQL, open your instance's overview page in the Cloud Console, click Edit, and look for the MySQL Flags section at the bottom of the page. max_allowed_packet is one of the flags you can set there. Set the value you want, and save/confirm.

You can now set it yourself by editing the instance in the Developer Console.
All the settable flags are documented here: https://cloud.google.com/sql/docs/mysql-flags

In my case I couldn't update the max_allowed_packet setting because I had the flag sql_mode=TRADITIONAL set, which expects the value to be a multiple of 1024.
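If you prefer the command line, the same flag can be set with gcloud. A minimal sketch, assuming an instance named my-instance (a placeholder); note that --database-flags replaces the complete set of flags on the instance, so include any other flags you rely on, and the instance restarts when flags change:

gcloud sql instances patch my-instance --database-flags=max_allowed_packet=20971520

Once it is applied, you can verify the value from a MySQL client with SHOW VARIABLES LIKE 'max_allowed_packet';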

Related

How to fix the "maxClauseCount is set to 1024" error in Gravitee.io

I have many APIs registered in Gravitee.io. I tried to add the following:
index.query.bool.max_clause_count: 10240
to the file elasticsearch.yml, but it didn't work. I don't know how to change it in Gravitee.
If you are trying to change the maximum clause count for queries in Elasticsearch, then the correct setting is the one below, as explained in the Search settings documentation:
indices.query.bool.max_clause_count
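A minimal sketch of the corrected elasticsearch.yml entry, using the value 10240 from the question; this is a static setting, so the Elasticsearch node needs a restart to pick it up:

# elasticsearch.yml
# Raise the limit on boolean query clauses from the default of 1024.
indices.query.bool.max_clause_count: 10240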

MongoDB Realm: environment value exists but is undefined inside Realm Function

I am referencing an environment value from a Realm function as instructed here: context.values.get("appTwilioNumber")
I confirmed appTwilioNumber exists in our project and that our project is assigned an environment.
Yet, when I call console.log('twilioNumberForEnv:', context.values.get("appTwilioNumber")); in our Realm function, I get twilioNumberForEnv: undefined.
EDIT 1: I have more info now--I logged out and logged back in (in case of multi-user sync issues), then exported my app from the Realm UI and the values folder is empty. Not sure if that is a separate bug, but updating in case this info is useful.
EDIT 2: the environment values are stored under environment, not under values. Edit 1 was a red herring. I do see appTwilioNumber in the exported app, but it still returns undefined in the Realm functions.
Wow... Mongo's documentation might be off.
In another function, I found this: context.environment.values.anotherEnvValue instead of context.values.get("appTwilioNumber"). So I updated my code to context.environment.values.appTwilioNumber, and it worked.
I did a CMD-f on both https://docs.mongodb.com/realm/values-and-secrets/define-environment-values/ and https://docs.mongodb.com/realm/values-and-secrets/access-a-value/ for ".environment", and it isn't on either page.
I'll follow up with Mongo, but please use context.environment.values.YOURENVVALUE instead of context.values.get("YOURENVVALUE") if you encounter this issue.
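A minimal sketch of the working function, assuming an environment value named appTwilioNumber as in the question:

exports = function() {
  // Environment values are read from context.environment.values,
  // not from context.values.get(), per the workaround above.
  const twilioNumber = context.environment.values.appTwilioNumber;
  console.log('twilioNumberForEnv:', twilioNumber);
  return twilioNumber;
};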

IBM Watson Assistant: How to retrieve input value from chosen dialog option?

I already followed these great instructions on how to dynamically create dialog node options from generic input and it's working like a charm. But I cannot see how to hand the chosen option value over to the next node for further processing. Is there any documentation on how to pass the chosen option value to the child node?
You can store any selected option and other information in context variables. They are passed around and can be accessed in other nodes. The information is available until you unset or delete the context variable.
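As a minimal sketch, assuming the chosen option's value comes back as the user input, the node presenting the options could capture it into a context variable via its JSON editor (the variable name selected_option is hypothetical):

{
  "context": {
    "selected_option": "<? input.text ?>"
  }
}

A child node can then reference it as $selected_option in its conditions or responses.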

Dataflow: set DataflowPipelineDebugOptions

My pipeline gives OOM errors constantly, so I read the following answer and tried to set --dumpHeapOnOOM and --saveHeapDumpsToGcsPath. But it seems these options do not work. Do I need to change my code or modify something else?
Memory profiling on Google Cloud Dataflow
You will want to check the configuring-pipeline-options documentation.
The current way in Apache Beam (2.9.0) to configure a pipeline option on the command line is --<option>=<value>.
In your case, you can set --dumpHeapOnOOM=true --saveHeapDumpsToGcsPath="gs://foo"
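A minimal sketch of how those command-line arguments reach the pipeline in a standard Beam Java main class; the class name HeapDumpExample and the bucket gs://foo are placeholders:

import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class HeapDumpExample {
  public static void main(String[] args) {
    // Parses flags such as --dumpHeapOnOOM=true and --saveHeapDumpsToGcsPath=gs://foo
    // from the command line into the debug options interface.
    DataflowPipelineDebugOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation()
        .as(DataflowPipelineDebugOptions.class);
    Pipeline pipeline = Pipeline.create(options);
    // ... build the pipeline here ...
    pipeline.run();
  }
}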

Change timezone in Cloud SQL

I have created a SQL instance on Google Cloud and I need to change the timezone. I have already seen the documentation, and I added the flag default_time_zone and set the value to 06:00, but the console doesn't let me write the colon.
How can I write the value? Thanks in advance.
The proper format for the default_time_zone flag is +/-HH:MM e.g. to set it to GMT+6 you would write the value +06:00. Don't forget the leading zero.
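A minimal command-line sketch of setting that flag with gcloud, assuming an instance named my-instance (a placeholder); as with any Cloud SQL flag change, --database-flags replaces the full set of flags on the instance:

gcloud sql instances patch my-instance --database-flags=default_time_zone=+06:00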
To modify the timezone, update the Google Cloud SQL flag named default_time_zone. This or any other database flag can be updated as follows:
1) In the Google Cloud Platform Console, open an existing project by selecting the project name.
2) Open the instance and click Edit.
3) Scroll down to the Flags section.
4) To set a flag that has not been set on the instance before, click Add item, choose the flag from the drop-down menu, and set its value.
5) Click Save to save your changes.
6) Confirm your changes under Flags on the Overview page.
When you add or modify these flags, your instance will automatically restart. Note that you cannot modify flags on failover replicas.
For further reading, see the documentation for setting Cloud SQL Flags.
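Once the instance has restarted, a quick check from any MySQL client confirms the new setting; the default_time_zone flag maps to MySQL's global time_zone variable:

SELECT @@global.time_zone;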