Google Cloud Storage Client - google-cloud-storage

I'm taking my first steps with GCS.
First, I created a service account in my project and downloaded its JSON key file.
Next, I tried to write a script like this sample.
But...
from gcloud import storage
client = storage.Client.from_service_account_json('/path/to/keyfile.json')
bucket = client.get_bucket('enggeo')
# Then do other things...
blob = bucket.get_blob('/ETicket.pdf')
print blob.download_as_string()
blob2 = bucket.blob('/wtt.txt')
blob2.upload_from_filename(filename='/home/test2/.www/test')
And I'm getting this error:
File "tt.py", line 3, in <module>
client = storage.Client.from_service_account_json('/path/to/keyfile.json') # TODO: rel paths
File "/home/test2/lib/python2.7/site-packages/gcloud/client.py", line 64, in from_service_account_json
return cls(*args, **kwargs)
File "/home/test2/lib/python2.7/site-packages/gcloud/storage/client.py", line 53, in __init__
http=http)
File "/home/test2/lib/python2.7/site-packages/gcloud/client.py", line 181, in __init__
_ClientProjectMixin.__init__(self, project=project)
File "/home/test2/lib/python2.7/site-packages/gcloud/client.py", line 146, in __init__
raise ValueError('Project was not passed and could not be '
ValueError: Project was not passed and could not be determined from the environment.
What is wrong?

client = storage.Client.from_service_account_json('/path/to/keyfile.json', 'project')
You need to specify the project argument for the JSONClient:
https://googlecloudplatform.github.io/gcloud-python/stable/gcloud-api.html#gcloud.client.JSONClient
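For example, a minimal sketch of the fixed call (here 'my-project' is a placeholder for your own project ID):

from gcloud import storage

# Pass the project explicitly alongside the key file
client = storage.Client.from_service_account_json('/path/to/keyfile.json', project='my-project')
bucket = client.get_bucket('enggeo')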

Related

Odoo 16: Playing with translations and Boom! I can no longer access my custom model data on my server

After successfully adding a .pot file to my new i18n folder on my local machine, setting translate=True on a couple of fields in my carddecks module, and verifying that on localhost I could access my model data, I decided to update my server.
But when I try to access my model data I get the following error:
LINE 1: ..."write_date", COALESCE("carddecks_card"."cardText"->>'pt_PT'...
^
HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Does anyone know what may be causing this?
The source code for the module can be found at https://github.com/diogocsc/carddecks
Full error:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/odoo/http.py", line 1579, in _serve_db
return service_model.retrying(self._serve_ir_http, self.env)
File "/usr/lib/python3/dist-packages/odoo/service/model.py", line 134, in retrying
result = func()
File "/usr/lib/python3/dist-packages/odoo/http.py", line 1608, in _serve_ir_http
response = self.dispatcher.dispatch(rule.endpoint, args)
File "/usr/lib/python3/dist-packages/odoo/http.py", line 1805, in dispatch
result = self.request.registry['ir.http']._dispatch(endpoint)
File "/usr/lib/python3/dist-packages/odoo/addons/website/models/ir_http.py", line 235, in _dispatch
response = super()._dispatch(endpoint)
File "/usr/lib/python3/dist-packages/odoo/addons/base/models/ir_http.py", line 144, in _dispatch
result = endpoint(**request.params)
File "/usr/lib/python3/dist-packages/odoo/http.py", line 698, in route_wrapper
result = endpoint(self, *args, **params_ok)
File "/usr/lib/python3/dist-packages/odoo/addons/web/controllers/dataset.py", line 42, in call_kw
return self._call_kw(model, method, args, kwargs)
File "/usr/lib/python3/dist-packages/odoo/addons/web/controllers/dataset.py", line 33, in _call_kw
return call_kw(request.env[model], method, args, kwargs)
File "/usr/lib/python3/dist-packages/odoo/api.py", line 457, in call_kw
result = _call_kw_model(method, model, args, kwargs)
File "/usr/lib/python3/dist-packages/odoo/api.py", line 430, in _call_kw_model
result = method(recs, *args, **kwargs)
File "/usr/lib/python3/dist-packages/odoo/addons/web/models/models.py", line 62, in web_search_read
records = self.search_read(domain, fields, offset=offset, limit=limit, order=order)
File "/usr/lib/python3/dist-packages/odoo/models.py", line 4968, in search_read
result = records.read(fields, **read_kwargs)
File "/usr/lib/python3/dist-packages/odoo/models.py", line 2992, in read
self._read(stored_fields)
File "/usr/lib/python3/dist-packages/odoo/models.py", line 3235, in _read
cr.execute(query_str, params + [sub_ids])
File "/usr/lib/python3/dist-packages/odoo/sql_db.py", line 315, in execute
res = self._obj.execute(query, params)
psycopg2.errors.UndefinedFunction: operator does not exist: character varying ->> unknown
LINE 1: ..."write_date", COALESCE("carddecks_card"."cardText"->>'pt_PT'...
^
HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
The above server error caused the following client error:
OwlError: The following error occurred in onWillStart: "Odoo Server Error"
at wrapError (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1445:77)
at onWillStart (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1451:117)
at useModel (https://www.relationalgames.com/web/assets/505-11285c6/web.assets_backend.min.js:4709:1)
at ListController.setup (https://www.relationalgames.com/web/assets/505-11285c6/web.assets_backend.min.js:4430:645)
at new ComponentNode (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1407:136)
at https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1929:6
at View.slot1 (eval at compile (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1892:370), <anonymous>:15:36)
at callSlot (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1508:25)
at WithSearch.template (eval at compile (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1892:370), <anonymous>:8:12)
at Fiber._render (https://www.relationalgames.com/web/assets/504-41b52e3/web.assets_common.min.js:1336:96)
Caused by: RPC_ERROR: Odoo Server Error
at makeErrorFromResponse (https://www.relationalgames.com/web/assets/505-11285c6/web.assets_backend.min.js:967:163)
at XMLHttpRequest.<anonymous> (https://www.relationalgames.com/web/assets/505-11285c6/web.assets_backend.min.js:974:13)
Found the answer for this. On localhost I was updating my module, but on the server I was just running docker-compose up, without updating the module in the database.
The solution was going to Settings -> Apps, searching for carddecks, and upgrading it.
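If you prefer to trigger the upgrade from the server instead of the web UI, a rough sketch using the Odoo shell (assuming the module's technical name is carddecks and you have opened a shell with odoo shell -d <your_database>):

# Inside an `odoo shell -d <your_database>` session; sketch only, the Apps menu does the same thing
module = env['ir.module.module'].search([('name', '=', 'carddecks')])
module.button_immediate_upgrade()  # reruns the module upgrade so the translated fields are migrated to jsonb
env.cr.commit()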

Airflow - email operator sending multiple files issue

I am using Airflow 2.2. I am trying to send multiple files using the Airflow EmailOperator. The file list is generated dynamically, and I use an XCom pull to get it from the previous task. For some reason, the EmailOperator's files parameter is not able to read the file list from the XCom value. Kindly advise.
Error details:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 292, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 107, in _run_task_by_selected_method
_run_raw_task(args, ti)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 184, in _run_raw_task
error_file=args.error_file,
File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
result = execute_callable(context=context)
File "/usr/local/lib/python3.7/site-packages/airflow/operators/email.py", line 88, in execute
conn_id=self.conn_id,
File "/usr/local/lib/python3.7/site-packages/airflow/utils/email.py", line 66, in send_email
**kwargs,
File "/usr/local/lib/python3.7/site-packages/airflow/utils/email.py", line 99, in send_email_smtp
mime_charset=mime_charset,
File "/usr/local/lib/python3.7/site-packages/airflow/utils/email.py", line 157, in build_mime_message
with open(fname, "rb") as file:
FileNotFoundError: [Errno 2] No such file or directory: '['
[2022-09-18, 17:24:22 UTC] {{local_task_job.py:154}} INFO - Task exited with return code 1
[2022-09-18, 17:24:23 UTC] {{local_task_job.py:264}} INFO - 0 downstream tasks scheduled from follow-on schedule check
from airflow import DAG
from airflow.operators.email import EmailOperator
from airflow.operators.python import PythonOperator
import os
from datetime import datetime, timedelta

default_args = {
    "owner": 'TEST',
    "depends_on_past": False,
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 0,
}

with DAG(
    dag_id="test9_email_operator_dag",
    default_args=default_args,
    start_date=datetime(2022, 9, 14),
    end_date=datetime(2022, 9, 15),
    catchup=True,
    max_active_runs=1,
    schedule_interval="0 12 * * *",  # Runs every day @ 8AM EST
) as dag:

    def print_local_folder_files(local_temp_folder):
        print("local folder files => ", os.listdir(local_temp_folder))
        files_list = []
        for file in os.listdir(local_temp_folder):
            files_list.append(local_temp_folder + file)
        print("files_list => ", files_list)
        return files_list

    print_local_folder_files = PythonOperator(
        task_id='print_local_folder_files',
        python_callable=print_local_folder_files,
        op_kwargs={'local_temp_folder': "/usr/local/airflow/dags/temp_dir/"},
        do_xcom_push=True,
        provide_context=True,
        dag=dag)

    send_email = EmailOperator(
        task_id='send_email',
        to='test@gmail.com',
        subject='Test Email op Notification',
        html_content='Test email op notification email. ',
        files="{{ task_instance.xcom_pull(task_ids='print_local_folder_files') }}"
    )

    print_local_folder_files >> send_email
You pushed a list to XCom, but templated fields are rendered as strings by default, so what you have there is the string representation of a list. That is why the operator only sees the first character '[' when it reads the value: iterating over a string yields it character by character.
To solve your issue, set render_template_as_native_obj=True on the DAG object:
with DAG(
    dag_id="test9_email_operator_dag",
    ...,
    render_template_as_native_obj=True,
) as dag:
This lets the Jinja engine know that you expect native Python types, so you will get an actual list rather than a string.
For more information, check the Airflow docs on this feature.
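As a quick standalone illustration of why the traceback ends in FileNotFoundError: '[' (the file names below are made up):

# Default rendering: the XCom list arrives as one big string, so iterating yields single characters
rendered_as_string = "['/usr/local/airflow/dags/temp_dir/a.csv', '/usr/local/airflow/dags/temp_dir/b.csv']"
for fname in rendered_as_string:
    print(fname)  # '[', then "'", then '/', ... -- the operator tries to open('[') and fails

# With render_template_as_native_obj=True the same template renders to a real list
rendered_as_list = ['/usr/local/airflow/dags/temp_dir/a.csv', '/usr/local/airflow/dags/temp_dir/b.csv']
for fname in rendered_as_list:
    print(fname)  # each full path, which is what the EmailOperator needs in order to attach the files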

Long URL (including a key) causes unicode idna codec decoding error whilst using RedisChannelLayer in Django Channels

I've encountered a problem deploying Django Channels on Heroku, whilst using a RedisChannelLayer.
I get a UnicodeError: encoding with 'idna' codec failed (UnicodeError: label empty or too long) during connection (full traceback below).
That seems to be a problem with one of the labels in the host address being too long, as shown in this Python issue.
I printed some information out of my consumer, and also wrapped Python's socket.getaddrinfo function to display host and connection information.
This related post hit the same problem connecting to Shopify rather than a Redis instance; they got around it by placing credentials in the request header, but I don't have control over channels_redis or asyncio.
Any clues?
Properties of the Django Channels Consumer:
.groups []
.channel_layer RedisChannelLayer(hosts=[{'address': ('h:alongkeycomprisingof65charsintotalxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx@ec2-18-202-152-61.eu-west-1.compute.amazonaws.com', '9759')}])
.channel_name specific.OSfTzyqY!pdvgHnaCxWiv
.room_name 8e3d3083-8bb1-4d85-89d3-4496d9b9e946
.room_group_name twined_8e3d3083-8bb1-4d85-89d3-4496d9b9e946
Full Traceback:
Traceback (most recent call last):
File "/app/.heroku/python/lib/python3.6/site-packages/channels/sessions.py", line 183, in __call__
return await self.inner(receive, self.send)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/middleware.py", line 41, in coroutine_call
await inner_instance(receive, send)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/consumer.py", line 59, in __call__
[receive, self.channel_receive], self.dispatch
File "/app/.heroku/python/lib/python3.6/site-packages/channels/utils.py", line 51, in await_many_dispatch
await dispatch(result)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/consumer.py", line 73, in dispatch
await handler(message)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/generic/websocket.py", line 175, in websocket_connect
await self.connect()
File "/app/backend/pink/consumers.py", line 24, in connect
await self.channel_layer.group_add(self.room_group_name, self.channel_name)
File "/app/.heroku/python/lib/python3.6/site-packages/channels_redis/core.py", line 589, in group_add
async with self.connection(self.consistent_hash(group)) as connection:
File "/app/.heroku/python/lib/python3.6/site-packages/channels_redis/core.py", line 835, in __aenter__
self.conn = await self.pool.pop()
File "/app/.heroku/python/lib/python3.6/site-packages/channels_redis/core.py", line 73, in pop
conns.append(await aioredis.create_redis(**self.host, loop=loop))
File "/app/.heroku/python/lib/python3.6/site-packages/aioredis/commands/__init__.py", line 175, in create_redis
loop=loop)
File "/app/.heroku/python/lib/python3.6/site-packages/aioredis/connection.py", line 113, in create_connection
timeout)
File "/app/.heroku/python/lib/python3.6/asyncio/tasks.py", line 339, in wait_for
return (yield from fut)
File "/app/.heroku/python/lib/python3.6/site-packages/aioredis/stream.py", line 24, in open_connection
lambda: protocol, host, port, **kwds)
File "/app/.heroku/python/lib/python3.6/asyncio/base_events.py", line 750, in create_connection
infos = f1.result()
File "/app/.heroku/python/lib/python3.6/concurrent/futures/thread.py", line 56, in run
result = self.fn(*self.args, **self.kwargs)
File "./backend/amy/asgi.py", line 69, in mygetaddrinfo
for res in socket._socket.getaddrinfo(host, port, family, type, proto, flags):
UnicodeError: encoding with 'idna' codec failed (UnicodeError: label empty or too long)
I've been able to work around this by configuring my RedisChannelLayer with a dict of arguments to create_connection (as mentioned here), instead of providing a really long host name.
By manually parsing the password out of the REDIS_URL variable that Heroku provides and rebuilding a host URI without it, I can pass that password as a separate field in the create_connection dict, keeping the host string length below 64 characters.
I do this in my settings.py file, which now looks like:
from urllib.parse import urlparse

def parse_redis_url(url):
    """ Parses a redis url into component parts, stripping the password from the host.

    Long keys in the url result in parsing errors, since labels within a hostname cannot
    exceed 64 characters under idna rules.
    In that event, we remove the key/password so that it can be passed separately to the
    RedisChannelLayer.
    Heroku's REDIS_URL does not include the DB number, so we allow for a default value of '0'.
    """
    parsed = urlparse(url)
    parts = parsed.netloc.split(':')
    host = ':'.join(parts[0:-1])
    port = parts[-1]
    path = parsed.path.split('/')[1:]
    db = int(path[0]) if len(path) >= 1 else 0
    user, password = (None, None)
    if '@' in host:
        creds, host = host.split('@')
        user, password = creds.split(':')
        host = f'{user}@{host}'
    return host, port, user, password, db
REDIS_URL = env('REDIS_URL', default='redis://localhost:6379')
REDIS_HOST, REDIS_PORT, REDIS_USER, REDIS_PASSWORD, REDIS_DB = parse_redis_url(REDIS_URL)
# DJANGO CHANNELS
CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'channels_redis.core.RedisChannelLayer',
        'CONFIG': {
            "hosts": [{
                'address': f'redis://{REDIS_HOST}:{REDIS_PORT}',
                'db': REDIS_DB,
                'password': REDIS_PASSWORD,
            }],
        },
    },
}
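For illustration, calling the parse_redis_url function above on a Heroku-style URL (the key below is made up) gives:

host, port, user, password, db = parse_redis_url(
    'redis://h:averylongkeycomprisingof65charsintotalxxxxxxxxxxxxxxxxxxxxxxxxx@ec2-18-202-152-61.eu-west-1.compute.amazonaws.com:9759'
)
# host     -> 'h@ec2-18-202-152-61.eu-west-1.compute.amazonaws.com'  (password stripped out)
# port     -> '9759'
# user     -> 'h'
# password -> 'averylongkeycomprisingof65charsintotalxxxxxxxxxxxxxxxxxxxxxxxxx'
# db       -> 0  (Heroku's REDIS_URL has no path component)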

Unable to set credentials with Google Text-to-Speech API

I was able to set the credentials for Google's translation API, but with Text-to-Speech I'm having a lot of trouble. First, I couldn't get the credentials as a JSON file; I only got them as a string. I haven't been able to find anyone else whose credentials are a string; everyone else has a JSON file. I converted the string to a JSON file, but I don't think that is helping, because it seems the JSON object has to be a dictionary. In any case, when I try this:
from google.oauth2 import service_account
key1 = 'key.json'
credentials = service_account.Credentials.from_service_account_file(key1)
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 3267, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-13-c7d030662a35>", line 1, in <module>
credentials = service_account.Credentials.from_service_account_file(key1)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/oauth2/service_account.py", line 209, in from_service_account_file
filename, require=['client_email', 'token_uri'])
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_service_account_info.py", line 73, in from_filename
return data, from_dict(data, require=require)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_service_account_info.py", line 46, in from_dict
missing = keys_needed.difference(six.iterkeys(data))
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/six.py", line 575, in iterkeys
return iter(d.keys(**kw))
AttributeError: 'str' object has no attribute 'keys'
When I try this code the following happens:
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'key.json'
client = texttospeech.TextToSpeechClient()
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 3267, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-14-d30e5cd41087>", line 2, in <module>
client = texttospeech.TextToSpeechClient()
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/texttospeech_v1/gapic/text_to_speech_client.py", line 159, in __init__
address=api_endpoint, channel=channel, credentials=credentials
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/texttospeech_v1/gapic/transports/text_to_speech_grpc_transport.py", line 61, in __init__
channel = self.create_channel(address=address, credentials=credentials)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/texttospeech_v1/gapic/transports/text_to_speech_grpc_transport.py", line 91, in create_channel
address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 177, in create_channel
credentials, _ = google.auth.default(scopes=scopes)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_default.py", line 305, in default
credentials, project_id = checker()
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_default.py", line 165, in _get_explicit_environ_credentials
os.environ[environment_vars.CREDENTIALS])
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_default.py", line 102, in _load_credentials_from_file
credential_type = info.get('type')
AttributeError: 'str' object has no attribute 'get'
I think this is because my JSON object is not a dict but a string. But the key that Google gave me was a string and not a JSON file, so I really don't know what to do here. Plus, their documentation is too hard to understand.
When you say you have a string, are you referring to the key ID? You will still need the JSON file associated with that key.
To create a new JSON file, go to Google Cloud Console -> IAM & Admin -> Service Accounts. Select one of your service accounts and click "Create Key", which will download the key as a JSON file. It can only be downloaded once.
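Once you have that JSON file on disk, something like this should work (a sketch; 'key.json' is wherever you saved the downloaded file):

from google.cloud import texttospeech
from google.oauth2 import service_account

# Load the downloaded service-account key and pass the credentials to the client explicitly
credentials = service_account.Credentials.from_service_account_file('key.json')
client = texttospeech.TextToSpeechClient(credentials=credentials)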

PermissionDenied: 403 error when trying to run async Google Cloud Speech async transcribe

I'm getting the following error when trying to run an async transcription request on a .flac file hosted on Google Cloud Storage.
$ python3 transcribe_async.py gs://[file].flac
Traceback (most recent call last):
File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 54, in error_remapped_callable
return callable_(*args, **kwargs)
File "[]/anaconda3/lib/python3.6/site-packages/grpc/_channel.py", line 514, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "[]/anaconda3/lib/python3.6/site-packages/grpc/_channel.py", line 448, in _end_unary_response_blocking
raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
status = StatusCode.PERMISSION_DENIED
details = "The caller does not have permission"
debug_error_string = "{"created":"@1533912393.258761000","description":"Error received from peer","file":"src/core/lib/surface/call.cc","file_line":1095,"grpc_message":"The caller does not have permission","grpc_status":7}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "transcribe_async.py", line 105, in <module>
transcribe_gcs(args.path)
File "transcribe_async.py", line 83, in transcribe_gcs
operation = client.long_running_recognize(config, audio)
File "[]/anaconda3/lib/python3.6/site-packages/google/cloud/speech_v1/gapic/speech_client.py", line 284, in long_running_recognize
request, retry=retry, timeout=timeout, metadata=metadata)
File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/gapic_v1/method.py", line 139, in __call__
return wrapped_func(*args, **kwargs)
File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/retry.py", line 260, in retry_wrapped_func
on_error=on_error,
File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/retry.py", line 177, in retry_target
return target()
File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/timeout.py", line 206, in func_with_timeout
return func(*args, **kwargs)
File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 56, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.PermissionDenied: 403 The caller does not have permission
I've added an export statement to my .zshrc file that points to the service account json, I've added myself, the service account email and the project owner and editor as owners of the cloud bucket through the browser, and I ran gcloud auth activate-service-account --key-file="[].json", but nothing helps. What have I forgotten? Any help much appreciated.
You need to make your file publicly readable. Once you set the permissions to allUsers, you will be able to use your file in your request.
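For example, with the google-cloud-storage client (a sketch; the key file path, bucket, and object names below are placeholders):

from google.cloud import storage

client = storage.Client.from_service_account_json('/path/to/keyfile.json')
blob = client.get_bucket('my-bucket').blob('my-audio.flac')
blob.make_public()  # grants allUsers read access to this one object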