Unable to authenticate using a Google Cloud service account key created by the Python API

The sample below demonstrates a failure to authenticate to a Google service account using a key created just a few lines earlier with the Python API.
I was not able to find any documentation on how these programmatically created keys can be used.
Keys created by clicking through the console UI work just fine.
However, for our use case, we need to create the keys programmatically.
There is an unanswered issue on GitHub as well: https://github.com/googleapis/google-cloud-python/issues/7824
logger.info("Created new service account: {}".format(ret))
logger.info("Getting the new service account key")
request=iam.projects().serviceAccounts().keys().create(name=ret['name'],
body={'privateKeyType':'TYPE_GOOGLE_CREDENTIALS_FILE'})
key=request.execute()
>>> print json.dumps(key, indent=4)  # just to verify what we got
{
    "keyOrigin": "GOOGLE_PROVIDED",
    "name": "goodandvalidname",
    "validBeforeTime": "2029-06-28T15:09:59Z",
    "privateKeyData": "datadata",
    "privateKeyType": "TYPE_GOOGLE_CREDENTIALS_FILE",
    "keyAlgorithm": "KEY_ALG_RSA_2048",
    "validAfterTime": "2019-07-01T15:09:59Z"
}
>>> credentials = google.oauth2.service_account.Credentials.from_service_account_info(key)
Traceback (most recent call last):
  File "/home/user/.p2/pool/plugins/org.python.pydev.core_7.2.1.201904261721/pysrc/_pydevd_bundle/pydevd_exec.py", line 3, in Exec
    exec exp in global_vars, local_vars
  File "<console>", line 1, in <module>
  File "/home/user/.local/lib/python2.7/site-packages/google/oauth2/service_account.py", line 193, in from_service_account_info
    info, require=['client_email', 'token_uri'])
  File "/home/user/.local/lib/python2.7/site-packages/google/auth/_service_account_info.py", line 51, in from_dict
    'fields {}.'.format(', '.join(missing)))
ValueError: Service account info was not in the expected format, missing fields token_uri, client_email.
Any help appreciated.

Answering my own question, and hopefully helping others...
The 'key' we get from the Python API is NOT the JSON key file obtained from gcloud. The dict returned by iam.projects().serviceAccounts().keys().create() contains the field privateKeyData, which itself contains the ENTIRE JSON key needed to authenticate to Google Cloud.
The data in this field is base64-encoded and needs to be decoded and then parsed as JSON. Below is a snippet from working code, demonstrating that the credentials can be loaded back from such a key:
request = iam.projects().serviceAccounts().keys().create(
    name=ret['name'],
    body={'privateKeyType': 'TYPE_GOOGLE_CREDENTIALS_FILE'})
key = request.execute()
# privateKeyData holds the entire JSON key file, base64-encoded
key = base64.decodestring(key['privateKeyData'])
key = json.loads(key)
credentials = google.oauth2.service_account.Credentials.from_service_account_info(key)
I figured this out by stepping through gcloud's service account key creation, line by line, with the Python debugger. Hope this helps others.
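Note that the snippet above is Python 2; base64.decodestring has since been deprecated and removed from Python 3. A rough Python 3 sketch of the same idea, assuming key is the dict returned by keys().create(...).execute() as above:

import base64
import json

from google.oauth2 import service_account

# 'key' is the dict returned by iam.projects().serviceAccounts().keys().create(...).execute()
key_json = base64.b64decode(key['privateKeyData']).decode('utf-8')
key_info = json.loads(key_json)
credentials = service_account.Credentials.from_service_account_info(key_info)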

Related

flutter_stripe example app has 18 errors all "Object is of type 'unknown'." for 'error' variables

I'm trying to run flutter_stripe's example app. I forked and cloned the GitHub repository to my laptop.
Starting the yarn server results in 18 errors. They all start with Object is of type 'unknown' and refer to error, e, or err variables on lines 130, 301, 442, 450, 451, 455, 456, 464, 578, 586, 587, 591, 592, 595, 599, and 600. Then it says Command failed with exit code 2.
Is this a null safety issue? How do I fix it?
Your existing GitHub issue with the library maintainers is likely to be your best source of help; however, reading it I noticed you said:
In the last step, setting up server/.env, my Stripe account has pk_test and a pk_live Publishable and Secret Keys. My guess is that I should use the pk_test keys in server/.env.example. Let’s make this clear in the comment at the top of server/.env.example.
This seems to be a misunderstanding of your Stripe API keys. There are secret keys (sk_) for your server and publishable keys (pk_) for your client-side application as a matching pair, and there is a pair for each of live and test mode. You need to use a matching secret and publishable key from your dashboard.
Additionally, when setting up secrets in environment files, you'll typically be creating a .env file in the server/repo root directory. I read the above as though you might be trying to set up your keys in the .env.example file, which I don't expect would work. You should check with the developer of the library/example about this if .env doesn't work.
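Purely as an illustration of the matched-pair idea, here is a sketch of reading a test-mode pair from server/.env in Python. The variable names are hypothetical (check the example's .env.example for the real ones), and the example's actual server may not be written in Python:

import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv("server/.env")
secret_key = os.environ["STRIPE_SECRET_KEY"]             # sk_test_... stays on the server
publishable_key = os.environ["STRIPE_PUBLISHABLE_KEY"]   # pk_test_... is safe to send to the client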

How to create a bucket using the Python SDK?

I'm trying to create a bucket in Cloud Object Storage using Python. I have followed the instructions in the API docs.
This is the code I'm using:
COS_ENDPOINT = "https://control.cloud-object-storage.cloud.ibm.com/v2/endpoints"
# Create client
cos = ibm_boto3.client("s3",
ibm_api_key_id=COS_API_KEY_ID,
ibm_service_instance_id=COS_INSTANCE_CRN,
config=Config(signature_version="oauth"),
endpoint_url=COS_ENDPOINT
)
s3 = ibm_boto3.resource('s3')
def create_bucket(bucket_name):
print("Creating new bucket: {0}".format(bucket_name))
s3.Bucket(bucket_name).create()
return
bucket_name = 'test_bucket_442332'
create_bucket(bucket_name)
I'm getting the error below. I tried setting CreateBucketConfiguration={"LocationConstraint":"us-south"}, but it doesn't seem to work:
"ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to."
Resolved by going to https://cloud.ibm.com/docs/cloud-object-storage?topic=cloud-object-storage-endpoints#endpoints and choosing the endpoint specific to the region I need. The "Endpoint" provided with the credentials is not the actual endpoint.
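A minimal sketch of that working setup, assuming the us-south region; the endpoint below follows the public regional endpoint pattern from the linked docs, and the API key, instance CRN, and bucket name are placeholders to replace with your own:

import ibm_boto3
from ibm_botocore.client import Config

COS_API_KEY_ID = "<api-key>"          # placeholder
COS_INSTANCE_CRN = "<instance-crn>"   # placeholder
# Region-specific endpoint, not the https://control.../v2/endpoints listing URL
COS_ENDPOINT = "https://s3.us-south.cloud-object-storage.appdomain.cloud"

cos = ibm_boto3.client("s3",
    ibm_api_key_id=COS_API_KEY_ID,
    ibm_service_instance_id=COS_INSTANCE_CRN,
    config=Config(signature_version="oauth"),
    endpoint_url=COS_ENDPOINT
)

# With a regional endpoint the bucket location is implied by the endpoint itself,
# so no CreateBucketConfiguration should be required here.
cos.create_bucket(Bucket="test-bucket-442332")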

IBM Language Translator returns 403 Forbidden upon identify()

I followed the official documentation to create a multilingual Watson assistant outlined here:
https://github.com/with-watson/multilingual-chatbot
However, after deploying the function on IBM Cloud and testing it via the IBM Cloud CLI with the command below, I am getting an error (logs below):
bx wsk action invoke translator --result --param text "Hallo, ich habe eine Frage."
{
    "error": "The action did not return a dictionary."
}
"2020-01-13T12:54:57.787506Z stderr: Traceback (most recent call last):",
"2020-01-13T12:54:57.787554Z stderr: File \"pythonrunner.py\", line 88, in run",
"2020-01-13T12:54:57.787560Z stderr: exec('fun = %s(param)' % self.mainFn, self.global_context)",
"2020-01-13T12:54:57.787564Z stderr: File \"<string>\", line 1, in <module>",
"2020-01-13T12:54:57.787568Z stderr: File \"__main__.py\", line 98, in main",
"2020-01-13T12:54:57.787571Z stderr: response = translator.identify( text )",
"2020-01-13T12:54:57.787575Z stderr: File \"/action/virtualenv/lib/python3.6/site-packages/watson_developer_cloud/language_translator_v3.py\", line 193, in identify",
"2020-01-13T12:54:57.787579Z stderr: accept_json=True)",
"2020-01-13T12:54:57.787583Z stderr: File \"/action/virtualenv/lib/python3.6/site-packages/watson_developer_cloud/watson_service.py\", line 587, in request",
"2020-01-13T12:54:57.787587Z stderr: info=error_info, httpResponse=response)",
"2020-01-13T12:54:57.787591Z stderr: watson_developer_cloud.watson_service.WatsonApiException: Error: Forbidden, Code: 403",
"2020-01-13T12:54:57.788Z stderr: The action did not initialize or run as expected. Log data might be missing."
It looks like the API key is recognized but not permitted to be used for this action; however, the same key does return the right values when used via cURL.
The code executed in main is the same as provided in the GitHub repository above; I did not make any changes.
Any ideas on how to fix this issue? Thanks!
The key string used by curl is a bearer token. The API key needed by the cloud function is probably one provided by Identity and Access Management (IAM).
In the https://cloud.ibm.com console GUI, click Manage > Access (IAM) at the top, then select IBM Cloud API keys on the left and create an API key. This creates an API key that represents you, just like your login name and credentials. This is the simplest way to get it working, but it is not great for production.
For production, consider using a Service ID, probably in combination with an Access Group.
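For illustration, a minimal sketch of calling identify() with an IAM API key using the newer ibm-watson SDK (the successor to watson_developer_cloud). The API key, version date, and service URL below are placeholders to replace with the values from your own instance:

from ibm_watson import LanguageTranslatorV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("<your-iam-api-key>")  # IAM API key, not a bearer token
translator = LanguageTranslatorV3(version="2018-05-01", authenticator=authenticator)
# Use the service URL shown on your instance's credentials page (the region matters)
translator.set_service_url("https://api.us-south.language-translator.watson.cloud.ibm.com")

result = translator.identify("Hallo, ich habe eine Frage.").get_result()
print(result)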
Here's what worked for me, with additional changes.
I ran the command below to update the packages mentioned in the environment.yml file:
conda update --all
The conda version on my machine is 4.8.1.
The cloud-functions/wsk/functions/fn plugin version is 1.0.36.
While creating the Language Translator instance, make sure to choose the right region.
It worked for me after I changed it.

Google Cloud authorization keeps failing with Python 3 - Type is None, expected one of ('authorized_user', 'service_account')

I am trying to download a file for the first time from Google Cloud Storage.
I set the path to the googstruct.json service account key file that I downloaded from https://cloud.google.com/storage/docs/reference/libraries#client-libraries-usage-python
Do I need to set up authorization to Google Cloud outside the code somehow? Or is there a better "How to use Google Cloud Storage" guide than the one on the Google site?
It seems like I am passing the wrong type to storage_client = storage.Client(); the exception string is below.
Exception has occurred: google.auth.exceptions.DefaultCredentialsError
The file C:\Users\Cary\Documents\Programming\Python\QGIS\GoogleCloud\googstruct.json does not have a valid type.
Type is None, expected one of ('authorized_user', 'service_account').
MY PYTHON 3.7 CODE
from google.cloud import storage
import os

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "C:\\GoogleCloud\\googstruct.json"

# Instantiates a client
storage_client = storage.Client()

bucket_name = 'structure_ssi'
destination_file_name = "C:\\Users\\18809_PIPEM.shp"
source_blob_name = '18809_PIPEM.shp'

download_blob(bucket_name, source_blob_name, destination_file_name)

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    print('Blob {} downloaded to {}.'.format(
        source_blob_name,
        destination_file_name
    ))
I did look at this, but I cannot tell if it is my issue. I have tried both:
('Unexpected credentials type', None, 'Expected', 'service_account') with oauth2client (Python)
This error means that the JSON service account credentials you are trying to use (C:\\GoogleCloud\\googstruct.json) are corrupt or of the wrong type.
The first (or second) line in the file googstruct.json should be "type": "service_account".
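For reference, a well-formed service account key file looks roughly like this (all values below are placeholders):

{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "my-sa@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}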
Another few items to improve your code:
You do not need to use \\; just use / to make your code easier and cleaner to read.
Load your credentials directly and do not modify environment variables:
storage_client = storage.Client.from_service_account_json('C:/GoogleCloud/googstruct.json')
Wrap API calls in try / except. Stack traces do not impress customers. It is better to have clear, simple, easy-to-read error messages.
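A minimal sketch combining these suggestions; the key file path and the bucket/object names are simply the ones from the question:

from google.cloud import storage
from google.api_core import exceptions as gcloud_exceptions

try:
    # Load the credentials directly from the key file
    storage_client = storage.Client.from_service_account_json('C:/GoogleCloud/googstruct.json')
    bucket = storage_client.get_bucket('structure_ssi')
    blob = bucket.blob('18809_PIPEM.shp')
    blob.download_to_filename('C:/Users/18809_PIPEM.shp')
    print('Blob {} downloaded to {}.'.format('18809_PIPEM.shp', 'C:/Users/18809_PIPEM.shp'))
except gcloud_exceptions.NotFound:
    print('Bucket or object not found; check the bucket and blob names.')
except Exception as error:
    print('Download failed: {}'.format(error))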

Migrating Authorization Mechanics ClientLogin to OAuth2 Google AdWords v201206 Perl

I was previously using ClientLogin for authorization in the Google AdWords API, but after looking at AuthForInstalledApps, I see that this mechanism has been deprecated in favor of OAuth 2.0.
I have registered my application in the API Console. Now I am trying to follow the Perl example of how to set this up:
use_oauth2.pl
I cannot place the AdWords client credentials in ~/adwords.properties, since I have multiple accounts on which I will be doing campaign management operations, and therefore multiple client IDs.
But for now, I tried to follow this example using just one of my clients' info, like this:
my $client = Google::Ads::AdWords::Client->new({
    version         => 'v201206',
    developer_token => TOKEN,
    client_id       => $google_account_id
});
$client->get_auth_token_handler()->set_email($login);
$client->get_auth_token_handler()->set_password($password);
However, when I step through this and it tries to initialize the Client object, it throws this error:
Can't use an undefined value as a HASH reference at (eval 845)[/usr/lib/perl5/vendor_perl/5.8.8/HTTP/Message.pm:371] line 1. at (eval 845)[/usr/lib/perl5/vendor_perl/5.8.8/HTTP/Message.pm:371] line 1
HTTP::Message::__ANON__[(eval 845)[/usr/lib/perl5/vendor_perl/5.8.8/HTTP/Message.pm:371]:1]() called at /home/etienne/backend/libs/Google/Ads/Common/HTTPTransport.pm line 30
Google::Ads::Common::HTTPTransport::client('Google::Ads::Common::HTTPTransport=HASH(0xb59b830)', 'Google::Ads::AdWords::Client=SCALAR(0x9b9bb60)') called at /home/etienne/backend/libs/Google/Ads/AdWords/Client.pm line 180
Google::Ads::AdWords::Client::START('Google::Ads::AdWords::Client=SCALAR(0x9b9bb60)', 1, 'HASH(0xb582e70)') called at /usr/lib/perl5/site_perl/5.8.8/Class/Std/Fast.pm line 251
Class::Std::Fast::__ANON__[/usr/lib/perl5/site_perl/5.8.8/Class/Std/Fast.pm:252]() called at /usr/lib/perl5/site_perl/5.8.8/Class/Std/Fast.pm line 287
Class::Std::Fast::new('Google::Ads::AdWords::Client', 'HASH(0xb54c210)') called at /home/etienne/backend/search_marketing/data_exchange/lib/GoogleAPIv2.pm line 3555
GoogleAPIv2::get_adwords_client('GoogleAPIv2=HASH(0xb556d10)', 4202697829) called at /home/etienne/backend/search_marketing/data_exchange/lib/GoogleAPIv2.pm line 230
GoogleAPIv2::add_campaign('GoogleAPIv2=HASH(0xb556d10)', 'name', 'API Upgrade Test Campaign - 1348613850', 'google_account_id', 4202697829, 'account_id', 207, 'country_code', 'US', ...) called at google_add_campaign.t line 110
main::main() called at google_add_campaign.t line 26
scalar context return from CODE(0x9b909c0): *Class::Std::Fast::_cache
1..3
I am using Perl v5.8.8 and have installed the latest AdWords Perl client library, v2.7.2. Is there some kind of dependency issue?
How can I go about resolving this? Any information you can provide would be very helpful. Thanks.
It looks like I just needed to install the latest HTTP::Message module (v6.0.3), since the version I had did not have the decode() method, which was therefore being passed to the AUTOLOAD() subroutine in HTTP/Message.pm.