How to restore permissions when I am the admin of the project? - google-iam

After mistakenly adding myself to the wrong role, I am no longer able to access "IAM & Admin".
While trying to extract BigQuery tables to Google Cloud Storage, I received the following error:
bq extract --compression GZIP Dataset.TableName gs://tableName_*.csv.gz
Waiting on bqjob_r4250d44ecf982a22_00000169c666b451_1 ... (23s) Current status: DONE
BigQuery error in extract operation: Error processing job 'Dataset:bqjob_r4250d44ecf982a22_00000169c666b451_1': Access Denied: BigQuery BigQuery: Permission denied while writing data.
I thought I might have a permission issue, so I changed my role in Google Cloud. I don't remember which role I set; it may have been Owner or Creator.
After that, I am not able to access the project in BigQuery, nor the "IAM & Admin" page.
bq extract --compression GZIP Dataset.TableName gs://tableName_*.csv.gz
BigQuery error in extract operation: Access Denied: Project projectName: The user myemail#xxx.com does not have bigquery.jobs.create permission in project projectName.
Since I am the admin of this account, there is no other person who has access. What options do I have to restore my access?
Thank you in advance.

For this case, please open a case through the billing support form, and for "How can we help?" select "other": https://support.google.com/cloud/contact/cloud_platform_billing
This way, I can follow up with you in private and get the details necessary to move forward. Please let me know once you submit the case and what your case number is so I can follow up.
Edit: For anyone else viewing this issue, the above method is just for this case and not the correct avenue of support for this problem. If you have a support package and you have this issue, please reach out through normal channels.
Thanks,
Hunter,
GCP Billing
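For anyone else in the same situation: if any other principal in the project (or an admin on the parent folder/organization) still holds a role that includes resourcemanager.projects.setIamPolicy, that account can restore your access directly. A rough sketch, assuming a placeholder project ID and email:
gcloud projects add-iam-policy-binding my-project --member="user:myemail@example.com" --role="roles/owner"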

Related

PERMISSION_DENIED: Permission 'documentai.processors.processOnline' denied on resource '//documentai.googleapis.com/project...' (or it may not exist)

ISSUE: I want to use the same custom-trained Document AI processor from several different projects. The approach I have in mind is to make API calls from each of those projects using a single service account that has the proper IAM roles. I have not been able to successfully set up a service account to access the processor we trained.
SUMMARY: I have three different projects:
DEV
STAGING
DOCUMENT AI PROCESSING
The DOCUMENT AI PROCESSING project contains the custom-trained Document AI processor, and the two other environments listed above need to access the same endpoint. I cannot find the right way to configure this; at the moment I am getting the following error: PERMISSION_DENIED: Permission 'documentai.processors.processOnline' denied on resource '//documentai.googleapis.com/project...' (or it may not exist)
BACKGROUND:
(1) I created a service account
(2) I granted this service account access to the project, but did not grant any users access to the service account (item 3 in the screenshot)
(3) The service account was created successfully
(4) I added the newly created service account (as a principal) to the DEV project and assigned it exactly the same roles as it has in the DOCUMENT AI PROCESSING project
(5) The service account has been granted access to the DEV project
What I expect is to be able to use the Document AI processor located in the DOCUMENT AI PROCESSING project from the DEV project. However, I am still receiving the same error: PERMISSION_DENIED: Permission 'documentai.processors.processOnline' denied on resource '//documentai.googleapis.com/project...' (or it may not exist)
After many hours I am stumped, and I would be grateful to anyone who can explain what I am getting wrong.
As mentioned in the comment exchange between @Kolban and @bismar eyner esquivel ortuste, the needed permissions must be added to the Authorization Scope.
You may refer to the Document AI IAM roles documentation for the full list of roles for the API, and to the Document AI Processor REST API documentation for more information.
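For the cross-project setup described in the question, the key point is that the calling service account needs a Document AI role granted on the project that owns the processor, not only on DEV. A rough sketch with placeholder project and service account names:
gcloud projects add-iam-policy-binding doc-ai-processing-project --member="serviceAccount:caller-sa@dev-project.iam.gserviceaccount.com" --role="roles/documentai.apiUser"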
Posting the answer as community wiki for the benefit of the community that might encounter this use case in the future.
Feel free to edit this answer for additional information.

Connecting Excel to an Azure DevOps Query

I cannot see my queries using the Excel Team plugin. I get the following error:
TF8001:An error occurred while accessing the work item database. Contact the administrator
Please use an administrator account to check the following:
First, check whether the current query allows your account to access it:
Click on the shared query --> Security
Make sure that the related options are set to Allow.
Second, make sure that your account has access to the current work item:
Project Settings --> Project configuration
Finally, if you are running the query directly against the TFS work item database, you should contact your administrator to add permission for your account.

TF400813: The user '' is not authorized to access this resource

I have my own private organization and repositories.
I also have multiple directories and all of them work except for the "Microsoft account" directory.
I am able to log into Azure DevOps with no problem using the Microsoft account directory.
I see my organization and I can go through my repositories, agents, pipelines, everything.
However, I can't change anything. All I get is the error or screens that don't load fully.
It's like it's in read-only mode.
I went into user settings to check permissions and it lets me in but only so far. It stops loading user lists after selecting groups.
It shows me groups and permissions for everything, however.
When I try and generate a PAT, the screen sits there and says "Loading Tokens..."
The error I see everywhere and in the network responses is:
$id: "1"
innerException: null
message: "TF400813: The user '' is not authorized to access this resource."
typeName: "Microsoft.TeamFoundation.Framework.Server.UnauthorizedRequestException, Microsoft.TeamFoundation.Framework.Server"
typeKey: "UnauthorizedRequestException"
errorCode: 0
eventId: 3000
Exactly like that, nothing there between the quotes.
It also shows up in Red text with just this message:
TF400813: The user '' is not authorized to access this resource.
To resolve this I have done the following:
Logged out of DevOps entirely, which seems to log me out of several services.
Switched between my AD accounts while logged in.
I've rebooted my machine (I first started seeing this in VS, so I updated and rebooted as part of that).
Anything I'm missing here?
message: "TF400813: The user '' is not authorized to access this
resource.
This looks more like the anonymous access error as you said that there's nothing between the quotes.
In Azure DevOps, most services (PAT generation, for example) have their own security module. When a user wants to use them, they must first pass an identity check. If the system identifies your visits and operations as anonymous, everything will look read-only.
We have handled such an issue before and found it was caused by a proxy blocking the traffic, which also led that user to get the same error when accessing Azure DevOps with VS Code (similar to yours).
You need to confirm whether any proxy is configured on your side.
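One quick way to check for a system-level proxy on a Windows machine, for example, is:
netsh winhttp show proxy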
If there is no proxy set but you still have this issue: since Stack Overflow is an open forum and this is an identity issue, I strongly suggest you contact here and also attach the info below:
Activity ID: you can find this in the request headers in the Network tab. On our backend, we can use this ID to check the exact stack trace.
Org name and account name.
Fiddler trace. The most useful info we need is a Fiddler trace.
I tried a few options, such as setting the PAT in the interactive screen, via an environment variable, or by storing it in a file and echoing that file's content to the az devops login or az pipelines create command, as described in
https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&viewFallbackFrom=vsts&tabs=preview-page#create-personal-access-tokens-to-authenticate-access
However, none of them worked. It finally worked after I changed the token (PAT) in the file
/home//.azure/azuredevops/personalaccesstoken.
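For reference, a minimal sketch of the file-based approach, assuming a hypothetical token file at ~/pat.txt and an organization named your-org:
cat ~/pat.txt | az devops login --organization https://dev.azure.com/your-org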
Try to sign out and then sign in again; it worked for me.

Google Speech API returns 403 PERMISSION_DENIED

I have been using the Google Speech API to transcribe audio to text from my PHP app (using the Google Cloud PHP Client) for several months without any problem. But my calls have now started to return 403 errors with status "PERMISSION_DENIED" and message "The caller does not have permission".
I'm using the Speech API together with Google Storage. I'm authenticating using a service account and sending my audio data to Storage. That's working; the file gets uploaded. So I understand (but I might be wrong) that "the caller" does not have permission to then read the audio data from Storage.
I've been playing with permissions through the Google Console without success. I've read the docs but am quite confused. The service account I am using (I guess this is "the caller"?) has owner permissions on the project. And everything used to work fine, I haven't changed a thing.
I'm not posting code because if I understand correctly my app code isn't the issue - it's rather my Google Cloud settings. I'd be grateful for any idea or clarifications of concepts!
Thanks.
Being an owner of the project doesn't necessarily imply that the service account has read permission on the object. It's possible that the object was uploaded by another account that specified a private ACL or similar.
Make sure that the service account has access to the object by giving it the right permissions on the entire bucket or on the specific object itself.
You can do so using gsutil acl. More information and additional methods may be found in the official documentation.
For instance, the following command gives READ permission on an object to your service account:
gsutil acl ch -u serviceAccount@domain.com:R gs://bucket/object
And this command gives READ permission on an entire bucket to your service account:
gsutil acl ch -r -u serviceAccount@domain.com:R gs://bucket
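If the bucket uses IAM (bucket-level) permissions rather than object ACLs, a roughly equivalent sketch, assuming a placeholder service account email, would be:
gsutil iam ch serviceAccount:my-sa@my-project.iam.gserviceaccount.com:objectViewer gs://bucket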
In Google Cloud Vision, when you are creating credentials with a service account key, you have to create a role, set it to Owner, and grant full access permissions.

How to grant read permission on Google Cloud Storage to another service account

Our team creates some data on Google Cloud Storage so another team can copy/download/read it from there, but when they tried, they always got a 403 Forbidden message. I tried to edit the permissions on that bucket and added a new permission as 'Project', 'viewers-(other team's project id)', and 'Reader', but they still got the same error when they ran this command:
gsutil cp -R gs://our-bucket gs://their-bucket
I also tried with their client ID and email account; still the same.
I'm not sure one can take another project's collection of users with a given access right (viewers, in this case) and apply it to a bucket in a different project.
An alternative would be to control bucket access via Google Groups: simply set up a group for readers, adding the users you wish to grant this right to. Then you can use that group to control access to the bucket and/or its contents. Further information and a use-case scenario are here: https://cloud.google.com/storage/docs/collaboration#group
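For example, assuming a hypothetical group gcs-readers@googlegroups.com, you could grant it read access to the bucket's existing objects with:
gsutil acl ch -r -g gcs-readers@googlegroups.com:R gs://our-bucket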
try:
gsutil acl ch -u serviceaccount@google.com:R gs://your-bucket
Here ch changes the permission on 'your-bucket' for the user (-u) serviceaccount@google.com to R (Reader).