Azure Media Services v3 - Creating a job with a SAS URL fails due to an access issue - azure-media-services

I'm trying to create an asset from code, but I'm getting the error below:
{
  "error": {
    "code": "Conflict",
    "message": "The server received a 403 Forbidden error when accessing Azure Storage. Please check your permissions to the storage accounts linked to the media account.",
    "details": [
      {
        "code": "AuthorizationFailure",
        "message": "The server received a 403 Forbidden error when accessing Azure Storage. Please check your permissions to the storage accounts linked to the media account."
      }
    ]
  }
}
Also, I tried directly in the portal with the generated SAS URL, but I'm still facing an access issue. I can confirm the AAD service principal has been assigned the "Contributor" role, but I still get the error below.
Error:
The client 'xx' with object id 'xx' does not have authorization to perform action 'Microsoft.Media/mediaservices/assets/write' over scope '/subscriptions/xx/resourceGroups/xx/providers/Microsoft.Media/mediaservices/itskssearchmediadev/assets/ignite-mp4-20220207-192422' or the scope is invalid. If access was recently granted, please refresh your credentials.
What other permissions do I need to provide?
Note: I also tried with my personal account, which has full access, and it works there.

The Storage Account Contributor role permits management of storage accounts (e.g., creating and deleting storage accounts), but it does not permit access to data in the storage account.
To allow Media Services to write to the storage account, the Managed Identity must be granted a role that has access to the storage account data, for example, Storage Blob Data Contributor.
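For illustration, a role assignment along these lines grants that data-plane role; this is a hedged sketch using the Azure CLI, and the object ID, subscription, resource group, and storage account name are placeholders, not values from the question:
# Assign Storage Blob Data Contributor on the linked storage account
# (all angle-bracket values below are placeholders you would replace with your own).
az role assignment create \
  --assignee "<principal-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
The --assignee value is the object ID of whichever identity Media Services uses to reach storage, such as the managed identity described above.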

Related

Problem regarding google cloud bucket access permission

I am working on a Colab project with a Google Cloud bucket. At first, I used my own Gmail account A, but I noticed that I needed a Google service account for some operations. So I activated a service account B and successfully logged in with this service account.
But there is still a permission error:
tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '{
  "error": {
    "code": 403,
    "message": "gmailaccountA#gmail.com does not have storage.objects.list access to the Google Cloud Storage bucket.",
    "errors": [
      {
        "message": "gmailaccountA#gmail.com does not have storage.objects.list access to the Google Cloud Storage bucket.",
        "domain": "global",
        "reason": "forbidden"
      }
    ]
  }
}
When I double-check and run "gcloud auth list", I see both accounts, my Gmail account A and my service account B. How can I make sure I am using the service account?
In order to set the account you want to use, you can first list the available accounts with:
gcloud auth list
and then set the chosen one with:
gcloud config set account ACCOUNT
You can read more about the gcloud config set command and its properties in the gcloud documentation.
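As a concrete sketch with hypothetical names (the project and key-file path are not from the question), the switch looks like this:
# Show the credentialed accounts; the one marked with * is active.
gcloud auth list
# Make service account B the active account (hypothetical account name).
gcloud config set account service-b@my-project.iam.gserviceaccount.com
# If the account was never credentialed in this environment, activate it from its key file (hypothetical path).
gcloud auth activate-service-account --key-file=/path/to/service-b-key.json
Note that gcloud's active account only affects gcloud commands; client libraries running in the notebook may authenticate separately, so make sure they are pointed at the same service account credentials.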

Microsoft Azure API List All Tenants

We are a CSP partner with MS. My goal is to call the Azure API and list all the different tenants we have in our account.
I found this Azure API resource that appears to allow the listing of all tenants: https://learn.microsoft.com/en-us/rest/api/resources/Tenants/List
I've been able to implement the authorization code flow, and I can call MS Graph APIs successfully. However, when I try to call this API, I receive this response:
{
  "error": {
    "code": "AuthenticationFailed",
    "message": "Authentication failed."
  }
}
I feel like it may be an issue with the permissions I've granted in my app registration, but I can't seem to figure out what is needed to make it happen.
I wish there was a way to use the MS Graph API to get all of our tenants, but from my research that doesn't exist.
I think you are missing the bearer token. I tested this API with Postman.
You can read this article:
Get an Azure Active Directory token using Azure Active Directory Authentication Library
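As a rough sketch of what the call looks like once a bearer token is attached (this assumes an existing az login session, and the api-version shown is one commonly documented for this endpoint):
# Get an access token for Azure Resource Manager from the current az login session.
TOKEN=$(az account get-access-token --resource https://management.azure.com/ --query accessToken -o tsv)
# Call the Tenants - List endpoint with the bearer token.
curl -H "Authorization: Bearer $TOKEN" "https://management.azure.com/tenants?api-version=2020-01-01"
The token must be issued for the Azure Resource Manager audience (https://management.azure.com/), not for Microsoft Graph; a token issued for Graph will not be accepted by this endpoint.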

Signed URL created by Google Cloud Storage Python library blob.generate_signed_url method gets "Access Denied" error

I am trying to create a signed URL for a private object stored in cloud storage.
The storage client is being created using a service account that has the Storage Admin role:
import datetime

from google.cloud import storage

storage_client = storage.Client.from_service_account_json('service.json')
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(blob_name)
url = blob.generate_signed_url(
    version="v4",
    # This URL is valid for 15 minutes
    expiration=datetime.timedelta(minutes=15),
    # Allow GET requests using this URL.
    method="GET"
)
This generates a URL that when accessed via a browser gives this error:
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.</Details>
</Error>
What am I missing here? The service account has no problem interacting with the bucket or blob normally - I can download it/etc. It's just the Signed URL that doesn't work. I can make the object public and then download it - but that defeats the purpose of being able to generate a signed URL.
All of the other answers I've found seem to focus on issues using application default credentials or are very old examples from the v2 API.
Clearly there's something about how I'm using the service account - do I need to explicitly give it permissions on that particular object? Is the Storage Admin role not enough in this context?
Going crazy with this. Please help!

Access to cloud storage from client URL

From a Google Cloud application, I need to open a file located in my project's Cloud Storage. I tried to access the file with URLs of the following forms, but I get the errors below:
http://storage.googleapis.com/my-bucket/my-file
Error: Access denied. Anonymous caller does not have storage objects
www.googleapis.com/upload/storage/v1/b/http://my_appl//my-bucket/my-file
Error 404
www.googleapis.com/storage/v1/b/my-bucket/my-file
Error 404
https://www.googleapis.com/storage/v1/b/my-bucket/o/my-file
"code": 401,
"message": "Anonymous caller does not have storage.objects.get access to my-bucket/my-file
https://www.googleapis.com/storage/v1/b/my-bucket/o/my-file/place?key=my-key
Not found
Am I composing the URL incorrectly?
http://storage.googleapis.com/my-bucket/my-file
This one is fine. However, unless an object is publicly readable, you'll need to authorize the request, which means either including an "Authorization" header in the request with appropriate credentials or signing the URL with the private key of a service account.
https://www.googleapis.com/download/storage/v1/b/my-bucket/o/my-file?key=my-key&alt=media
This is also okay, but an API key does not provide authentication. You'll still need an Authorization header unless the object is publicly viewable.
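For illustration, an authorized request can reuse the caller's own credentials for that header; this is a hedged sketch, and the bucket and object names are the placeholders from the question:
# Fetch a private object with an OAuth access token instead of calling anonymously.
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://www.googleapis.com/download/storage/v1/b/my-bucket/o/my-file?alt=media"
In production code you would attach a service account's credentials (or hand out a signed URL) rather than a personal gcloud token, but the shape of the request is the same.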

storage.buckets().insert() for xyz.domain.com bucket not working

I am using the GCS JSON API via Java and a Service Account. My code to insert objects, delete objects, and copy objects all works great. And I can successfully create new buckets with storage.buckets().insert() so long as the bucket name is NOT based on my domain name (i.e. creating bucket “454393-test-bucket” works, but creating bucket "test334.domain.com" does NOT work). Note that I CAN create domain name based buckets from the developer console when logged in as the project owner with no problem, and can also later insert/copy/delete objects from that bucket via the service account.
There must be something basic I am doing wrong.
Here is my code:
Bucket newBucket = new Bucket().setName(bucketName);
storage.buckets().insert(Utils.GAE_PROJECT_NAME, newBucket).execute();
Here is the error I get:
Uncaught exception from servlet
com.google.api.client.googleapis.json.GoogleJsonResponseException: 403
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "The bucket you tried to create is a domain name owned by another user.",
    "reason" : "forbidden"
  } ],
  "message" : "The bucket you tried to create is a domain name owned by another user."
}
The account that verified ownership of the domain must be the same account that creates the bucket. If your account is the verified owner of your domain, your account must be used to create the bucket (and not a service account owned by a project that your account owns). When you create buckets from the developer console, you're using your own account, which has that verification.
Good news, though: you can add your service account to the list of owners of the domain, and it will gain this permission. On Webmaster Central, you can add and remove owners for domains.
Go to https://www.google.com/webmasters/verification/home?hl=en
Click on your domain
Click "Add an owner"
Put in the email address of the service account.
More on this is available in the Google Cloud Storage documentation: https://developers.google.com/storage/docs/bucketnaming#verification