Getting "Invalid Azure Credentials" trying to enable Mongo Atlas encryption at rest - mongodb

I'm trying to configure Atlas with Customer Key Management,
which seems fairly straightforward. However, when I create a new service principal/app registration in Azure to connect MongoDB Atlas to my Azure Key Vault, I get an "Invalid Azure Credentials" error.
I created the service principal following this guide, without any redirect URI. I tried all the different account types, but none worked for me. After creation, I created a new client secret and used it in the 'Secret' field in the MongoDB Atlas UI.
What am I doing wrong?

I forgot to add the service principal as a Key Vault Reader to the subscription that holds the key vault.
Relevant documentation: https://www.mongodb.com/docs/atlas/security-azure-kms/#prerequisites
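A quick way to rule out a bad secret or ID is to authenticate as the service principal yourself and read a key, before involving Atlas. This is a minimal sketch using the Azure SDK for Python; every identifier is a placeholder for the value you pasted into the Atlas form:

```python
# pip install azure-identity azure-keyvault-keys
from azure.identity import ClientSecretCredential
from azure.keyvault.keys import KeyClient

# Placeholders: use the exact values entered in the Atlas UI.
credential = ClientSecretCredential(
    tenant_id="<directory-tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret-value>",  # the secret's value, not its ID
)

key_client = KeyClient(
    vault_url="https://<key-vault-name>.vault.azure.net/",
    credential=credential,
)

# If this call fails with an authorization error, Atlas will fail
# with the same credentials until the role assignment is in place.
key = key_client.get_key("<key-name>")
print(key.id)  # this is the Key Identifier that Atlas asks for
```

If this raises an authentication or authorization error, fix the service principal or its role assignment first; the Atlas UI is only surfacing the same failure.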

Related

IBM Cloud API key could not be created

To configure Terraform to use my IBM Cloud account, I need to generate an API key and a Classic Infrastructure key. This gives the error "API key could not be created".
What settings need to be changed for this to work?
Your cloud administrator needs to grant you the Service ID creator role on the IAM Identity Service.

How to use Azure Data Factory, Key Vaults and ADF Private Endpoints together

I've created a new ADF instance on Azure with Managed Virtual Network integration enabled.
I planned to connect to Azure Key Vault to retrieve credentials for my pipeline's source and sink systems using a Key Vault private endpoint. I was able to create the private endpoint successfully using Azure Data Factory Studio. I have also created an Azure Key Vault linked service.
However, when I try to configure other linked services for the source and destination systems, the only option available for retrieving credentials from Key Vault is the AKV linked service. I'm not able to select the related private endpoint anywhere (please see the screen below).
Am I missing something?
Are there any additional configuration steps required? Is the scenario I've described possible at all?
Any help will be appreciated!
UPDATE: Screenshot comparing two linked services (one with managed network and private endpoint selected, and another where I'm not able to set these options up):
Managed Virtual Network integration is enabled, but make sure to check which region you are using; unfortunately, the ADF managed virtual network is not supported in East Asia.
I have tried this in my environment, and even there that option is not available.
From the information I have gathered, even if you create a private endpoint for Key Vault, this column is always shown as blank. It only validates the URL format and doesn't perform any network operation.
As per the official documentation, if you want to use a new linked service, try creating it for other database services such as Azure SQL or Azure Synapse instead of Key Vault, as below.
For your Reference:
Store credentials in Azure Key Vault - Azure Data Factory | Microsoft Docs
Azure Data Factory and Key Vault - Tech Talk Corner
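As a side note on what the Key Vault linked service actually contains: it only stores the vault's base URL. A hedged sketch of creating the same linked service with the Data Factory management SDK for Python (subscription, resource group, factory, and vault names are placeholders):

```python
# pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureKeyVaultLinkedService,
    LinkedServiceResource,
)

# Placeholder subscription ID; DefaultAzureCredential picks up your login.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The AKV linked service definition only carries the vault URL.
akv_linked_service = LinkedServiceResource(
    properties=AzureKeyVaultLinkedService(
        base_url="https://<key-vault-name>.vault.azure.net/"
    )
)

client.linked_services.create_or_update(
    "<resource-group>", "<data-factory-name>", "AzureKeyVault1", akv_linked_service
)
```

That is consistent with the behaviour described above: the private-endpoint column stays blank because nothing network-related is stored or checked in this definition.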

Encryption at Rest, MongoDB, Azure Key Vault - unable to connect to your Azure Key Vault account

Firstly, I am trying to connect my Azure Key Vault to Encryption at Rest using your Key Management. I followed the guide in the MongoDB documentation: https://docs.atlas.mongodb.com/security-azure-kms/
[image from MongoDB Atlas setup]
What I've done so far, which hasn't worked:
I have set up the application and added the client secret; the application has the role "Azure Key Vault Reader" assigned to it through the subscription.
I have set up the Key Vault under the same subscription as above, with its own resource group to match, and generated the key.
The key has all operations enabled.
So I have the application with Key Vault Reader access and the Key Vault containing the key.
The Client (Application) ID is filled in with the value from the application.
The Tenant ID is filled in with the tenant ID from the application.
The Secret created and stored in the application is added (the value, not the ID).
The Subscription ID copied from the key vault is added.
The Resource group name copied from the key vault is added.
The Key Vault name copied from the key vault is added.
Lastly, the Key Identifier is copied from the vault and added.
Still I get this error - is there something wrong with the way I went about it?
I feel I have tried every combination of setup, but it seems like the credentials are set up in the wrong way, which I do not understand, since everything was copied directly from Azure.
"We were unable to connect to your Azure Key Vault account. Please check your credentials and try again."
"We were unable to connect to your Azure Key Vault account. Please
check your credentials and try again."
As per @Matt Small's suggestion in the comments, if we enable Azure Key Vault logging, we can check whether the issue is caused by wrong credentials, by the access policy, or is network related.
If the issue is with access, we can assign the Key Vault Contributor role, or add an access policy granting Get and List permissions for Keys and Secrets to the service principal (app registration).
As per @Hurup's comment, the Azure Key Vault Reader role was not enough, and the role should not be scoped to the resource group. Giving the application a higher vault role and assigning it at the subscription scope can resolve the issue.
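If the vault uses access policies rather than Azure RBAC, the policy for the app registration can also be added programmatically. A sketch with the Key Vault management SDK for Python, assuming the Get/List permissions suggested above plus the key wrap/unwrap permissions Atlas needs for its master keys (check the prerequisites page linked earlier for the exact list); all IDs and names are placeholders:

```python
# pip install azure-identity azure-mgmt-keyvault
from azure.identity import DefaultAzureCredential
from azure.mgmt.keyvault import KeyVaultManagementClient
from azure.mgmt.keyvault.models import (
    AccessPolicyEntry,
    Permissions,
    VaultAccessPolicyParameters,
    VaultAccessPolicyProperties,
)

client = KeyVaultManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = AccessPolicyEntry(
    tenant_id="<directory-tenant-id>",
    # Object ID of the service principal (enterprise application), not the client ID.
    object_id="<service-principal-object-id>",
    permissions=Permissions(
        keys=["get", "list", "wrapKey", "unwrapKey"],
        secrets=["get", "list"],
    ),
)

client.vaults.update_access_policy(
    "<resource-group>",
    "<key-vault-name>",
    "add",  # append to the vault's existing access policies
    VaultAccessPolicyParameters(
        properties=VaultAccessPolicyProperties(access_policies=[policy])
    ),
)
```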
I had the exact same issue. In the end I figured out that the role assignment from the prerequisite 'Have an Active Directory Application with the role of Azure Key Vault Reader assigned to it' did not have to be created on the Active Directory app, but on the Key Vault.
I followed the manual from MongoDB and then, as a final step, did the following:
Go to Key Vault
Select key vault
Select Access Control (IAM)
Select Grant access to this resource
Select role Key Vault Reader
Assign access to: User, group, or service principal
+ Select members
Type the application name
Review and assign...
After this I could save the settings on MongoDB to use encryption at rest.
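To double-check that whatever role or policy you ended up with actually lets the application use the key the way Atlas does, you can run a wrap/unwrap round trip against the Key Identifier with the same client ID, tenant ID, and secret. A minimal sketch; all identifiers are placeholders:

```python
# pip install azure-identity azure-keyvault-keys
import os

from azure.identity import ClientSecretCredential
from azure.keyvault.keys.crypto import CryptographyClient, KeyWrapAlgorithm

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-application-id>",
    client_secret="<client-secret-value>",
)

# The full Key Identifier URL copied from the vault, e.g.
# https://<key-vault-name>.vault.azure.net/keys/<key-name>/<key-version>
crypto = CryptographyClient("<key-identifier-url>", credential)

plaintext_key = os.urandom(32)  # throwaway 256-bit key material
wrapped = crypto.wrap_key(KeyWrapAlgorithm.rsa_oaep, plaintext_key)
unwrapped = crypto.unwrap_key(KeyWrapAlgorithm.rsa_oaep, wrapped.encrypted_key)

assert unwrapped.key == plaintext_key
print("wrap/unwrap succeeded - the credentials and key permissions look usable")
```

If the wrap or unwrap call is denied, the remaining problem is the role or permission on the vault or key rather than the values entered in Atlas.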

Can't use AWS IAM Roles with KMS Providers for MongoDB Client Side Field Level Encryption?

I am using EC2 instance profile credentials to allow the AWS EC2 instance to access other AWS services.
Recently, I implemented MongoDB Client-Side Field Level Encryption (CSFLE), using AWS KMS as the KMS provider. The MongoDB documentation for CSFLE mentions that the KMS provider should have a secret key and access key that map to an IAM user.
This way I will have to create another IAM user and then maintain those credentials separately. A simpler (and more secure) way would have been to use the DefaultCredentialsProvider from software.amazon.awssdk:auth, which could have used the credentials from the instance profile to access KMS. But this does not work for me, and MongoClient fails because KMS rejects the security token used.
Is there any reason behind not allowing this way of accessing KMS?
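For reference, the setup the documentation describes boils down to putting the IAM user's static credentials into the kmsProviders document. Sketched here with the Python driver rather than the Java one from the question (the shape of the document is the same); the connection string, region, and key ARN are placeholders:

```python
# pip install "pymongo[encryption]"
from bson.binary import STANDARD
from bson.codec_options import CodecOptions
from pymongo import MongoClient
from pymongo.encryption import ClientEncryption

# CSFLE as documented: long-lived credentials of a dedicated IAM user.
kms_providers = {
    "aws": {
        "accessKeyId": "<iam-user-access-key-id>",
        "secretAccessKey": "<iam-user-secret-access-key>",
    }
}

client = MongoClient("<connection-string>")

client_encryption = ClientEncryption(
    kms_providers,
    "encryption.__keyVault",  # key vault namespace: <db>.<collection>
    client,
    CodecOptions(uuid_representation=STANDARD),
)

# Create a data encryption key wrapped by the customer master key in KMS.
data_key_id = client_encryption.create_data_key(
    "aws",
    master_key={"region": "<aws-region>", "key": "<kms-key-arn>"},
)
print(data_key_id)
```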
As with all projects, the initial implementation of CSFLE had a defined scope. This scope did not include the ability to use instance roles for credential identification.
I suggest you submit your request to https://feedback.mongodb.com/ for consideration.

Having an issue determining the credentials used when connecting to SoftLayer Object Storage using SFTP

I'm having trouble connecting to the Bluemix Object Store using the instructions presented at this link: https://knowledgelayer.softlayer.com/procedure/connect-object-storage-using-sftp
It's unclear to me what the username and account ID are, so I would appreciate it if someone could clarify.
The instructions are valid.
Where can I find the values for SLOS/IBMOS etc.?
I do not have access to the SoftLayer customer portal, as this service was created in Bluemix.
I can confirm that an SFTP server is listening at the appropriate region endpoint.
Brien, it is not possible to use SFTP to access Bluemix Object Storage if you create it from the Services catalog area of the Bluemix UI:
https://console.ng.bluemix.net/catalog/services/object-storage
This one can be accessed via the Swift CLI or the REST API.
To use SFTP to access your Object Storage, you need to create it from the Infrastructure area of the Bluemix UI - that is, the legacy SoftLayer that is now integrated with Bluemix.
https://console.ng.bluemix.net/catalog/infrastructure/object_storage/
Also, to create the Object Storage from the Infrastructure catalog, you first need to link your Bluemix and SoftLayer accounts:
https://console.ng.bluemix.net/docs/admin/softlayerlink.html
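If you keep the instance from the Services catalog, here is a minimal sketch of the Swift API route with python-swiftclient, assuming Keystone v3 authentication and taking every value from the instance's Service Credentials in the Bluemix UI (the field names below are placeholders for those credentials):

```python
# pip install python-swiftclient python-keystoneclient
import swiftclient

# All values come from the Object Storage instance's "Service Credentials".
conn = swiftclient.Connection(
    key="<password>",
    authurl="<auth_url>/v3",
    auth_version="3",
    os_options={
        "project_id": "<projectId>",
        "user_id": "<userId>",
        "region_name": "<region>",
    },
)

# Listing containers is a quick way to confirm the credentials work.
headers, containers = conn.get_account()
for container in containers:
    print(container["name"])
```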