I use the Enstratus management/orchestration platform and I need to add the encoding key from my Azure account into it, but I can't seem to find it. Can anyone point me in the right direction, please?
Thanks
Read all about it here. The basic gist is that you'll have to create your own certificate locally and upload it to your Azure account.
http://docs.enstratius.com/clouds/azure/configuration.html?highlight=azure
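For the "create your own certificate" step, here is a minimal sketch using openssl (the file names and subject are my own illustrative choices, not mandated by the guide):

```shell
# Generate an RSA private key and a self-signed certificate (valid 1 year)
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -subj "/CN=enstratus-azure" \
  -keyout azure-mgmt.key -out azure-mgmt.pem

# Convert the PEM certificate to DER (.cer), the format the Azure portal
# expects when you upload a management certificate
openssl x509 -in azure-mgmt.pem -outform der -out azure-mgmt.cer
```

The .cer file is what you upload to Azure as a management certificate; keep azure-mgmt.key private for the Enstratus side.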
For the new version of Enstratus, Dell Cloud Manager (DCM), follow this procedure:
To add your Amazon account, you need the following information:
Account number
Access key
Secret access key
AWS certificate
AWS private key
If you don't have that information, follow these instructions to get your AWS account information.
To generate a new access keypair and X.509 certificate via the AWS web console:
Navigate to "Security Credentials" under your AWS account name.
Click on Account Identifiers at the bottom to find your AWS Account ID. Copy it to a notepad.
Click on "Create New Access Key". Note that AWS allows only two active keypairs.
Copy the Access Key ID and Secret Access Key.
Click on "X.509 Certificates".
Download the Private Key File and X.509 Certificate.
Return to DCM and enter your AWS Account ID, Access Key ID, Secret Access Key, Certificate and Private Key.
Click Connect Cloud Account.
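Before pasting the certificate and private key into DCM, you can check locally that the two files actually belong together by comparing their RSA moduli, a standard openssl technique (the demo pair generated below stands in for your downloaded cert-*.pem / pk-*.pem files):

```shell
# Stand-ins for the downloaded AWS X.509 certificate and private key
openssl req -x509 -nodes -days 1 -newkey rsa:2048 \
  -subj "/CN=demo" -keyout pk-demo.pem -out cert-demo.pem

# A certificate and a private key match when their moduli are identical
cert_mod=$(openssl x509 -noout -modulus -in cert-demo.pem | openssl md5)
key_mod=$(openssl rsa -noout -modulus -in pk-demo.pem | openssl md5)
[ "$cert_mod" = "$key_mod" ] && echo "certificate and key match"
```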
Related
To configure Terraform to use my IBM Cloud account, I need to generate an API key and a classic infrastructure key, but this gives the error "API key could not be created".
What settings need to be changed in order for this to work?
Your cloud administrator needs to grant you Service ID creator rights in the IAM Identity Service.
Firstly, I am trying to connect my Azure Key Vault to Encryption at Rest using your Key Management. I followed the guide in the MongoDB documentation: https://docs.atlas.mongodb.com/security-azure-kms/
[Image: MongoDB Atlas setup]
What I've done so far, which hasn't worked:
I have set up the application and added the client secret; the application has the "Key Vault Reader" role assigned to it through the subscription.
I have set up the Key Vault under the same subscription as above, with its own resource group to match, and generated the key.
The key has all operations enabled.
So I have the Application with Vault Key Reader access and the Key Vault containing the key.
Client(Application) ID is filled with info from the application.
Tenant ID is filled with tenant ID from the application.
The secret created and stored in the application is added (the secret value, not the ID).
Subscription ID copied from key vault is added.
Resource group name copied from key vault is added.
Key Vault Name copied from key vault is added.
Lastly, the Key Identifier is copied from the vault and added.
Still I get this error - is there something wrong with the way I went about it?
I feel I have tried every combination of setup, but it seems the credentials are set up in a wrong way, which I do not understand, since they were all copied directly from Azure.
"We were unable to connect to your Azure Key Vault account. Please check your credentials and try again."
As per @Matt Small's suggestion in the comments, if we enable Azure Key Vault logging, we can check whether the issue is wrong credentials, the access policy, or something network related.
If the issue is with access, we can assign the Key Vault Contributor role, or add an access policy granting Get and List permissions on keys and secrets to the service principal (the app registration).
As per @Hurup's comment, the Key Vault Reader role was not enough, and the role should not be scoped to the resource group. Giving the application a higher vault role scoped at the subscription level can resolve the issue.
I had the exact same issue. In the end I figured out that the role assignment from "Have an Active Directory Application with the role of Azure Key Vault Reader assigned to it" did not have to be created on the Active Directory app, but on the Key Vault.
I followed the manual from MongoDB and then, as a final step, did the following:
Go to Key Vault
Select key vault
Select Access Control (IAM)
Select Grant access to this resource
Select role Key Vault Reader
Assign access to: User, group, or service principal
Click + Select members
Type the application name
Review and assign...
After this I could save the settings on MongoDB to use encryption at rest.
This is a technical question on comprehension:
When I want a cloud server (e.g. GitHub Actions, Azure DevOps, or GitLab CI/CD) to build and publish an app to any of the app stores ... isn't it necessary then that I upload my private key to these servers' key vault, so they can sign my app on my behalf?
Isn't that concept a bit risky?
I mean, I was taught that my private keys should never leave my machines.
What if I accidentally misconfigure the security settings on the uploaded key? What if some black hat gets hold of the key and abuses it? I mean, with each build process, the private key is getting copied from the vault to the build runner, usually residing somewhere else.
What are the techniques used to ensure that private keys are kept safe on a public server? Is there an official audit performed on these providers?
Should I rather use different Authenticode certificates for each of the above providers? Or will a single certificate be resilient enough?
I couldn't find a technical discussion on this question, only marketing docs. Has this security concern been scientifically scrutinized?
It is safe to use certificates in Azure DevOps: Azure DevOps encrypts the certificate and then uses it in the build pipeline.
We can use Azure Key Vault to protect encryption keys and secrets like certificates, connection strings, and passwords in the cloud. You can refer to this doc for more details.
We can also save the certificate file in Library -> Secure Files; anyone who wants to access or use it needs sufficient permission.
I found some samples of using certificates in Azure DevOps; you can check them out.
If you are using an SSL certificate thumbprint, you can save it in a variable and then mark the variable as secret via the "lock" icon at the end of the row.
If you are using a PFX certificate, you can refer to this doc.
Update1
GitHub action
We can save the certificate in GitHub Secrets; GitHub also encrypts it. The value cannot be retrieved after it is saved in Secrets, and it is printed as *** in the log.
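As a concrete sketch of how such a secret is typically consumed, a GitHub Actions workflow step might restore a base64-encoded certificate from Secrets onto the runner (the secret name SIGNING_CERT_B64 and the file name are my own illustrative choices):

```yaml
steps:
  - name: Restore signing certificate
    # The secret is decrypted only for this job; GitHub masks it as *** in logs
    run: echo "${{ secrets.SIGNING_CERT_B64 }}" | base64 -d > signing.pfx
```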
I want to use Google Cloud KMS for asymmetric signing. To complete setup with the destination provider, I need to send them a CSR signed with the private key stored in Google. I've found some examples of doing this using Java or Go, but I don't need to do it programmatically and I don't know those languages anyway. Ideally I'm looking for something command-line based using the SDK.
I gave up doing it this way; I generated the private key and CSR on a local, trusted machine, then imported the key into Google.
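For anyone trying the same local route, it can be done entirely with openssl: generate the key on the trusted machine, create the CSR there, and send the CSR to the provider (the EC P-256 algorithm and subject below are illustrative choices; P-256 is one of the asymmetric signing algorithms Cloud KMS supports):

```shell
# Generate an EC P-256 private key on the trusted local machine
openssl ecparam -name prime256v1 -genkey -noout -out signing.key

# Create the CSR to send to the destination provider, signed with that key
openssl req -new -key signing.key -subj "/CN=example-signer" -out signing.csr

# Sanity-check the CSR's signature before sending it off
openssl req -in signing.csr -noout -verify
```

The private key can then be wrapped and imported into Cloud KMS with `gcloud kms keys versions import`, per Google's key-import documentation.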
For every Google Compute instance, there is a default service account like this:
1234567890123-compute@developer.gserviceaccount.com
I can create my instance with the proper scope (i.e. https://www.googleapis.com/auth/devstorage.full_control) and use this account to make API requests.
On this page: https://cloud.google.com/storage/docs/authentication#service_accounts it says:
Every project has a service account associated with it, which may be used for authentication and to enable advanced features such as Signed URLs and browser uploads using POST.
This implies that I can use this service account to create Signed URLs. However, I have no idea how to create a signed URL with this service account, since I can't seem to get the private key (.p12 file) associated with it.
I can create a new, separate service account from the developer console, and that has the option of downloading a .p12 file for signing, but the project level service accounts do not appear under the "APIs and auth / Credentials" section. I can see them under "Project / Permissions", but I can't do anything with them there.
Am I missing some other way to retrieve the private key for these default accounts, or is there no way to sign urls when using them?
You can use the p12 key of any of your service accounts while you're authenticated through your main account, a GCE service account, or another service account that has appropriate permissions on the bucket and the file.
In this case, just create a service account, download its p12 key, and use the following command to sign your URL:
$ gsutil signurl -d 10m privatekey.p12 gs://bucket/foo
Alternatively, you can authenticate as a different service account using the following command:
$ gcloud auth activate-service-account SERVICE_ACCOUNT_EMAIL --key-file key.p12
You can list and switch your accounts using these commands:
$ gcloud auth list
$ gcloud config set account ACCOUNT_EMAIL