I am building a Service Fabric application and I want to secure secrets in Azure Key Vault. I implemented the same steps I use for App Service, but it doesn't work. I'd appreciate your reply.
For App Service:
1. Configure Key Vault in the Main method.
2. Enable a system-assigned managed identity on the App Service (for Service Fabric, this is applied on the virtual machine scale set).
3. Add an access policy on the Key Vault.
1) Azure configuration (VM Scale set + Key vault):
Login-AzureRmAccount # Login into Azure account
$targetRg = "testfabric-rg" # Target resource group name
$targetVmss = "jxewcyinq" # Target virtual machine scale set name
$targetKeyVault = "az-ure-two20190115153549" # Target Key Vault name
# 1. Enable Managed Identity for target Virtual Machine Scale Set
Update-AzureRmVmss `
-ResourceGroupName $targetRg `
-VMScaleSetName $targetVmss `
-IdentityType SystemAssigned
# 2. Retrieve virtual machine scale set
$vmss = Get-AzureRmVmss `
-ResourceGroupName $targetRg `
-Name $targetVmss
# 3. Create new Key vault access policy allowing Virtual Machine Scale Set to read secrets by their IDs
Set-AzureRmKeyVaultAccessPolicy `
-VaultName $targetKeyVault `
-ObjectId $vmss.Identity.PrincipalId `
-PermissionsToSecrets Get # set only necessary permissions!
2) Get a Key Vault secret from C#:
// https://www.nuget.org/packages/Microsoft.Azure.KeyVault/
using Microsoft.Azure.KeyVault;
// https://www.nuget.org/packages/Microsoft.Azure.Services.AppAuthentication
using Microsoft.Azure.Services.AppAuthentication;
public async Task<string> GetSecretById(string id)
{
// URL of the target Key Vault
var keyVaultUrl = "https://az-ure-two20190115153549.vault.azure.net";
var azureServiceTokenProvider = new AzureServiceTokenProvider();
var keyVaultClient = new KeyVaultClient(
new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
var secret = await keyVaultClient.GetSecretAsync($"{keyVaultUrl}/secrets/{id}");
return secret.Value;
}
Related
I'm fairly new to Service Fabric, so I'm not sure if this is an issue with the cmdlet or if this is a miss on my part. I am using Az.ServiceFabric module version 2.0.2 and the Az module version 3.8.0.
I am trying to use the Add-AzServiceFabricClusterCertificate cmdlet to add a secondary certificate that I've already created in my Azure KeyVault to my cluster. When I run the cmdlet, it fails with this error (running with Debug gave me more stack detail):
DEBUG: AzureQoSEvent: CommandName - Add-AzServiceFabricClusterCertificate; IsSuccess - False; Duration -
00:00:07.3059582;; Exception - System.NullReferenceException: Object reference not set to an instance of an object.
at Microsoft.Azure.Commands.ServiceFabric.Commands.ServiceFabricClusterCmdlet.GetClusterType(Cluster
clusterResource)
at Microsoft.Azure.Commands.ServiceFabric.Commands.AddAzureRmServiceFabricClusterCertificate.ExecuteCmdlet()
at Microsoft.WindowsAzure.Commands.Utilities.Common.AzurePSCmdlet.ProcessRecord();
Looking at the code for this cmdlet, I noticed that it's probably failing because the cluster resource that gets passed into GetClusterType does not have its Certificate member populated, so it fails when it tries to check Certificate.Thumbprint and Certificate.ThumbprintSecondary:
internal ClusterType GetClusterType(Cluster clusterResource)
{
if (string.IsNullOrWhiteSpace(clusterResource.Certificate.Thumbprint) &&
string.IsNullOrWhiteSpace(clusterResource.Certificate.ThumbprintSecondary))
{
return ClusterType.Unsecure;
}
else
{
return ClusterType.Secure;
}
}
The cluster that gets passed into GetClusterType is retrieved in the same manner as in the Get-AzServiceFabricCluster cmdlet, so when I run that cmdlet for the cluster that I'm trying to add the certificate to, I noticed that my Certificate field is empty in the response. I'm guessing that's what's causing the NullRef exception. Here's that relevant snippet:
AzureActiveDirectory :
TenantId : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ClusterApplication : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ClientApplication : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Certificate :
CertificateCommonNames : Microsoft.Azure.Management.ServiceFabric.Models.ServerCertificateCommonNames
ClientCertificateCommonNames :
ClientCertificateThumbprints :
I'm wondering if it's expected that the Certificate field would be empty when I'm using the Get-AzServiceFabricCluster cmdlet, and if that is indeed the cause of my Add-AzServiceFabricClusterCertificate cmdlet failing. When I look at the cluster's Security blade in the Azure Portal, I do see the Primary Cluster Certificate with which I originally created the cluster, and this is the cert that I use when deploying and doing other cluster operations. However, I did notice that the cert thumbprint field is empty when viewing the certificate from the portal. I would expect to see this certificate when using Get-AzServiceFabricCluster, but it comes up empty. Is it possible to fix this certificate missing from my Get-AzServiceFabricCluster output through the portal or with another cmdlet?
It looks like your cluster is configured to find certificates by common name, rather than thumbprint. I'm guessing this based on the fact that your portal doesn't show a thumbprint against the certificate, in addition to the snippet you have posted.
If this is the case, there's no need to update your cluster configuration with a new certificate when the old certificate has expired. Instead, you only need to install the new certificate into your VMSS vault. Once you add the new certificate to the VMSS, Service Fabric will automatically use the later-expiring certificate.
You must always ensure you have at least one valid certificate installed on your VMSS with the common name configured in your cluster.
PowerShell to upload the certificate to Key Vault and install it onto the VMSS:
$subscriptionId = "sub-id"
$vmssResourceGroupName = "vmss-rg-name"
$vmssName = "vmss-name"
$vaultName = "kv-name"
$primaryCertName = "kv-cert-name"
$certFilePath = "...\.pfx"
$certPassword = ConvertTo-SecureString -String "password" -AsPlainText -Force
# Sign in to your Azure account and select your subscription
Login-AzAccount -SubscriptionId $subscriptionId
# Update primary certificate within the Key Vault
$primary = Import-AzKeyVaultCertificate `
-VaultName $vaultName `
-Name $primaryCertName `
-FilePath $certFilePath `
-Password $certPassword
$certConfig = New-AzVmssVaultCertificateConfig -CertificateUrl $primary.SecretId -CertificateStore "My"
# Get VM scale set
$vmss = Get-AzVmss -ResourceGroupName $vmssResourceGroupName -VMScaleSetName $vmssName
# Add new certificate version
$vmss.VirtualMachineProfile.OsProfile.Secrets[0].VaultCertificates.Add($certConfig)
# Update the VM scale set
Update-AzVmss -ResourceGroupName $vmssResourceGroupName -Verbose `
-Name $vmssName -VirtualMachineScaleSet $vmss
For more info, I wrote a blog post on switching from thumbprint to common name.
The official docs are also a good reference.
I'm trying to create a powershell script to backup a SQL database on Azure to a storage account as below,
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
    -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password
This is the document I'm following:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export
I assume the following:
$ResourceGroupName - my azure resource group
$ServerName - db server name
$DatabaseName - database name
**$StorageKeytype - NOT SURE WHAT VALUE SHOULD BE PLACED HERE**
**$StorageKey - I'm hoping this is one of the access keys under the azure storage account**
$BacpacUri - Azure storage account bacpac URI path
Please advise what parameters need to be passed here.
StorageKey: Specifies the access key for the storage account.
StorageKeyType: Specifies the type of access key for the storage account. The acceptable values for this parameter are:
StorageAccessKey - this value uses a storage account key.
SharedAccessKey - this value uses a Shared Access Signature (SAS) key.
For more details, refer to this link.
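Putting it together, a minimal sketch assuming you use a storage account key (the storage account, container, and .bacpac names below are placeholders):
# Assumed: a storage account "mystorageaccount" with a "backups" container already exists
$StorageKeytype = "StorageAccessKey" # or "SharedAccessKey" if $StorageKey holds a SAS token
$StorageKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $ResourceGroupName -Name "mystorageaccount")[0].Value
$BacpacUri = "https://mystorageaccount.blob.core.windows.net/backups/mydb.bacpac"
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
    -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password
# Optionally poll the export status
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink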
How would you go about properly securing a DSC configuration that uses credentials in Azure Automation?
E.g.:
configuration MyServer {
param(
[Parameter(Mandatory=$true)][PSCredential]$MyCredential
);
# Some configuration using credentials
}
Normally I'd set up a public key and a proper certificate installed on each node and pass along CertificateFile and Thumbprint to ConfigurationData when compiling the configuration documents.
In Azure Automation I can't find any good solution.
The documentation says Azure Automation encrypts the entire MOF on its own: https://azure.microsoft.com/en-us/documentation/articles/automation-certificates/ and the article specifies to use PSAllowPlainTextCredentials.
When you then register a node against its pull server and pull its configuration, the password can be read out in plain text after the configuration has been pulled down/updated, as long as you have local admin or read access to the temp folder. This is not good from a security perspective.
What I'd like, ideally, would be to upload such a public key/certificate to the Azure Automation credentials and use it as part of ConfigurationData when starting the compilation job.
However, today "CertificateFile" expects a path and not an AutomationCertificate, so I cannot see a way to start the compilation job with any public key present in Azure Automation. I can't see any way of referring to my assets certificate when running the job.
Any ideas if this is possible in the current state of Azure Automation and the way they work with DSC/pull, to secure it properly using either the asset store in Azure Automation or Azure Key Vault?
You should create an Azure Automation Credential asset and reference it in the configuration like so:
# Compile mof
$ConfigurationData = @{
    AllNodes = @(
        @{
            NodeName = $nodeName
            PSDscAllowPlainTextPassword = $true
        }
    )
}
$Parameters = @{
    "nodeName"   = $nodeName
    "credential" = $credName # NOTE: this is only the name of the Azure Automation credential asset; Azure Automation securely pulls the credential and uses it during compilation
}
Start-AzureRmAutomationDscCompilationJob -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName `
-ConfigurationName $configurationName -Parameters $Parameters -ConfigurationData $ConfigurationData
You should not worry about PSDscAllowPlainTextPassword, since Azure Automation encrypts everything at rest for you; DSC just doesn't know that (so you have to supply the setting to the DSC engine).
And in DSC you should have something like:
Configuration name
{
Param (
[Parameter(Mandatory)][ValidateNotNullOrEmpty()][String]$nodeName,
[Parameter(Mandatory)][ValidateNotNullOrEmpty()][pscredential]$credential
)
Import-DscResource -Module modules
Node $nodeName {DOSTUFF}
}
The correct way to pass a credential to a DSC file from Azure Automation is to use an Azure Automation Credential.
Then inside your DSC file you use the command Get-AutomationPSCredential
Example:
Configuration BaseDSC
{
Import-DscResource -ModuleName xActiveDirectory
Import-DscResource -ModuleName PSDesiredStateConfiguration
Import-DscResource -ModuleName XNetworking
$Credential = Get-AutomationPSCredential -Name "CredentialName"
Node $AllNodes.Nodename
{ ...
The credential is stored encrypted in Azure Automation, and is put into the encrypted MOF file in Azure Automation when you run the compilation job.
Additionally, the password can be updated in Azure Automation and then updated in the MOFs by just recompiling.
The password is not able to be retrieved in clear text from Azure.
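For example, a sketch of that update-and-recompile flow (the resource group, automation account, credential, and configuration names below are placeholders):
# Update the credential asset with the new password, then recompile so the MOF picks it up
$newCred = Get-Credential # or build a PSCredential programmatically
Set-AzureRmAutomationCredential -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation-account" -Name "CredentialName" -Value $newCred
# Recompile, supplying -Parameters/-ConfigurationData as in your original compilation job
Start-AzureRmAutomationDscCompilationJob -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation-account" -ConfigurationName "BaseDSC"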
Use a secure credential to create a user on a Windows server and add it to the Administrators group using DSC.
**Solution (PowerShell DSC)**
First, create a credential in the Automation account from the Azure portal or using any Azure module (see the PowerShell example below):
Home > Resource group > ... > Automation account > Credentials
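For example, a credential asset could also be created from PowerShell (the resource group, automation account, and credential names below are placeholders):
$pass = ConvertTo-SecureString -String "P@ssw0rd!" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("localadmin", $pass)
New-AzureRmAutomationCredential -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation-account" -Name "WindowsAdminCred" -Value $cred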
Configuration user_windows_user
{
param
(
[Parameter()][string]$username,
[Parameter()][string]$azurePasswordCred # name (string) of the credential asset
)
$passwordCred = Get-AutomationPSCredential -Name $azurePasswordCred
Node "localhost"
{
User UserAdd
{
Ensure = "Present" # To ensure the user account does not exist, set Ensure to "Absent"
UserName = $username
FullName = "$username-fullname"
PasswordChangeRequired = $false
PasswordNeverExpires = $false
Password = $passwordCred # This needs to be a credential object
}
Group AddtoAdministrators
{
GroupName = "Administrators"
Ensure = "Present"
MembersToInclude = @($username)
}
}
} # end of Configuration
$cd = @{
    AllNodes = @(
        @{
            NodeName = 'localhost'
            PSDscAllowPlainTextPassword = $true
        }
    )
}
Upload the DSC file under Azure Automation > Configurations.
Compile the configuration (providing the inputs: the username and the credential asset name as a string).
Add the configuration to a node and wait for the configuration to be applied. A PowerShell sketch of these steps follows.
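A sketch of the upload/compile/assign steps from PowerShell (the resource group, automation account, file, and VM names below are placeholders):
# Upload and publish the configuration
Import-AzureRmAutomationDscConfiguration -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation-account" -SourcePath ".\user_windows_user.ps1" -Published -Force
# Compile it, passing the username and the credential asset name
Start-AzureRmAutomationDscCompilationJob -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation-account" -ConfigurationName "user_windows_user" `
    -Parameters @{ username = "localadmin"; azurePasswordCred = "WindowsAdminCred" } -ConfigurationData $cd
# Assign the resulting node configuration to an Azure VM
Register-AzureRmAutomationDscNode -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation-account" -AzureVMName "my-vm" `
    -NodeConfigurationName "user_windows_user.localhost"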
Has anyone installed either Microsoft Antimalware (Malware Protection) or Symantec Endpoint Protection on the Service Fabric VMs? Azure Security Center says it's possible, but I haven't been able to get it to work.
When you create the cluster, there is no extension option to add malware protection (that I could find). After you create the cluster and RDP into the servers, PowerShell's Get-AzureRmVm can't find the ServiceName needed to install the anti-malware via PowerShell. (I can get both of those options to work on standalone VMs.)
I'm thinking I'm missing something really simple, but I'm not seeing it.
Generally this is VM-level configuration, so it is usually managed via a custom VM image that already has things set up, or via a VM extension. There's guidance around setting up antimalware in a cluster here.
# Script to add the Microsoft Antimalware extension to a VM Scale Set (VMSS) and a Service Fabric cluster (which in turn uses a VMSS)
# Login to your Azure Resource Manager Account and select the Subscription to use
Login-AzureRmAccount
# Specify your subscription ID
$subscriptionId = "SUBSCRIPTION ID HERE"
Select-AzureRmSubscription -SubscriptionId $subscriptionId
# Specify location, resource group, and VM scale set for the extension
$location = "LOCATION HERE" # e.g. "West US", "Southeast Asia" or "Central US"
$resourceGroupName = "RESOURCE GROUP NAME HERE"
$vmScaleSetName = "YOUR VM SCALE SET NAME"
# The configuration JSON can be customized as per the MSDN documentation: https://msdn.microsoft.com/en-us/library/dn771716.aspx
$settingString = '{"AntimalwareEnabled": true}'
# Retrieve the most recent version number of the extension
$allVersions = (Get-AzureRmVMExtensionImage -Location $location -PublisherName "Microsoft.Azure.Security" -Type "IaaSAntimalware").Version
$versionString = $allVersions[($allVersions.count)-1].Split(".")[0] + "." + $allVersions[($allVersions.count)-1].Split(".")[1]
$VMSS = Get-AzureRmVmss -ResourceGroupName $resourceGroupName -VMScaleSetName $vmScaleSetName
Add-AzureRmVmssExtension -VirtualMachineScaleSet $VMSS -Name "IaaSAntimalware" -Publisher "Microsoft.Azure.Security" -Type "IaaSAntimalware" -TypeHandlerVersion $versionString -Setting $settingString
Update-AzureRmVmss -ResourceGroupName $resourceGroupName -Name $vmScaleSetName -VirtualMachineScaleSet $VMSS
The Service Fabric team does have guidance on how to configure your environment, which includes the exclusions you'd want to add (see the sketch after this list for passing them to the IaaSAntimalware extension). Those include:
Antivirus Excluded directories
Program Files\Microsoft Service Fabric
FabricDataRoot (from cluster configuration)
FabricLogRoot (from cluster configuration)
Antivirus Excluded processes
Fabric.exe
FabricHost.exe
FabricInstallerService.exe
FabricSetup.exe
FabricDeployer.exe
ImageBuilder.exe
FabricGateway.exe
FabricDCA.exe
FabricFAS.exe
FabricUOS.exe
FabricRM.exe
FileStoreService.exe
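If you want those exclusions applied through the IaaSAntimalware extension, here is a sketch of passing them via the extension settings; the FabricDataRoot/FabricLogRoot paths below are assumptions (commonly D:\SvcFab and D:\SvcFab\Log on Azure), so read the actual values from your cluster configuration:
# Assumed paths; $VMSS and $versionString come from the script above
$antimalwareSettings = @{
    AntimalwareEnabled = $true
    Exclusions = @{
        Paths = "C:\Program Files\Microsoft Service Fabric;D:\SvcFab;D:\SvcFab\Log"
        Processes = "Fabric.exe;FabricHost.exe;FabricInstallerService.exe;FabricSetup.exe;FabricDeployer.exe;ImageBuilder.exe;FabricGateway.exe;FabricDCA.exe;FabricFAS.exe;FabricUOS.exe;FabricRM.exe;FileStoreService.exe"
    }
}
Add-AzureRmVmssExtension -VirtualMachineScaleSet $VMSS -Name "IaaSAntimalware" -Publisher "Microsoft.Azure.Security" `
    -Type "IaaSAntimalware" -TypeHandlerVersion $versionString -Setting $antimalwareSettings
Update-AzureRmVmss -ResourceGroupName $resourceGroupName -Name $vmScaleSetName -VirtualMachineScaleSet $VMSS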
In the Azure Portal I can create an Application, Key and Permissions to the Graph API.
I can get a Token using:
AuthenticationContext ac = new AuthenticationContext("https://login.windows.net/graphDir1.onmicrosoft.com");
ClientCredential cc = new ClientCredential("b3b1fc59-84b8-4400-a715-ea8a7e40f4fe", "FStnXT1QON84B5o38aEmFdlNhEnYtzJ91Gg/JH/Jxiw=");
AuthenticationResult authResult = ac.AcquireToken("https://graph.windows.net", cc);
Using the Azure Active Directory Module for Windows PowerShell I can create a new Symmetric Key.
New-MsolServicePrincipalCredential -AppPrincipalId ??? -Type Symmetric
Using the key returned from this in the code above returns the error:
AdalServiceException: AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided.
This used to work with a previous version of ADAL, using SymmetricKeyCredential instead of ClientCredential, but that class no longer exists.
Is there a way to generate a key from PowerShell that works with the code above?
Please try using Password as the key type:
New-MsolServicePrincipalCredential -AppPrincipalId $appId `
-Type Password `
-StartDate ([DateTime]::Now.AddMinutes(-5)) `
-EndDate ([DateTime]::Now.AddMonths(1)) `
-Value "$newPassword"
Hope this helps