I created a new secure Service Fabric cluster on Azure, with the cluster and admin client certificates stored in Azure Key Vault. I installed the admin client certificate into both the Current User and Local Machine stores, but whenever I try to connect to the cluster, or explore it in the browser, I get an access denied error. Connecting from Visual Studio fails as well. These are the connection parameters in Visual Studio:
<ClusterConnectionParameters ConnectionEndpoint="my.end.point.com:19000"
X509Credential="true"
ServerCertThumbprint="ClusterCertificateThumbPrint"
FindType="FindByThumbprint"
FindValue="AdminClientCertificateThumbPrint"
StoreLocation="CurrentUser"
StoreName="My" />
What I am doing wrong?
I experienced something similar; my issue was that I had the wrong ServerCertThumbprint. I created my Service Fabric cluster as part of the Visual Studio publish step, and in that case the configuration looked like this:
<ClusterConnectionParameters ConnectionEndpoint="myservicefabricname:19000"
X509Credential="true"
ServerCertThumbprint="certificateThumbprint"
FindType="FindByThumbprint"
FindValue="certificateThumbprint"
StoreLocation="LocalMachine"
StoreName="My" />
In my case the certificate installed locally and the one securing the Service Fabric cluster were the same certificate, so both values use the same thumbprint.
Additionally, it seems that even though I added the ClusterConnectionParameters in the XML config, when I went to "Publish" and expanded "Advanced Parameters" I had to enter the values manually.
In case you don't know how to find the thumbprint you can follow this tutorial: https://learn.microsoft.com/en-us/dotnet/framework/wcf/feature-details/how-to-retrieve-the-thumbprint-of-a-certificate
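If it helps to know what a thumbprint actually is: it's the SHA-1 hash of the DER-encoded certificate, conventionally shown as uppercase hex. A minimal sketch of computing it yourself (the helper names here are illustrative, not from the linked tutorial):

```python
import base64
import hashlib

def thumbprint_from_der(der_bytes: bytes) -> str:
    """A certificate thumbprint is the SHA-1 hash of the DER-encoded
    certificate, conventionally displayed as uppercase hex."""
    return hashlib.sha1(der_bytes).hexdigest().upper()

def thumbprint_from_pem(pem_text: str) -> str:
    """Strip the PEM armor lines, base64-decode the body back to DER,
    then hash it the same way."""
    body = "".join(
        line for line in pem_text.splitlines()
        if "-----" not in line
    )
    return thumbprint_from_der(base64.b64decode(body))
```

On Windows, `Get-ChildItem Cert:\CurrentUser\My` in PowerShell lists the same thumbprints for installed certificates, which is usually quicker than hashing the file by hand.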
I'm facing issues when I deploy my Service Fabric app to the cluster. The deployment completed with the error below:
Error event: SourceId='System.Hosting', Property='Activation:1.0'.
There was an error during activation.Failed to configure certificate
permissions. Error FABRIC_E_CERTIFICATE_NOT_FOUND.
This certificate is part of the config package. However, when I log in to the VM, I don't see the package deployed with the Config folder. I checked the package built locally; it looks correct, with a Config folder and the certificate. But when the package is deployed to the cluster, the Config package is missing on the VM, and hence service activation fails with the certificate-not-found error.
All the details can be found in the issue I have logged on GitHub.
Snippet from ApplicationManifest which refers to certificate:
<ContainerHostPolicies CodePackageRef="Code">
<CertificateRef Name="SecretsCert" DataPackageRef="Config" DataPackageVersion="Version" RelativePath="PFX PATH INSIDE CONFIG" IsPasswordEncrypted="true" Password="NOTMYPASSWORD"/>
<RepositoryCredentials AccountName="Container Registry Name" Password="[Registry Key]" PasswordEncrypted="true" />
</ContainerHostPolicies>
I need some help deploying a Service Fabric app from Team Services to Azure.
I'm getting the following error from the agent in Team Services:
2018-06-22T13:17:13.3007613Z ##[error] An error occurred attempting to
import the certificate. Ensure that your service endpoint is
configured properly with a correct certificate value and, if the
certificate is password-protected, a valid password.
Error message: Exception calling "Import" with "3" argument(s):
"Cannot find the requested object.
Please advise.
Here is my Service Fabric Security page. I don't remember where I set up the password needed on the VSTS side, but I took note of it and believe it's correct.
Here is the Endpoint page on the VSTS side:
The issue was resolved with the help of MS Support by creating a new certificate in the Key Vault and adding it to the Service Fabric cluster. Steps:
Azure Portal:
Home > Key vaults > YourKeyVault - Certificates: Generate/Import
Generate a new certificate with a CertificateName of your choosing and CN=CertificateName as the Subject.
Home > Key vaults > YourKeyVault - Certificates > CertificateName
Select the only version available and download it in PFX/PEM format.
PowerShell: convert the certificate to a Base64 string (CertificateBase64):
[System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes("c:\YourCertificate.pfx"))
Home > YourServicefabric - Security: Add
Add the certificate you created as an Admin Client by providing its thumbprint.
VSTS/TFS:
Build and release > Your pipeline: Edit
In the Deployment Process Service Fabric Environment, click Manage for the Cluster Connection and add a new connection. Along with the other information, paste the previous CertificateBase64 into the Client Certificate field.
Check the Service Endpoint in VSTS:
Whether it has a properly base64 encoded certificate, with a private key.
Also, check if the provided passphrase is correct.
Also, check if the service endpoint is configured as tcp://mycluster.region.cloudapp.azure.com:19000.
Check if the thumbprint is correct.
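To sanity-check the first item, the value pasted into the service endpoint should decode as base64, and a DER-encoded PFX/PKCS#12 file always begins with the ASN.1 SEQUENCE tag (0x30). A rough check (the helper name is illustrative; this validates the encoding only, not the password or the private key):

```python
import base64

def looks_like_base64_pfx(value: str) -> bool:
    """Return True if the string is valid base64 and the decoded bytes
    start with 0x30, the ASN.1 SEQUENCE tag that opens a DER-encoded
    PFX/PKCS#12 container."""
    try:
        raw = base64.b64decode(value, validate=True)
    except ValueError:  # binascii.Error is a ValueError subclass
        return False
    return len(raw) > 0 and raw[0] == 0x30
```

A True result only means the string is plausibly a PFX; a wrong passphrase or a certificate exported without its private key will still fail on the VSTS side.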
My cluster uses Windows Authentication for client-to-endpoint security, and it works as expected in the browser: connecting to the cluster prompts for a username/password before opening the SF console.
I'm confused as to what I'm supposed to put into the publish profile in Visual Studio. There is no option to choose Windows Authentication anywhere, only certificate and Azure Active Directory. How is this supposed to work?
Using WindowsCredential="True" in the publish profile should work. Here is a PowerShell version that has worked for me in the past:
Connect-ServiceFabricCluster -ConnectionEndpoint:':19000' -WindowsCredential:$true -TimeoutSec:60
I am stumped on this one. I have a secure cluster with some encrypted application settings. The app runs fine on my local cluster, but not when deployed to the cloud.
The application deploys ok, but fails to start up with the following error: Failed to ACL folders or certificates required by application. Error:FABRIC_E_CERTIFICATE_NOT_FOUND.
I created a self-signed cert, exported it (with the private key) to a PFX, and uploaded it to the vault:
New-SelfSignedCertificate -Type DocumentEncryptionCert -KeyUsage DataEncipherment -Subject mycert -Provider 'Microsoft Enhanced Cryptographic Provider v1.0'
Invoke-AddCertToKeyVault -SubscriptionId 'xxxxx-bxxxxfb9-xxxx-xxx-xxxxx' -ResourceGroupName 'vault-sec-studio-dev' -Location "Central US" -VaultName 'vault-sec-studio-dev' -CertificateName 'mycert' -Password "myPass" -UseExistingCertificate -ExistingPfxFilePath "C:\temp\Azure\Dev\mycert.pfx"
I add the certificate to the cluster security tab by referencing the thumbprint:
I update the ApplicationManifest.xml:
<Principals>
<Users>
<User Name="Service1" AccountType="NetworkService" />
</Users>
</Principals>
<Policies>
<SecurityAccessPolicies>
<SecurityAccessPolicy GrantRights="Read" PrincipalRef="Service1" ResourceRef="mycert" ResourceType="Certificate"/>
</SecurityAccessPolicies>
</Policies>
<Certificates>
<SecretsCertificate X509FindValue="72C57495F3034E072CA6F536EEABE984AA869CBC" X509StoreName="My" X509FindType="FindByThumbprint" Name="mycert" />
</Certificates>
The Service Fabric Explorer page shows the upgrade was installed but failed to start. Several nodes are in error status:
When remoting into the VMs, I don't see this cert installed. I see the main cert used to secure the cluster, but not this admin cert.
I have tried to manually install the cert on each VM, but getting the same result.
I have spent a ton of time on this, and can't seem to get anywhere, so I'm hopeful someone can give me some pointers here.
Update:
I'm seeing this in the event log on the VM. It mentions the private key, which makes me think there is something wrong with the cert or the PFX:
Failed to get the Certificate's private key. Thumbprint:72C57495F3034E072CA6F536EEABE984AA869CBC. Error: FABRIC_E_CERTIFICATE_NOT_FOUND
This issue is now finally resolved. I can't say I completely understand it, but here is what I found:
Don't create the Service Fabric Cluster using the Portal. You'll need to use the template so you have access to configure the certs.
Also, there's no need to mess around with admin certs on the Security tab as I originally did (see original question). Those don't work for this, or at least not the way I'd expect them to.
You must edit the ARM template and add the following certificate information to the "secrets" array on each VM:
The added parameter points to the certificate's URL in Key Vault:
"settingCertificateUrlValue": {
"value": "https://vault-my-site.vault.azure.net:443/secrets/studiosecrets/487a94749ee148979cc97a68abe9fd3a"
},
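For context, that parameter is typically consumed in the scale set's osProfile. A sketch of the shape, based on the standard Service Fabric ARM template (the `sourceVaultValue` parameter name is an assumption to match the snippet above; your template may differ):

```json
"osProfile": {
  "secrets": [
    {
      "sourceVault": {
        "id": "[parameters('sourceVaultValue')]"
      },
      "vaultCertificates": [
        {
          "certificateUrl": "[parameters('settingCertificateUrlValue')]",
          "certificateStore": "My"
        }
      ]
    }
  ]
}
```

With this in place, the VM agent installs the certificate (with its private key) into the LocalMachine\My store on every node, which is what lets Service Fabric ACL it for the application.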
The cluster is now "green" and the app runs fine.
Problem
I am trying to deploy a worker role which will autoscale a few target sites. I can run the autoscaler locally and it works (I installed the certificates on my machine), but it won't autoscale once I deploy it to Azure as a cloud app. (The worker role itself is running, though, because I can see my non-autoscaling processes working in the same worker role.)
What I tried
I have followed the Deploying the Autoscaling Application Block instructions.
Added the "CN=Windows Azure Tools" certificate to the management certificates of the target subscription.
Added the "CN=Windows Azure Tools" certificate to the autoscaling application's certificates.
Specified the location of my cert in the worker role
Specified the location of the cert in my service store for configuring autoscaling
What am I missing?
Thanks
Tuzo is right: the cert should be in LocalMachine, but that's not enough. See this SO post. Basically, in OS Family 2, WaWorkerHost runs under a temporary account (with a GUID name) generated by the role initialization process, and that account has permission to access the certificate's private key. In OS Family 3, WaWorkerHost runs under the NETWORK SERVICE account, which does not have permission to access the private key.
The best option for now (the MS Azure team is addressing the issue in the next SDK) is to run the role with elevated privileges. Edit ServiceDefinition.csdef:
<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="blah" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2012-10.1.8">
<WorkerRole name="blah" vmsize="Small">
<Runtime executionContext="elevated" />
...
</WorkerRole>
</ServiceDefinition>
For running in Azure, I would try setting the store location to LocalMachine.
If you've followed all the steps in Deploying the Autoscaling Application Block, then the certificate with the private key (.pfx) should be deployed in the role. You can RDP into the server to verify that the certificate is installed (and its location).
You can also try enabling logging as per Autoscaling Application Block Logging to see if there are any messages.