I've tried searching for a solution to this for half a day and decided I need to ask the question myself.
My goal is to remotely control standalone Windows servers via PowerShell on our internal network. Our environment is based on MicroFocus eDirectory instead of MS Active Directory, so our servers are not connected via GPOs.
Since PowerShell has existed for such a long time, and you can manage command-line-only installs with it, I assumed there would be a solution in the form of certificate authentication from client to server, but I've yet to find anything resembling it.
I'm aware of workarounds, including creating private keys that scripts store and use to decrypt credentials, but we do not want to risk leaking such material, and we wish to be able to issue certificates to new clients and servers without having to include any form of credentials in scripts.
Is there no way to simply use certificates to authenticate in place of credentials?
You should take a look at this chapter in the Secrets of PowerShell Remoting book; it describes exactly what you need. https://devopscollective.gitbooks.io/secrets-of-powershell-remoting/content/manuscript/accessing-remote-computers.html
The certificate auth part starts somewhere in the middle; look for "Certificate Authentication".
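In short, the book's approach maps a client certificate to a local account on each server, after which the client authenticates with the certificate's thumbprint instead of a credential. A minimal sketch, assuming a client cert with the Client Authentication EKU is already issued and the server has an HTTPS listener (the subject and thumbprints below are placeholders):

```powershell
# --- On each server: enable certificate auth and map the client cert to a local account ---
Set-Item -Path WSMan:\localhost\Service\Auth\Certificate -Value $true

# -Credential is the local account the certificate should log on as; the mapping
# matches any client cert with this subject that was issued by the given CA.
New-Item -Path WSMan:\localhost\ClientCertificate `
    -Subject 'remoteadmin@yourdomain.local' `
    -URI * `
    -Issuer 'CA-CERT-THUMBPRINT-HERE' `
    -Credential (Get-Credential) `
    -Force

# --- On the client: authenticate with the certificate, no credentials involved ---
Invoke-Command -ComputerName server01 -UseSSL `
    -CertificateThumbprint 'CLIENT-CERT-THUMBPRINT-HERE' `
    -ScriptBlock { hostname }
```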
We have a bunch of Windows server applications that currently handle secrets as follows (our apps are in C#):
We store them in settings files in code
We store them encrypted, using a certificate
The servers have this certificate with the private key, so they can decrypt the secret
We're looking at implementing Hashicorp Vault. It seems easy enough to simply replace the encrypt-store-decrypt with storing the secret in Vault in the KV engine, and just grabbing it in our apps - that takes that certificate out of the picture entirely. Since we're on-prem, I'll need to figure out our auth method.
We have different apps running on different machines, and it's somewhat dynamic (not as much as an autoscaling scenario, but not permanent - so we can't just assign servers to roles one time and depend on Kerberos auth).
I'm unsure how to make AppRole work in our scenario. We don't have one of the example "trusted platforms" or "trusted entities", there's no Nomad, Chef, Terraform, etc. We have Windows machines, in a domain, and we have a homegrown orchestrator that could be queried to say "This machine name runs these apps", so maybe there's something that can be done there?
Am I in "write your own auth plugin" territory, to speak to our homegrown orchestrator?
Edit - someone on Reddit suggested that this is a simple solution if our apps are all 1-to-1 with the Windows domain account they run under, because then we can just use kerb authentication. That's not currently the way we're architected, but we've got to solve this somehow, and that might do it nicely.
2nd edit - replaced "services" with "apps", since most of our services aren't actually running as Windows services, just processes. The launcher is a Windows service but the individual processes it launches are not.
How about Group Managed Service Accounts?
https://learn.microsoft.com/en-us/windows-server/security/group-managed-service-accounts/group-managed-service-accounts-overview
Essentially you create one "trusted platform" (from the perspective of your key vault service).
Your service can still have its own identity but delegate to the gMSA when you want to retrieve the secrets.
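If it helps, here's a rough sketch of the gMSA side, assuming the ActiveDirectory module and an existing KDS root key (all names are illustrative):

```powershell
# Create a gMSA whose password can be retrieved by members of 'VaultClients'
New-ADServiceAccount -Name 'gmsaVault' `
    -DNSHostName 'gmsaVault.corp.example.com' `
    -PrincipalsAllowedToRetrieveManagedPassword 'VaultClients'

# On each machine in the VaultClients group: install and verify the account
Install-ADServiceAccount -Identity 'gmsaVault'
Test-ADServiceAccount -Identity 'gmsaVault'
```

The process that needs the secrets then runs (or calls a helper that runs) under gmsaVault, and Windows rotates the password automatically.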
For future visibility, here's what we landed on:
TLS certificate authentication. Using Vault, we issue a handful of certs, each corresponding to a security policy/profile, so that any machine holding a given certificate can authenticate and retrieve the secrets it should have access to.
Kerberos ended up being a dead-end for two reasons. The vault.exe agent (which is part of this use case) can't use the native Windows Kerberos SSPI, so we'd have to manage and distribute keytab files. Also, if we used machine authentication, it would blow up our client count (we're using the cloud-hosted HCP Vault, where pricing is partially based on client count).
Custom plugins can't be loaded into HCP Vault, of course.
The Azure auth method won't work either: it requires Managed Identities, which you can't assign to on-prem machines. Otherwise it might have been a great fit.
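For anyone landing here later, the cert auth wiring on the Vault side looks roughly like this (run via vault.exe; the role, policy, and file names are illustrative):

```powershell
# Enable the TLS certificate auth method and register one CA/cert per profile
vault auth enable cert
vault write auth/cert/certs/app-profile `
    display_name=app-profile `
    policies=app-profile `
    certificate=@app-profile-ca.pem

# On a machine that holds a matching client certificate:
vault login -method=cert `
    -client-cert=client.pem -client-key=client-key.pem `
    name=app-profile
```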
So I can successfully run commands to manage our Microsoft 365/AzureAD/Exchange Online - this involves assigning and removing licenses, converting a user to a shared mailbox, delegating access to a mailbox, etc. I followed the guide here for authentication. But that's me actually logging in with my credentials + MFA (multi-factor authentication).
I want a script that performs these types of actions on a schedule. I believe I can include the credentials, but how do I handle MFA? I tried to follow this but am getting the error "clientid is not a guid". I have registered an app in https://portal.azure.com/ and am able to make Graph API calls using it, but no luck with PowerShell authentication. Any thoughts? Thanks!
Maybe try this? It should allow you to connect to all Microsoft online services and includes support for MFA. If it does not work, the website has many other scripts you can try.
This is not possible. A potential workaround is to set up rules so that, in specific cases, MFA is not required.
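Another option worth noting, assuming you can use the ExchangeOnlineManagement module and grant your app registration the Office 365 Exchange Online > Exchange.ManageAsApp application permission: app-only certificate authentication, which avoids the MFA prompt entirely. A sketch (the IDs are placeholders):

```powershell
Import-Module ExchangeOnlineManagement

# Certificate-based app-only sign-in: no user, no password, no MFA prompt.
# The certificate's public key must be uploaded to the app registration first.
Connect-ExchangeOnline `
    -AppId '00000000-0000-0000-0000-000000000000' `
    -CertificateThumbprint 'CERT-THUMBPRINT-HERE' `
    -Organization 'contoso.onmicrosoft.com'
```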
We have a PowerShell script to pull Power BI activity data (using Get-PowerBIActivityEvent), and I have been trying to automate it so that it can pull this data daily using an unattended account. The problem is the script must necessarily use the Connect-PowerBIServiceAccount cmdlet, which requires a credential. I don't want to have the passwords hard-coded anywhere (obviously) and ideally don't want to be passing it into the script as a plaintext parameter in case of memory leaks.
I've tried using SSIS as a scheduling mechanism since it allows for encrypted parameters in script tasks, but I can't call the PS script with a SecureString parameter since the System.Management.Automation namespace isn't in the GAC (so a command-line call wouldn't be possible).
I don't believe task scheduler would offer the functionality needed.
Does anyone know of any elegant ways to connect to the Power BI service using encrypted credentials?
In the docs of Connect-PowerBIServiceAccount there are 2 options for unattended sign-in:
Using -Credential, where you pass the AAD application (client) ID as the username and the application secret key as the password
Using -CertificateThumbprint and -ApplicationId
For both options you need to configure a service principal and grant it the proper permissions. I'm not going into details on how to configure that, but most probably you'd need (at least) tenant-level read permissions (e.g. Tenant.Read.All).
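A minimal sketch of both sign-in options, assuming the service principal is already set up (the IDs and thumbprint are placeholders):

```powershell
# Option 1: client ID + application secret as a PSCredential
$secret = ConvertTo-SecureString 'APP-SECRET-HERE' -AsPlainText -Force
$cred   = New-Object System.Management.Automation.PSCredential('APP-CLIENT-ID', $secret)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant 'TENANT-ID'

# Option 2: certificate thumbprint + application ID
Connect-PowerBIServiceAccount -ServicePrincipal `
    -CertificateThumbprint 'CERT-THUMBPRINT-HERE' `
    -ApplicationId 'APP-CLIENT-ID' `
    -Tenant 'TENANT-ID'
```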
I'm not really sure what functionality you need in the script, but in my experience the majority of cases can be covered by a scheduled task, so the explanation below applies to that solution.
How can you secure the credentials?
There are various possible solutions, depending on your preferences. I'd consider certificate-based authentication more secure (the certificate is available only to the current user or to all users of the machine).
What's important in certificate-based authentication: make sure that the certificate is available to the account running the script (in many cases that's a service account, not your user account).
How can I secure it further?
If you want, you can store the application secret as a secure string (I don't have SSIS to test, so I'm not sure if there's any workaround to make it work there) or use Export-CliXml. Both use the Windows Data Protection API (DPAPI), so the file can be decrypted only by the account that encrypted it.
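The DPAPI round trip with Export-CliXml looks roughly like this (the path is illustrative; run the export under the same account the scheduled task will use):

```powershell
# One-time, run as the service account: encrypt the credential to disk
Get-Credential | Export-Clixml -Path 'C:\Secure\pbi-credential.xml'

# In the scheduled script: only that same account on that machine can decrypt it
$cred = Import-Clixml -Path 'C:\Secure\pbi-credential.xml'
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant 'TENANT-ID'
```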
To add one more level of security (I'm not even mentioning setting correct access rights on the files, as that's obvious) you might put the file in an encrypted folder (you might already have a disk-encryption solution, so use it if you wish).
There are probably some solutions to secure the keys even better, but these should do the job. I'm using other Microsoft 365 modules (Outlook, SharePoint PnP) with a similar approach and it works quite well.
NOTE: If you need to use a user account instead of a service principal, make sure that multi-factor authentication is disabled on that account for that specific application.
The relevant documentation (https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal) states that admin APIs (i.e. those behind cmdlets like Get-PowerBIActivityEvent) do not currently support service principals. This means it's not currently possible to use a registered app to run these cmdlets unattended.
There is a feature request open to provide this at the moment: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/39641572-need-service-principle-support-for-admin-api
WMF 5.1 includes new functionality to allow signing of MOF documents and DSC Resource modules (reference). However, this seems very difficult to implement in reality -- or I'm making it more complicated than it is...
My scenario is VMs in Azure, and I'd like to leverage Azure Automation as the DSC pull server; however, I see this applying on-premises too. The problem is that the certificate used to sign the MOF configurations and/or modules needs to be placed on the machine before it fetches and applies the configuration; otherwise the configuration will fail because the certificate isn't trusted or present on the machine.
I tried using Azure KeyVault to bootstrap the certificate (just the public key because that's my understanding of how signing works) and that fails using Add-AzureRmVMSecret because the CertificateUrl parameter expects a full certificate with the public/private key pair to install. In an ideal world, this would be the solution but that's not the case...
Other ideas, again in this context, would be to upload the cert to blob storage, use a CustomScriptExtension to pull down the cert and install into the LocalMachine store but that feels nasty as well because, ideally, that script should be signed as well and that puts us back in the same spot.
I suppose another idea would be to first PUSH a configuration that downloaded and installed certificates only but that doesn't sound great either.
Last option would be to rely on an AD GPO or something similar to potentially push the certificate first...but, honestly, trying to move away from much of that if/when possible...
Am I off-base on this? It seems like this should be a solvable problem -- just looking for at least one "good" way of doing it.
Thanks
David Jones has quite a bit of experience dealing with this issue in an on-premises environment, but as you stated, the same concepts should apply to Azure. Here is a link to his blog. This is a link to his GitHub site with a PKITools module that he created. If all else fails you can reach out to him on Twitter.
While it's quite easy to populate a pre-booted image with public certificates, it's not possible (that I have found) to populate the private key.
DSC would require the private key to decrypt the passwords.
The most common tactic people blog about is to use the unattend file to script the import of a PFX. The issue there is that you have to leave the password for the PFX in plain text. Perhaps that is OK in your environment.
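That import step is typically a one-liner from the PKI module; a sketch (path and password are placeholders, and note the password sits in the script in plain text):

```powershell
# Import the signing/decryption cert into the machine store during first boot.
# The PFX password is unavoidably exposed here - the weakness described above.
$pfxPassword = ConvertTo-SecureString 'PlainTextPfxPassword' -AsPlainText -Force
Import-PfxCertificate -FilePath 'C:\Bootstrap\dsc-signing.pfx' `
    -CertStoreLocation Cert:\LocalMachine\My `
    -Password $pfxPassword
```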
The other option requires a more complicated setup. Use a simple DSC configuration or GPO to auto-enroll a unique certificate. Then have the system, via a first-boot script or a custom DSC resource, tickle an API (built with something like Polaris) that triggers a DSC script which uses PKITools or another script to retrieve the machine's public certificate. Then have that API push a new DSC config (or pull settings) to the machine.
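A sketch of the callback half of that idea; the endpoint and payload are entirely hypothetical stand-ins for whatever your Polaris (or other) API expects:

```powershell
# First-boot script: report this node's auto-enrolled public certificate to the API
# so it can build and publish a DSC configuration encrypted/signed for this machine.
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*$env:COMPUTERNAME*" } |
    Sort-Object NotAfter -Descending | Select-Object -First 1

$body = @{
    NodeName   = $env:COMPUTERNAME
    PublicCert = [Convert]::ToBase64String($cert.Export('Cert'))  # public portion only
} | ConvertTo-Json

Invoke-RestMethod -Uri 'https://dsc-api.corp.example.com/register' `
    -Method Post -Body $body -ContentType 'application/json'
```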
I have been maintaining an installation for a while, but I am not really an expert. Now I've been asked to come up with a solution for this:
Our software is always sold together with a computer, as it has to run in a very controlled environment. The installer needs administrative privileges to be executed. So far we have had two different users, one with administrative rights and one without. Our customer service logs in as Administrator, installs the software, and restarts the machine so that the user can log in as a normal user.
Now we want the user to be able to install the software themselves, but we don't want them to have administrator access because they could modify things that shouldn't be modified.
So, is there any way to programmatically raise the user's privileges during the installation and lower them back afterwards? The installer is made using InstallShield, but we use VBScript to check some prerequisites.
Check out CPAU. It allows you to create an encrypted command that will run the installation as administrator.
EDIT: This is a more comprehensive list of similar tools.
If you are looking for a toolkit to do this kind of thing, Microsoft's MSI technology has this built in: administrator access is required to install the initial MSI file, while additional patches (MSPs, I think) are digitally signed by the original MSI and are thus deemed safe, so users can install them without administrator elevation.
You can do the same thing: as part of your administrative install, install a service. The service can create a named pipe (that you explicitly grant user ACLs on), or even just a socket, or monitor a drop-off folder, allowing the user-level code to communicate with the service code (running as SYSTEM or a configured account). The service can then use its SERVICE or configured account permissions to either impersonate an administrator or do other tasks on behalf of the user without EVER giving the user any kind of elevated permission, even temporarily.
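A minimal sketch of that pipe handshake under Windows PowerShell 5.1 (.NET Framework); the pipe name and the approved action are made up for illustration:

```powershell
# Service side (runs as SYSTEM): listen on a pipe that ordinary users may write to
$security = New-Object System.IO.Pipes.PipeSecurity
$security.AddAccessRule((New-Object System.IO.Pipes.PipeAccessRule(
    'BUILTIN\Users', 'ReadWrite', 'Allow')))
$pipe = New-Object System.IO.Pipes.NamedPipeServerStream(
    'InstallerPipe', 'InOut', 1, 'Byte', 'None', 0, 0, $security)
$pipe.WaitForConnection()

# Read the user's request and act only on pre-approved, validated commands
$reader  = New-Object System.IO.StreamReader($pipe)
$request = $reader.ReadLine()
if ($request -eq 'install-approved-package') {
    # Executes with the service's privileges; the user never elevates
    Start-Process msiexec.exe -ArgumentList '/i C:\Packages\app.msi /qn' -Wait
}
$pipe.Dispose()
```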