We have a PowerShell script that pulls Power BI activity data (using Get-PowerBIActivityEvent), and I have been trying to automate it so that it pulls this data daily under an unattended account. The problem is that the script must use the Connect-PowerBIServiceAccount cmdlet, which requires a credential. I don't want the password hard-coded anywhere (obviously) and ideally don't want to pass it into the script as a plaintext parameter, in case of memory leaks.
I've tried using SSIS as a scheduling mechanism, since it allows for encrypted parameters in script tasks, but I can't call the PS script with a SecureString parameter because the System.Management.Automation namespace isn't in the GAC (and a command-line call wouldn't be possible).
I don't believe Task Scheduler offers the functionality needed.
Does anyone know of any elegant ways to connect to the Power BI service using encrypted credentials?
In the docs for Connect-PowerBIServiceAccount there are two options for unattended sign-in:
Using -Credential, where you pass the AAD client ID as the username and the application secret key as the password
Using -CertificateThumbprint and -ApplicationId
For both options you need to configure a service principal and grant it the proper permissions. I'm not going into the details of how to configure that, but you'll most probably need (at least) the application permissions appropriate to the APIs your script calls. Both options are sketched below.
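A minimal sketch of both options, assuming the service principal is already configured (the IDs, tenant and thumbprint are placeholders):

    # Option 1: AAD client ID + application secret as a PSCredential
    $secret = Read-Host -AsSecureString -Prompt 'Application secret'
    $credential = New-Object System.Management.Automation.PSCredential('<application-id>', $secret)
    Connect-PowerBIServiceAccount -ServicePrincipal -Credential $credential -Tenant '<tenant-id>'

    # Option 2: certificate installed in the current user's (or machine's) certificate store
    Connect-PowerBIServiceAccount -ServicePrincipal -CertificateThumbprint '<thumbprint>' -ApplicationId '<application-id>' -Tenant '<tenant-id>'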
I'm not really sure what functionality you need in the script, but in my experience the majority of cases can be covered by a scheduled task, so the explanation below applies to that solution.
How can you secure the credentials?
There are various possible solutions, depending on your preferences. I'd consider certificate-based authentication the more secure option (the certificate is available only to the current user or to all users of the machine).
What's important with certificate-based authentication: make sure the certificate is available to the account running the script (in many cases that's a service account, not your user account).
How can I secure it further?
If you want, you can store the application ID as a secure string (I don't have SSIS to test with, so I'm not sure whether there's any workaround to make it work there) or use Export-CliXml. Both use the Windows Data Protection API (DPAPI), so the file can be decrypted only by the account that encrypted it.
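For example, a minimal Export-CliXml round trip might look like this (the path is a placeholder; the export must be run once by the same account that later runs the scheduled task):

    # One-time, interactively, as the account that runs the scheduled task:
    Get-Credential | Export-Clixml -Path 'C:\secure\pbi-credential.xml'

    # In the scheduled script (only that same account can decrypt the file):
    $credential = Import-Clixml -Path 'C:\secure\pbi-credential.xml'
    Connect-PowerBIServiceAccount -ServicePrincipal -Credential $credential -Tenant '<tenant-id>'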
To add one more level of security (I'm not even mentioning setting correct access rights on the files, as that's obvious), you might put the file in an encrypted folder (you may already have a disk-encryption solution in place, so use it if you wish).
There are probably solutions that secure the keys even better, but these should do the job. I'm using other Microsoft 365 modules (Outlook, SharePoint PnP) with a similar approach and it works quite well.
NOTE: If you need to use a user account instead of a service principal, make sure that multi-factor authentication is disabled on that account for that specific application.
The relevant documentation (https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal) states that admin APIs (e.g. the ones Get-PowerBIActivityEvent calls) do not currently support service principals. This means it's not currently possible to use a registered app to run these cmdlets unattended.
There is currently an open feature request for this: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/39641572-need-service-principle-support-for-admin-api
Related
We have a bunch of Windows server applications (written in C#) that currently handle secrets as follows:
We store them in settings files in code
We store them encrypted, using a certificate
The servers have this certificate with the private key, so they can decrypt the secret
We're looking at implementing Hashicorp Vault. It seems easy enough to simply replace the encrypt-store-decrypt with storing the secret in Vault in the KV engine, and just grabbing it in our apps - that takes that certificate out of the picture entirely. Since we're on-prem, I'll need to figure out our auth method.
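For reference, grabbing a secret from the KV v2 engine over Vault's HTTP API is a small step once a token is in hand; a rough sketch (the address, secret path and field name are made up):

    $vaultAddr  = 'https://vault.example.com:8200'      # placeholder Vault address
    $headers    = @{ 'X-Vault-Token' = $env:VAULT_TOKEN }
    $response   = Invoke-RestMethod -Uri "$vaultAddr/v1/secret/data/myapp/db" -Headers $headers
    $dbPassword = $response.data.data.password          # KV v2 nests values under data.data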
We have different apps running on different machines, and it's somewhat dynamic (not as much as an autoscaling scenario, but not permanent - so we can't just assign servers to roles one time and depend on Kerberos auth).
I'm unsure how to make AppRole work in our scenario. We don't have one of the example "trusted platforms" or "trusted entities"; there's no Nomad, Chef, Terraform, etc. We have Windows machines, in a domain, and we have a homegrown orchestrator that could be queried to say "This machine name runs these apps", so maybe there's something that can be done there?
Am I in "write your own auth plugin" territory, to speak to our homegrown orchestrator?
Edit - someone on Reddit suggested that this has a simple solution if our apps are all 1-to-1 with the Windows domain accounts they run under, because then we can just use Kerberos authentication. That's not currently the way we're architected, but we've got to solve this somehow, and that might do it nicely.
2nd edit - replaced "services" with "apps", since most of our services aren't actually running as Windows services, just processes. The launcher is a Windows service but the individual processes it launches are not.
How about Group Managed Service Accounts (gMSA)?
https://learn.microsoft.com/en-us/windows-server/security/group-managed-service-accounts/group-managed-service-accounts-overview
Essentially you create one "trusted platform" (trusted by your key vault service).
Your service can still have its own identity but delegate to the gMSA when you want to retrieve the secrets.
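A rough sketch of the gMSA setup, assuming the ActiveDirectory module and an existing KDS root key (all names are placeholders):

    # Create the gMSA; membership of the VaultClients group controls which machines may use it
    New-ADServiceAccount -Name 'svcVault' -DNSHostName 'svcVault.example.local' -PrincipalsAllowedToRetrieveManagedPassword 'VaultClients'

    # On each member server:
    Install-ADServiceAccount -Identity 'svcVault'
    Test-ADServiceAccount -Identity 'svcVault'    # returns True once the machine can use the account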
For future visibility, here's what we landed on:
TLS certificate authentication. Using Vault, we issue a handful of certs; each corresponds to a security policy/profile, so any machine holding a given certificate can authenticate and retrieve the secrets it should have access to.
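For anyone curious, a client-side login against Vault's cert auth method looks roughly like this (the address and certificate subject are placeholders):

    # Pick up the client cert (with private key) from the machine store
    $cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -eq 'CN=policy-a-client' }

    # POST to the cert auth endpoint, presenting the cert for mutual TLS
    $login = Invoke-RestMethod -Method Post -Uri 'https://vault.example.com:8200/v1/auth/cert/login' -Certificate $cert
    $vaultToken = $login.auth.client_token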
Kerberos ended up being a dead end for two reasons. The vault.exe agent (which is part of this use case) can't use the native Windows Kerberos SSPI, so we'd have to manage and distribute keytab files. Also, if we used machine authentication, it would blow up our client count (we're using the cloud-hosted HCP Vault, where pricing is partially based on client count).
Custom plugins can't be loaded into HCP Vault, of course.
Azure auth won't work either: it requires Managed Identities, which you can't assign to on-prem machines. Otherwise this might have been a great fit.
So I can successfully run commands to manage our Microsoft 365/Azure AD/Exchange Online - this involves assigning and removing licenses, converting a user to a shared mailbox, delegating access to a mailbox, etc. I followed the guide here for authentication, but that's me actually logging in with my credentials plus MFA (multi-factor authentication).
I want a script that performs these types of actions on a schedule. I believe I can include the credentials, but how do I handle MFA? I tried to follow this but am getting the error "clientid is not a guid". I have registered an app in https://portal.azure.com/ and am able to make Graph API calls with it, but no luck with PowerShell authentication. Any thoughts? Thanks!
Maybe try this? It should allow you to connect to all Microsoft online services and includes support for MFA. If it does not work, the website has many other scripts you can try.
This is not possible. A potential workaround is to set up rules so that, in specific cases, MFA is not required.
I've tried searching for a solution to this for half a day and decided I need to ask the question myself.
My goal is to remotely control standalone Windows servers via PowerShell on our internal network. Our environment is based on Micro Focus eDirectory instead of Microsoft Active Directory, so our servers are not connected via GPOs.
Since PowerShell has existed for such a long time, and you can manage command-line-only installs with it, I assumed there would be a solution in the form of certificate authentication from client to server, but I've yet to find anything resembling this.
I'm aware of workarounds, including creating private keys to store and decrypt credentials in scripts, but we do not want to risk leaking such information, and we want to be able to issue certificates to new clients and servers without having to include any form of credentials in scripts.
Is there no way to simply use certificates to authenticate in place of credentials?
You should take a look at this chapter in the PowerShell remoting book; it describes exactly what you need: https://devopscollective.gitbooks.io/secrets-of-powershell-remoting/content/manuscript/accessing-remote-computers.html
The certificate-authentication part starts somewhere in the middle; look for "Certificate Authentication".
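For reference, the gist of what that chapter walks through, assuming a working HTTPS (WinRM) listener; the names and thumbprints below are placeholders:

    # On the server: allow certificate authentication for the WinRM service
    Set-Item -Path WSMan:\localhost\Service\Auth\Certificate -Value $true

    # Map a client certificate (by its subject UPN and issuing CA) to a local account
    $mapping = @{
        Path       = 'WSMan:\localhost\ClientCertificate'
        Subject    = 'remoteuser@example.local'
        URI        = '*'
        Issuer     = '<issuing-CA-thumbprint>'
        Credential = (Get-Credential)    # the local account the certificate maps to
    }
    New-Item @mapping

    # On the client: authenticate with the certificate instead of a credential
    Invoke-Command -ComputerName server01 -UseSSL -CertificateThumbprint '<client-cert-thumbprint>' -ScriptBlock { hostname }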
To authenticate to Azure and use the Azure Resource Manager cmdlets, I currently use the methods outlined here: using an Azure Active Directory account, encrypting the password, storing the encrypted string in a text file, and reading it into a credential object when using it in the script.
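For context, that pattern looks roughly like this (the path and account are placeholders):

    # One-time: encrypt the password with DPAPI (current user + machine) and store it
    Read-Host -AsSecureString -Prompt 'Password' | ConvertFrom-SecureString | Set-Content 'C:\creds\azure.txt'

    # In the script: rebuild the credential and sign in
    $secure = Get-Content 'C:\creds\azure.txt' | ConvertTo-SecureString
    $cred = New-Object System.Management.Automation.PSCredential('automation@contoso.onmicrosoft.com', $secure)
    Login-AzureRmAccount -Credential $cred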
But I get the sense that maybe I should be using management certificates instead.
There is a documented method to use a publish settings file, but apparently that doesn't work for AzureRm cmdlets, only older cmdlets.
I have seen examples of using PowerShell to create an application ID and service principal, and of authenticating a C# app, for instance, but I can't seem to find anything showing how to use management certificates for authentication in a PowerShell script with the AzureRm cmdlets.
Maybe the secure string password storage method is the right one. But I don't have a good sense for that.
What do you use?
The best way to do it? That depends on what is important to you: ease of use, security, scripting?
Microsoft's Azure tooling (Visual Studio, Azure PowerShell and the CLI) has lately moved to a more fine-grained, role-based access control approach based on Azure AD. This is currently a pretty good way to do it, since management certificates grant full management rights at the subscription level and have proven rather difficult to handle in environments like Azure Automation.
Refs
https://azure.microsoft.com/de-de/blog/azure-automation-authenticating-to-azure-using-azure-active-directory/
https://azure.microsoft.com/en-us/documentation/articles/cloud-services-certs-create/#what-are-management-certificates
http://blogs.msdn.com/b/cloud_solution_architect/archive/2015/03/17/rbac-and-the-azure-resource-manager.aspx
https://azure.microsoft.com/en-us/documentation/articles/role-based-access-control-configure/
https://azure.microsoft.com/en-us/documentation/articles/resource-group-rbac/#concepts
You should have a look at service principals:
https://azure.microsoft.com/en-us/documentation/articles/resource-group-create-service-principal-portal/
https://azure.microsoft.com/en-us/documentation/articles/resource-group-authenticate-service-principal/
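A minimal sketch of a service-principal sign-in with the AzureRm cmdlets (the IDs are placeholders):

    # The application (client) ID acts as the username, the key as the password
    $secret = Read-Host -AsSecureString -Prompt 'Application key'
    $cred = New-Object System.Management.Automation.PSCredential('<application-id>', $secret)
    Login-AzureRmAccount -ServicePrincipal -Credential $cred -TenantId '<tenant-id>'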
I'm trying to run PowerShell scripts in my TeamCity build steps.
The scripts use WebClient to connect to TeamCity's REST API; currently, I have to log in to TeamCity and hardcode a username and password as arguments in my PowerShell build step.
I'm wondering if anyone knows a way to pass the credentials I currently use to authenticate to TeamCity into my PowerShell scripts without hardcoding any passwords.
If you only need read access to the REST API (i.e. you don't want to do POST/PUT/DELETE, only GET), then use the TeamCity-generated username and password.
This username/password pair is generated for each build and is valid only while that build runs. This is how you can access it in your PowerShell script:
read the $env:TEAMCITY_BUILD_PROPERTIES_FILE environment variable, which holds the full path to the build properties file generated for (and valid during) this build
this file is a simple key=value Java properties file. Parse out the values of the teamcity.auth.userId and teamcity.auth.password properties. Or better yet, always parse all the props in your script's init phase and put them into a hashtable, as in the sketch below.
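A sketch of that init-phase parsing (ignoring the rarer escaping corner cases of Java properties files):

    # Load the per-build properties into a hashtable
    $props = @{}
    Get-Content $env:TEAMCITY_BUILD_PROPERTIES_FILE | ForEach-Object {
        if ($_ -match '^\s*([^#=]+?)\s*=\s*(.*)$') {
            $props[$Matches[1]] = $Matches[2] -replace '\\(.)', '$1'   # unescape \: \= etc.
        }
    }
    $buildUser = $props['teamcity.auth.userId']
    $buildPass = $props['teamcity.auth.password']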
If you need write access to the REST API, you can't use this uid/pwd pair. For that I use the Keychain on macOS and a KeePass database on Windows. KeePass has a nice .NET API that you can access from PowerShell: create a new KeePass DB, make it unlockable with a key file rather than a password, make sure the user running the build agent (and no one else) has access to that key, then use the KeePass API to unlock the DB and read out your TeamCity admin account and password that can do POST/PUT/DELETE in the REST API.
Thanks for the answer, but we wound up providing the username and password as build parameters.
TeamCity's built-in password protection helped us out here.
This way, we're using one account to run our PowerShell scripts, but we can still see who kicked off the build from the credentials they used to log in to the web UI.
So we maintained traceable responsibility and stopped the constant entering of usernames and passwords.
More info: confluence.jetbrains.com/display/TCD7/Typed+Parameters
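For anyone replicating this: a password-typed parameter can be fed to the build step and used for the REST calls, along these lines (the parameter names are made up):

    # The build step passes TeamCity parameters in, e.g.:
    #   powershell.exe -File rest.ps1 -UserName %rest.user% -Password %rest.password%
    param([string]$UserName, [string]$Password)

    # Build a basic-auth header and call the REST API under /httpAuth
    $pair    = '{0}:{1}' -f $UserName, $Password
    $basic   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
    $headers = @{ Authorization = "Basic $basic" }
    Invoke-RestMethod -Uri 'https://teamcity.example.com/httpAuth/app/rest/builds' -Headers $headers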