I'm trying to run PowerShell scripts in my TeamCity build steps.
The scripts use WebClient to connect to TeamCity's REST API; currently, I have to hardcode the username and password I use to log in to TeamCity as arguments in my PowerShell build step.
Does anyone know a way to pass the credentials I'm already authenticated with in TeamCity to my PowerShell scripts, without hardcoding any passwords?
If you only need read access to the REST API (i.e. GET, not POST/PUT/DELETE), use the TeamCity-generated username and password.
This username/password pair is generated for each build and is valid only while that build runs. Here is how you can access them in your PowerShell script:
Read the $env:TEAMCITY_BUILD_PROPERTIES_FILE environment variable, which holds the full path to the build properties file generated for (and valid only during) this build.
This file is a simple key=value Java properties file. Parse out the values of the teamcity.auth.userId and teamcity.auth.password properties. Better yet, parse all the properties in your script's init phase and put them into a hashtable.
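The parsing step above can be sketched roughly like this (a minimal sketch; the helper name is my own, and the escaping handled is only the common backslash case):

```powershell
# Minimal sketch: parse a Java-style key=value properties file into a hashtable.
function ConvertFrom-JavaProperties {
    param([string]$Path)
    $props = @{}
    foreach ($line in Get-Content $Path) {
        # Skip comments and lines without a key=value pair
        if ($line -match '^\s*[#!]' -or $line -notmatch '=') { continue }
        $key, $value = $line -split '=', 2
        # Java properties files escape '=' and ':' with backslashes
        $props[$key.Trim()] = $value.Trim() -replace '\\(.)', '$1'
    }
    return $props
}

# In a TeamCity build step you would then call:
# $props = ConvertFrom-JavaProperties $env:TEAMCITY_BUILD_PROPERTIES_FILE
# $user  = $props['teamcity.auth.userId']
# $pass  = $props['teamcity.auth.password']
```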
If you need write access to the REST API, you can't use this uid/pwd pair. For that I use a keychain on OS X and a KeePass database on Windows. KeePass has a nice .NET API that you can call from PowerShell. Create a new KeePass database, make it unlockable with a key file rather than a password, make sure the user running the build agent (and no one else) has access to that key, then use the KeePass API to unlock the database and read out the TeamCity admin account that can do POST/PUT/DELETE against the REST API.
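The KeePass part might look roughly like this. This is a hedged sketch, not a tested recipe: it assumes KeePass 2.x is installed at the default path (KeePassLib ships inside KeePass.exe), and the file paths and the entry title 'teamcity-admin' are placeholders:

```powershell
# Hedged sketch: open a key-file-protected KeePass database via KeePassLib.
Add-Type -Path 'C:\Program Files\KeePass2\KeePass.exe'

$db  = New-Object KeePassLib.PwDatabase
$key = New-Object KeePassLib.Keys.CompositeKey
$key.AddUserKey((New-Object KeePassLib.Keys.KcpKeyFile('C:\secure\teamcity.key')))

$io = New-Object KeePassLib.Serialization.IOConnectionInfo
$io.Path = 'C:\secure\teamcity.kdbx'
$db.Open($io, $key, $null)
try {
    # 'teamcity-admin' is a hypothetical entry title
    $entry = $db.RootGroup.GetEntries($true) |
        Where-Object { $_.Strings.ReadSafe('Title') -eq 'teamcity-admin' }
    $user = $entry.Strings.ReadSafe('UserName')
    $pass = $entry.Strings.ReadSafe('Password')
} finally {
    $db.Close()
}
```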
Thanks for the answer, but we wound up providing the username and password as build parameters.
TeamCity's built-in password protection helped us out here.
This way, we're using one account to run our PowerShell scripts, but we can still see who kicked off the build from the credentials they used to log in to the web UI.
So we maintained traceable responsibility and stopped constantly entering usernames and passwords.
More info: confluence.jetbrains.com/display/TCD7/Typed+Parameters
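To illustrate the build-parameter approach: assuming parameters named rest.user and rest.password (hypothetical names; the latter typed as "Password" so the UI masks it), a PowerShell build step could look like this. TeamCity substitutes the %...% references before the script runs, so nothing is hardcoded in the script itself:

```powershell
# Sketch of a PowerShell build step using typed build parameters.
# %rest.user% / %rest.password% are assumed parameter names;
# %teamcity.serverUrl% is a predefined TeamCity parameter.
$user = '%rest.user%'
$pass = '%rest.password%'

$wc = New-Object System.Net.WebClient
$wc.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
$xml = $wc.DownloadString('%teamcity.serverUrl%/httpAuth/app/rest/builds')
```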
I need to migrate a PowerShell script to version 7 that was used to create an Azure AD application with a custom password using the New-AzADApplication
cmdlet. However, with PowerShell version 7, New-AzADApplication no longer supports custom passwords. I've tried retrieving the automatically generated password from the output of this cmdlet, but its credentials field is always empty. Is there a way to retrieve the password with any cmdlet in the Az.Resources module?
After creating the app registration, you cannot retrieve the value of the created client secret in any way.
You can refer to the documentation on secretText for more information.
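What you can do instead is capture the secret at the moment it is created. A hedged sketch with Az.Resources (Az 7+); the display name is a placeholder, and the exact output shape should be checked against your module version:

```powershell
# The secret value is only returned at creation time; save it immediately,
# because it cannot be read back later.
$app    = New-AzADApplication -DisplayName 'my-app'   # no custom password in Az 7
$cred   = New-AzADAppCredential -ObjectId $app.Id -EndDate (Get-Date).AddYears(1)
$secret = $cred.SecretText   # store this now (e.g. in a key vault)
```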
We are trying to connect to Microsoft SQL Server installed on an Azure VM (IaaS) from DataStage using an API.
Currently, we use the JDBC connector to connect to SQL Server (IaaS) with a service account and its password. On the new server, however, Azure forces us to reset the password every three months, and the same service account is used by other applications.
Each reset requires a change request to reflect the new password in the DataStage PROD environment. We are also getting a separate service account to use in DataStage.
To avoid password reset and lockout issues, we plan to use an API to fetch the password when connecting to the DB.
An API-based DB connection works in Alteryx. Is this possible in DataStage 11.7.1.2, and if so, how? If an API connection is not possible, is there another feasible solution?
I assume you know how to fetch the password via command line interface from your cloud service.
Store the password as datastage environment variable which is then used in the job.
Use a shell script to update the password. In the script, check first if the password has changed. If it did, run the dsadmin -envset command to set the environment variable to a new value. You might need to encrypt the new value using the encrypt command located in .../ASBNode/bin. Call the script every time before running the parallel job.
You should test whether a change to the environment variable is picked up by the job just in time when the script and the job are called by the same sequence. It might not work if the parameter is passed through by the sequence.
Please read the IBM docs about the commands I mentioned.
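The update script described above might be sketched like this. This is heavily hedged: Get-VaultPassword is a hypothetical wrapper around your cloud CLI, the project name and variable name are placeholders, and the exact encrypt/dsadmin invocations must be checked against the IBM docs for your installation:

```powershell
# Hedged sketch (pwsh): refresh a DataStage env var only when the vault
# password has actually changed, using a cached hash to detect changes.
$new   = Get-VaultPassword                       # hypothetical helper
$cache = '/opt/scripts/.last_pwd_hash'           # remembers the last value set
$bytes = [Text.Encoding]::UTF8.GetBytes($new)
$hash  = (Get-FileHash -InputStream ([IO.MemoryStream]::new($bytes))).Hash

if (-not (Test-Path $cache) -or (Get-Content $cache) -ne $hash) {
    # DataStage expects encrypted values; 'encrypt' lives in .../ASBNode/bin
    $enc = & /opt/IBM/InformationServer/ASBNode/bin/encrypt.sh $new
    & dsadmin -envset DB_PASSWORD -value $enc MyProject
    Set-Content $cache $hash
}
```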
We have a PowerShell script to pull Power BI activity data (using Get-PowerBIActivityEvent), and I have been trying to automate it so that it can pull this data daily using an unattended account. The problem is the script must necessarily use the Connect-PowerBIServiceAccount cmdlet, which requires a credential. I don't want to have the passwords hard-coded anywhere (obviously) and ideally don't want to be passing it into the script as a plaintext parameter in case of memory leaks.
I've tried using SSIS as a scheduling mechanism, since it allows for encrypted parameters in script tasks, but I can't call the PS script with a SecureString parameter since the System.Management.Automation namespace isn't in the GAC (a command-line call wouldn't be possible).
I don't believe task scheduler would offer the functionality needed.
Does anyone know of any elegant ways to connect to the power BI service using encrypted credentials?
In the docs of Connect-PowerBIServiceAccount there are 2 options for unattended sign-in:
Using -Credential, where you pass AAD client ID as username and application secret key as password
Using -CertificateThumbprint and -ApplicationId
For both options you need to configure a service principal and add the proper permissions. I'm not going into details of how to configure that, but most probably you'd need (at least) the appropriate application permissions on the Power BI Service API.
I'm not really sure what functionalities you need in the script, but in my experience, majority of the cases can be covered by scheduled task, so the explanation below will apply to that solution.
How can you secure the credentials?
There are various possible solutions, depending on your preferences. I'd consider certificate-based authentication more secure (the certificate is available only to the current user or to all users of the machine).
What's important in certificate-based authentication: make sure the certificate is available to the account running the script (in many cases that's a service account, not your user account).
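A minimal sketch of the certificate option; the thumbprint, application ID, and tenant are placeholders, and the certificate must be installed in the store of the account running the task:

```powershell
# Sign in with a service principal whose app registration has the
# certificate's public key uploaded.
Connect-PowerBIServiceAccount -ServicePrincipal `
    -CertificateThumbprint 'THUMBPRINT-HERE' `
    -ApplicationId '00000000-0000-0000-0000-000000000000' `
    -Tenant 'contoso.onmicrosoft.com'

# Then pull the activity data as usual:
Get-PowerBIActivityEvent -StartDateTime '2020-01-01T00:00:00' `
                         -EndDateTime   '2020-01-01T23:59:59'
```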
How can I secure things further?
If you want, you can store the application ID as a secure string (I don't have SSIS to test, so I'm not sure whether there's any workaround to make it work there) or use Export-CliXml. Both use the Windows Data Protection API (DPAPI), so the file can be decrypted only by the account that was used to encrypt it.
To add one more layer of security (I'm not even mentioning setting correct access rights on the files, as that's obvious), you might put the file in an encrypted folder (if you already have a disk-encryption solution, use it).
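The Export-CliXml pattern mentioned above is a two-step sketch; the file path is a placeholder, and because of DPAPI the export must be run once as the same Windows account (on the same machine) that will run the scheduled task:

```powershell
# Run once, interactively, as the task's service account:
Get-Credential | Export-Clixml -Path 'C:\secure\pbi-cred.xml'

# In the scheduled script, decrypt and sign in:
$cred = Import-Clixml -Path 'C:\secure\pbi-cred.xml'
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred
```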
There are probably some solutions to secure the keys even better, but these ones should do the job. I'm using other Microsoft 365 modules with similar approach (Outlook, SharePoint PnP) and it works quite well.
NOTE: If you need to use a user account instead of a service principal, make sure that multi-factor authentication is disabled on that account for that specific application.
The relevant documentation on this (https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal) states that admin APIs (i.e. those backing Get-PowerBIActivityEvent) do not currently support service principals. This means it's not currently possible to use a registered app to run these cmdlets unattended.
There is a feature request open to provide this at the moment: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/39641572-need-service-principle-support-for-admin-api
To authenticate to Azure and use the Azure Resource Manager cmdlets, I currently use the methods outlined here, namely using an Azure Active Directory account, encrypting the password, storing the encrypted string in a text file, and reading that into a credential object when using it in the script.
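For context, the encrypted-string method described above usually looks like this (a sketch; the file path and account name are placeholders, and DPAPI ties the file to the encrypting account and machine):

```powershell
# Run once to store the encrypted password:
Read-Host -AsSecureString -Prompt 'Password' |
    ConvertFrom-SecureString | Out-File 'C:\secure\azure.txt'

# In the script, rebuild the credential and sign in:
$sec  = Get-Content 'C:\secure\azure.txt' | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential(
            'automation@contoso.onmicrosoft.com', $sec)
Login-AzureRmAccount -Credential $cred
```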
But I get the sense that maybe I should be using management certificates instead.
There is a documented method to use a publish settings file, but apparently that doesn't work for AzureRm cmdlets, only older cmdlets.
I have seen examples for using Powershell to create an application_id and service principal, and for authenticating a C# app, for instance, but I can't seem to find anything showing how to use management certificates for authentication in a powershell script, to use AzureRm cmdlets.
Maybe the secure string password storage method is the right one. But I don't have a good sense for that.
What do you use?
The best way to do it? It depends what is important to you. Ease of use, security, scripting?
Microsoft Azure tooling (Visual Studio, Azure PowerShell and the CLI) has lately moved to a more fine-grained, role-based access control approach based on Azure AD. This is currently a pretty good way to do it, since management certificates grant owners control at the subscription level and have proven rather difficult to handle in environments like Azure Automation.
Refs
https://azure.microsoft.com/de-de/blog/azure-automation-authenticating-to-azure-using-azure-active-directory/
https://azure.microsoft.com/en-us/documentation/articles/cloud-services-certs-create/#what-are-management-certificates
http://blogs.msdn.com/b/cloud_solution_architect/archive/2015/03/17/rbac-and-the-azure-resource-manager.aspx
https://azure.microsoft.com/en-us/documentation/articles/role-based-access-control-configure/
https://azure.microsoft.com/en-us/documentation/articles/resource-group-rbac/#concepts
You should have a look at Service Principals:
https://azure.microsoft.com/en-us/documentation/articles/resource-group-create-service-principal-portal/
https://azure.microsoft.com/en-us/documentation/articles/resource-group-authenticate-service-principal/
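For the unattended case, a service principal with a certificate avoids storing any password at all. A hedged sketch with placeholder IDs; the certificate must be in the store of the account running the script:

```powershell
# Certificate-based service principal login with the AzureRM cmdlets.
Add-AzureRmAccount -ServicePrincipal `
    -CertificateThumbprint 'THUMBPRINT-HERE' `
    -ApplicationId '00000000-0000-0000-0000-000000000000' `
    -TenantId '11111111-1111-1111-1111-111111111111'
```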
We are creating reports and trying to deliver them to a shared folder by adding a subscription. When creating the subscription we can enter user credentials for test purposes, but in the production environment we cannot store credentials in it, because for security reasons IT mandates Windows authentication everywhere.
Is there any way to have a Windows user deliver the file to the shared folder without entering credentials when the subscription is configured?
The parameters for the database and the database server can be set in the data source as an expression; however, you need to define a "ReportAdmin" user to always publish the reports with. The publisher should know the credentials prior to deployment. Otherwise, the data source will balk and inform you that it needs credentials to run the operations :(