How to pass parameters to PowerShell script from TFS build process? - powershell

I created a little PowerShell script to change connection string in my web.config.
param([string]$webConfigPath, [string]$connectionStringName, [string]$connectionStringValue)

# get the full path of the web config file
$webConfigFile = [IO.Path]::Combine($webConfigPath, 'Web.config')

# load the XML
$webConfig = [xml](Get-Content $webConfigFile)

# change the appropriate config
$webConfig.configuration.connectionStrings.add | ForEach-Object {
    if ($_.name -eq $connectionStringName) {
        $_.connectionString = $connectionStringValue
    }
}

# save the file
$webConfig.Save($webConfigFile)
I added it to my build process. How do I pass the build's variables to the script?
(I use the new script-based build process, so I only have a built-in "Arguments" field for the parameters.)

You can put all the parameters on a single line in the Arguments field like this:
-webConfigPath "c:\web.config" -connectionStringName "My connection string"
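Build variables can go in that same Arguments field. With the TFS 2013 script-based build templates, the build exposes them as TF_BUILD_* environment variables, so something like this should work (the paths and connection string here are illustrative, not from the original post):

```powershell
-webConfigPath "$env:TF_BUILD_BINARIESDIRECTORY\_PublishedWebsites\MyApp" -connectionStringName "MyDb" -connectionStringValue "Server=prod;Database=MyDb;Integrated Security=True"
```

Because the Arguments string is handed to PowerShell, the `$env:` variables are expanded when the script is invoked.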

Related

Make a .json-file more flexible with Variables for automation deploy

I've got a PowerShell script to create a VM from an image in Azure, and in this script I reference a .json file (parameters for the VM, etc.). But if I want to create more than one VM, the names of the VM, VNet, etc. cannot be the same for every execution (they all have to be in the same resource group).
So my question: how can I insert variables into the .json file to change the name of the VM, etc., on every execution? Or do I have to rethink my approach?
A very basic approach could be something like this:
# Grab the file contents
$contents = Get-Content -Path $templateFile
# Update some tokens in the file contents
$contents = $contents.replace("original value", "new value")
# Push the updated contents to a new file
Set-Content -Path $updatedFile -Value $contents
If you have a value that changes with every deployment, you could also consider using the -TemplateParameterObject parameter with the New-AzureRmResourceGroupDeployment cmdlet. That way, you can generate the values in your PowerShell script without having to write them out to a JSON file first.
For more details, have a look at the cmdlet's documentation.
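A minimal sketch of the -TemplateParameterObject approach; the resource group, template path, and parameter names are placeholders, not from the original post:

```powershell
# Generate per-execution values in the script instead of editing the JSON
$suffix = Get-Date -Format 'yyyyMMddHHmmss'   # unique per run

# Hypothetical parameter names - adjust to match your template's parameters
$templateParams = @{
    vmName   = "myvm-$suffix"
    vnetName = "myvnet-$suffix"
}

New-AzureRmResourceGroupDeployment `
    -ResourceGroupName 'MyResourceGroup' `
    -TemplateFile 'C:\templates\vm.json' `
    -TemplateParameterObject $templateParams
```

The template file itself stays untouched; only the parameter values change per execution.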

Prompt user for input and save that input for future iterations of the script

I built a script that runs on the user's PC every minute via a scheduled task. The scheduled task is created by a batch script that initially runs the PowerShell script and also schedules the task to run every minute.
I want the PowerShell script to prompt the user for certain variables (username, email address) and remember them each time it runs automatically, until the user next runs the script manually.
I know I can do this:
$email= Read-Host 'What is your email address?'
But how do I make it save the input until it is next opened manually?
One idea I had was a batch script that schedules a task to run another batch script every minute; that batch script would run the PowerShell script each time and silently answer the prompts based on how the user had edited it. There has to be a better way than this.
You could do something like this, which stores the configuration in a file as XML under the user's profile, unless that file already exists, in which case it loads it:
$ConfigPath = "$env:USERPROFILE\userconfig.xml"
If (Test-Path $ConfigPath) {
    $Config = Import-Clixml $ConfigPath
} Else {
    $ConfigHash = @{
        email    = Read-Host 'What is your email address?'
        username = Read-Host 'What is your username?'
    }
    $Config = New-Object -TypeName PSObject -Property $ConfigHash
    $Config | Export-Clixml $ConfigPath
}
You then access the configuration settings as follows:
$Config.email
$Config.username
Explanation
Defines a path to store the config (using the UserProfile environment variable)
Uses Test-Path to check whether the file exists
If it does exist, uses the Import-Clixml cmdlet to load the file into a PowerShell object in the variable $Config
If it does not exist, creates a hashtable whose values prompt the user for each configuration setting
Uses New-Object to turn that hashtable into a PowerShell object, which is stored in $Config
Writes $Config out to an XML file using Export-Clixml, stored under the path defined in $ConfigPath
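To make the script forget the stored values and prompt again on the next run (for instance when the user runs it manually), one simple approach is to delete the file first. The -Reset switch here is a hypothetical addition, not part of the answer above:

```powershell
# Hypothetical "reset" switch for a manual run
param([switch]$Reset)

$ConfigPath = "$env:USERPROFILE\userconfig.xml"
if ($Reset) {
    # Removing the file forces the prompts to run again
    Remove-Item $ConfigPath -ErrorAction SilentlyContinue
}
```

The scheduled task would invoke the script without -Reset, while a manual run could pass it to re-enter the values.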

PowerShell: provide parameters in a file

Is there a way to provide PowerShell parameters with a file?
At the moment I have a script called My_Script.ps1. To start this script I have to provide the right parameters on the command line:
.\My_Script.ps1 -param1="x" -param2="x" -param3="x" -param4="x" -param5="x" -param6="x" ...
This works, but it isn't a very easy way to start the script. Is it possible in PowerShell to use a file in which you store your parameters, and to use that file when you start the script?
Example
In My_Script.ps1 I add something like:
Param(
[string]$File="Path/to/file"
)
In my file I have something like
param1="x"
param2="x"
param3="x"
param4="x"
...
To execute the script you can edit the file and just start the script with .\My_Script.ps1
Another option:
Just use a .ps1 file as the config file and define your variables as you would in your main script:
$Param1 = "Value"
$Param2 = 42
Then you can use dot-sourcing or Import-Module to get the data from the config file:
. .\configfile.ps1
or
Import-Module .\Configfile.ps1
Afterwards you can just use the variables.
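For instance (file name and variable names are just for illustration):

```powershell
# Configfile.ps1 contains: $Param1 = "Value"; $Param2 = 42
. .\Configfile.ps1

# The dot-sourced variables are now available in this scope
Write-Host "Param1=$Param1 Param2=$Param2"
```

Dot-sourcing runs the config file in the caller's scope, which is why the variables survive; running it normally (`.\Configfile.ps1`) would discard them.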
In addition to splatting you can create variables from = separated values in a file.
param1=foo
param2=bar
param3=herp
param4=derp
Don't quote the values. The parameter names should be valid variable names (no spaces, etc.).
PowerShell 3 and newer:
(Get-Content c:\params.ini -raw | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
PowerShell 2:
([IO.File]::ReadAllText('c:\params.ini') | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
The code creates the variables in the current scope. It's possible to create them in the global/script/parent scope instead.
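The same hashtable from ConvertFrom-StringData can also be splatted straight at the script instead of being turned into variables. A sketch, with the file path and parameter names as placeholders:

```powershell
# Read "name=value" lines into a hashtable
$params = Get-Content C:\params.ini -Raw | ConvertFrom-StringData

# @params expands to -param1 'foo' -param2 'bar' ...
.\My_Script.ps1 @params
```

This only works when every key in the file matches a parameter name on the script, so it suits the "parameters in a file" case directly.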
You can use this blog posting
for a start and declare your parameters in an ini-like format.
You could of course also use a CSV-like format and work with the Import-Csv cmdlet.

VSTS Build: Replacing token with build number

I'm trying to update a web.config app setting called 'AppVersion' with the build number when building my application in VSTS.
Here are my build steps:
The 'Replace tokens' step takes the variables you set for your build and replaces the matching tokens in your config files. This part works, but what it won't do is pick up an environment variable like the build number and substitute it; it just replaces whatever literal text has been specified. Here are my build variables:
So after the build step is completed, my app setting is...
<add key="AppVersion" value="$(BuildNumber)" />
when it should be something like...
<add key="AppVersion" value="20160520.1" />
Can anyone point me in the right direction?
Many thanks.
I did something similar using the "Replace Tokens in **/*config" task.
To update the value for the key "AppVersion" with the current build number, your line should look like the following:
<add key="AppVersion" value="#{Build.BuildNumber}#" />
You can add a PowerShell script task before the "Replace Tokens" task to pass the "BuildNumber" into the "AppVersion" variable, as follows.
In VSTS, use $(Build.BuildNumber) as specified in this doc.
Note that you cannot use $(Build.BuildNumber) to set a variable's value, because it is taken literally; it has to be an argument to a task. If your task does not accept it, you can substitute a little PowerShell script and the BUILD_BUILDNUMBER environment variable.
param (
    [Parameter(Mandatory = $true)]
    [String]$fileWithTokens,
    [Parameter(Mandatory = $false)]
    [String]$tokenRegex = "__(\w+)__"
)

$vars = Get-ChildItem -Path env:*
$contents = Get-Content -Path $fileWithTokens
$newContents = ""
$contents | ForEach-Object {
    $line = $_
    if ($_ -match $tokenRegex) {
        $setting = $vars | Where-Object { $_.Name -eq $Matches[1] }
        if ($setting) {
            Write-Host ("Replacing key {0} with value from environment" -f $setting.Name)
            $line = $_ -replace $tokenRegex, $setting.Value
        }
    }
    $newContents += $line + [Environment]::NewLine
}
Set-Content $fileWithTokens -Value $newContents
Source https://github.com/colindembovsky/cols-agent-tasks/tree/master/Tasks/ReplaceTokens
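A hypothetical invocation of the script above as a build step (the script file name is made up; BUILD_SOURCESDIRECTORY is the standard VSTS build variable):

```powershell
# Replace __NAME__ tokens in Web.config with matching environment variables,
# e.g. __BUILD_BUILDNUMBER__ becomes the current build number
.\Replace-Tokens.ps1 -fileWithTokens "$env:BUILD_SOURCESDIRECTORY\Web.config"
```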
After a day of research, I finally found/created a better option than using a random third-party app (Replace Tokens) from the Marketplace.
The option I am talking about is already available in VSTS: the Azure CLI task.
Here are the steps:
Add a setting BUILD_NUMBER with an initial value of 1.0 in appsettings.json.
Read appsettings.json in your app and display it. I am sure you are smart enough to figure out how to use app settings to display the build number in your web application.
In the Azure portal, similarly create an app setting named BUILD_NUMBER with an initial value of 1.0 in the Application settings section under App Services for your app.
In VSTS, in your release definition, add an Azure CLI task.
Populate the required fields, such as Azure Subscription, Script Location (Inline script), and, most importantly, the Inline Script itself with the following CLI command:
az webapp config appsettings set -n iCoreTestApi -g ArchitectsSandbox -s Dev --settings BUILD_NUMBER=$(Build.BuildNumber)
Command explanation:
iCoreTestApi should be replaced by your real web app or API name in Azure
ArchitectsSandbox should be replaced by your resource group in Azure
Dev is the slot name; you may or may not have one
The rest of the command stays the same
Once you queue a new build, after the deployment completes successfully, you can see the app settings section on Azure updated with the new BUILD_NUMBER.
Let me know if you still have any questions.

How to perform Import-Module in a PowerShell script used as a File Server Resource Manager classification script

I am trying to use the Active Directory PowerShell module inside a classification rule in File Server Resource Manager on Windows Server 2012 R2.
When I try to just perform:
Import-Module ActiveDirectory
It will crash (I assume) and no longer update the classification property.
I tried setting the script parameter -ExecutionPolicy Unrestricted, but that didn't help.
Anyone know how to get it to work?
Non-working code:
# Global variables available:
#   $ModuleDefinition (IFsrmPipelineModuleDefinition)
#   $Rule (IFsrmClassificationRule)
#   $PropertyDefinition (IFsrmPropertyDefinition)
#
# And (optionally) any parameters you provide in the Script parameters box below,
# i.e. "$a = 1; $b = 2". The string you enter is treated as a script and executed, so the
# variables you define become globally available.

# Optional function to specify when the behavior of this script was last modified,
# if it consumes additional files. Emit one value of type DateTime.
#
# function LastModified
# {
# }

# Required function that outputs a value to be assigned to the specified property for each file classified.
# Emitting no value is allowed, which causes no value to be assigned for the property.
# Emitting more than one value will result in errors during classification.
# begin and end are optional; process is required.
#
function GetPropertyValueToApply
{
    # this parameter is of type IFsrmPropertyBag
    # it also has an additional method, GetStream, which returns an IO.Stream object to use for
    # reading the contents of the file. Make sure to close the stream after you are done reading
    # from the file
    param
    (
        [Parameter(Position = 0)] $PropertyBag
    )
    process
    {
        Import-Module ActiveDirectory
        $users = Get-ADUser -Filter * -Properties SID, Department
        return "dummy result"
    }
}
As a note: this works perfectly fine in a PowerShell console, so that isn't the issue; the problem is running the code as a classifier for File Server Resource Manager.
For now I've worked around it by creating a CSV file with the result of Get-ADUser and loading that inside the script (so I don't require any non-standard modules). But it would be nicer to run this without a dependency on some external task.
The classification script is executed by the File Server Resource Manager service (not from the UI you are looking at), which runs under the SYSTEM account.
So you either need to change the account the service runs under, or give that account the rights to access the objects you need to reach, in my case Active Directory.
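If you go the service-account route, the FSRM service's short name is, to my knowledge, SrmSvc, so one way to change its logon account is with sc.exe. The account name and password here are placeholders, and note that the space after each `=` is required by sc.exe:

```powershell
sc.exe config srmsvc obj= "DOMAIN\FsrmServiceAccount" password= "..."
```

Granting the existing SYSTEM account read rights in Active Directory (the computer account, for domain resources) is the less invasive alternative.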