I'm having a lot of trouble getting a PowerShell Desired State Configuration script working to configure an in-house application. The root of the problem is that I can't seem to pass my configuration data down into a Script resource (at least not the way I'm trying to do it).
My script is supposed to create a config folder for our in-house application, and then write some settings into a file:
configuration MyApp {
    param (
        [string[]] $ComputerName = $env:ComputerName
    )
    node $ComputerName {
        File ConfigurationFolder {
            Type = "Directory"
            DestinationPath = $Node.ConfigFolder
            Ensure = "Present"
        }
        Script ConfigurationFile {
            SetScript = {
                write-verbose "running ConfigurationFile.SetScript";
                write-verbose "folder = $($Node.ConfigFolder)";
                write-verbose "filename = $($Node.ConfigFile)";
                [System.IO.File]::WriteAllText($Node.ConfigFile, "enabled=" + $Node.Enabled);
            }
            TestScript = {
                write-verbose "running ConfigurationFile.TestScript";
                write-verbose "folder = $($Node.ConfigFolder)";
                write-verbose "filename = $($Node.ConfigFile)";
                return (Test-Path $Node.ConfigFile);
            }
            GetScript = { @{Configured = (Test-Path $Node.ConfigFile)} }
            DependsOn = "[File]ConfigurationFolder"
        }
    }
}
For reference, my configuration data looks like this:
$config = @{
    AllNodes = @(
        @{
            NodeName = "*"
            ConfigFolder = "C:\myapp\config"
            ConfigFile = "C:\myapp\config\config.txt"
        }
        @{
            NodeName = "ServerA"
            Enabled = "true"
        }
        @{
            NodeName = "ServerB"
            Enabled = "false"
        }
    )
}
And I'm applying DSC with the following:
$mof = MyApp -ConfigurationData $config;
Start-DscConfiguration MyApp -Wait -Verbose;
When I apply this configuration it happily creates the folder, but fails to do anything with the config file. Looking at the output below, it's obvious that it's because the $Node variable is null inside the scope of ConfigurationFile / TestScript, but I've got no idea how to reference it from within that block.
LCM: [ Start Resource ] [[Script]ConfigurationFile]
LCM: [ Start Test ] [[Script]ConfigurationFile]
[[Script]ConfigurationFile] running ConfigurationFile.TestScript
[[Script]ConfigurationFile] node is null = True
[[Script]ConfigurationFile] folder =
[[Script]ConfigurationFile] filename =
LCM: [ End Test ] [[Script]ConfigurationFile] in 0.4850 seconds.
I've burnt an entire day searching online for this specific problem, but all the examples of variables, parameters and configuration data use File, Registry or other non-script resources, which I've already got working in the "ConfigurationFolder" block in my script. The thing I'm stuck on is how to reference the configuration data from within a Script resource like my "ConfigurationFile".
I've drawn a complete blank so any help would be greatly appreciated. If all else fails I may have to create a separate "configuration" script per server and hard-code the values, which I really don't want to do if at all possible.
Cheers,
Mike
Change this: $Node.ConfigFolder to $using:Node.ConfigFolder.
If you have a variable called $Foo and you want to pass it to a Script DSC resource, use $using:Foo.
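A sketch of the resource from the question with that change applied (this assumes WMF 5.0 or later, where the Script resource supports the $using: scope):
Script ConfigurationFile {
    SetScript = {
        Write-Verbose "running ConfigurationFile.SetScript"
        # $using:Node is expanded at MOF compile time and injected into the script
        [System.IO.File]::WriteAllText($using:Node.ConfigFile, "enabled=" + $using:Node.Enabled)
    }
    TestScript = {
        Write-Verbose "running ConfigurationFile.TestScript"
        return (Test-Path $using:Node.ConfigFile)
    }
    GetScript = { @{ Configured = (Test-Path $using:Node.ConfigFile) } }
    DependsOn = "[File]ConfigurationFolder"
}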
Based on David's answer, I've written a utility function that converts my script block to a string and then performs a very naive search-and-replace to expand references to the configuration data, as follows:
function Format-DscScriptBlock()
{
    param(
        [parameter(Mandatory=$true)]
        [System.Collections.Hashtable] $node,
        [parameter(Mandatory=$true)]
        [System.Management.Automation.ScriptBlock] $scriptBlock
    )
    $result = $scriptBlock.ToString();
    foreach( $key in $node.Keys )
    {
        $result = $result.Replace("`$Node.$key", $node[$key]);
    }
    return $result;
}
My SetScript then becomes:
SetScript = Format-DscScriptBlock -Node $Node -ScriptBlock {
    write-verbose "running ConfigurationFile.SetScript";
    write-verbose "folder = $Node.ConfigFolder";
    write-verbose "filename = $Node.ConfigFile";
    [System.IO.File]::WriteAllText("$Node.ConfigFile", "enabled=" + "$Node.Enabled");
}
You have to be mindful of quotes and escapes in your configuration data because Format-DscScriptBlock only performs literal substitution, but this was good enough for my purposes.
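To sanity-check the substitution you can simply run the function and inspect the string it returns; for example, with the first entry from the configuration data above:
Format-DscScriptBlock -Node $config.AllNodes[0] -ScriptBlock {
    write-verbose "folder = $Node.ConfigFolder";
}
# Returns (modulo whitespace) the plain string:
#   write-verbose "folder = C:\myapp\config";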
A quite elegant way to solve this problem is to work with the regular {0} placeholders. By applying the -f operator, the placeholders are replaced with their actual values.
The only downside of this method is that you cannot use curly braces { } for anything other than placeholders (say, a hashtable or a foreach loop), because the -f operator requires the braces to contain an integer.
Your code then looks like this:
SetScript = ({
    Set-ItemProperty "IIS:\AppPools\{0}" "managedRuntimeVersion" "v4.0"
    Set-ItemProperty "IIS:\AppPools\{0}" "managedPipelineMode" 1 # 0 = Integrated, 1 = Classic
} -f @($ApplicationPoolName))
Also, a good way to find out whether you're doing it right is to view the generated .mof file with a text editor; if you look at the generated TestScript / GetScript / SetScript members, you'll see that each code fragment really is a string. The placeholder values should already have been replaced there.
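For example, the question's MyApp configuration compiles (with the default output path) into a .\MyApp folder containing one file per node, so you can inspect the result with:
Get-Content .\MyApp\ServerA.mof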
ConfigurationData only exists at the time the MOF files are compiled, not at runtime when the DSC engine applies your scripts. The SetScript, GetScript, and TestScript attributes of the Script resource are actually strings, not script blocks.
It's possible to generate those script strings (with all of the required data from your ConfigurationData already expanded), but you have to be careful to use escapes, subexpressions and quotation marks correctly.
I posted a brief example of this over on the original TechNet thread at http://social.technet.microsoft.com/Forums/en-US/2eb97d67-f1fb-4857-8840-de9c4cb9cae0/dsc-configuration-data-for-script-resources?forum=winserverpowershell
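For illustration, a minimal sketch of that string-building approach using the names from the question; the outer double-quoted strings are expanded while the MOF is compiled, so the node's values end up baked in as literals:
Script ConfigurationFile {
    # Single quotes inside, double quotes outside: $($Node.ConfigFile) expands now,
    # at MOF compilation time, not when the LCM later runs the script.
    SetScript  = "[System.IO.File]::WriteAllText('$($Node.ConfigFile)', 'enabled=$($Node.Enabled)')"
    TestScript = "Test-Path '$($Node.ConfigFile)'"
    GetScript  = "@{ Configured = (Test-Path '$($Node.ConfigFile)') }"
    DependsOn  = "[File]ConfigurationFolder"
}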
TLDR
I'm trying to create a function that will take a multi-level [PSCustomObject], extract the key/value pairs (strings only), and use them to declare individual global variables using Set-Variable.
Current Code
Set-Variable -Name 'NSOneDrive' -Value "D:\OneDrive - New Spectrum"
$StrykerDirs = [PSCustomObject]@{
    'OneDrive' = [PSCustomObject]@{
        'NSOneDrive' = "D:\OneDrive - New Spectrum"
        'MyOneDrive' = "D:\OneDrive"
    }
    'Dev' = [PSCustomObject]@{
        'DevDir' = "${NSOneDrive}\Dev"
        'DevToolsDir' = [PSCustomObject]@{
            'DevTools' = "${NSOneDrive}\Dev\_DevTools"
            'Terminals' = [PSCustomObject]@{
                'DT_Terminals' = "${NSOneDrive}\Dev\_DevTools\terminals"
                'DT_PowerShell' = "${NSOneDrive}\Dev\_DevTools\terminals\PowerShell"
            }
            'Editors' = [PSCustomObject]@{
                'DT_Editors' = "${NSOneDrive}\Dev\_DevTools\.editors"
            }
        }
        'ProjectsDir' = [PSCustomObject]@{
            'NSProjects' = "${NSOneDrive}\Projects\NewSpectrum"
            'MyProjects' = "${NSOneDrive}\Projects\Personal"
        }
    }
}
$StrykerDirs |
    ConvertTo-Json -Depth 25 |
    Tee-Object -FilePath ".\JSON\Stryker-Paths.json"
function Set-DirAliases {
    [CmdletBinding()]
    # I might add parameters after I know how to make the 'Process' work
    Begin {
        # Begin Process Block
    }
    Process {
        ForEach ( $dir in $StrykerDirs ) {
            where ( $_.GetType() -eq 'String' ) |
                Set-Variable -Name "${key}" -Value "${value}"
            # I know ${key} and ${value} won't work, but I'm not sure how to properly fill them
        }
    }
    End {
        # End Process Block
    }
}
Goals
Simplifying Set-Location Navigation
First and foremost, I obviously need to figure out how to make the above Process block work. Once I do, I'll be able to easily declare directory variables for use with Set-Location. This is only for streamlining variable declarations, so I don't have to repeatedly declare them with a messy barrage of individual Set-Variable commands, while also avoiding long (sometimes very long) $Object.PropertyName 'variables'.
After I get a handle on this script, I'll be able to finish several other scripts and functions that use (more or less) the same basic process.
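For context, the rough shape I have in mind is something like this (an untested sketch; Set-DirVariables is just a placeholder name, and only string-valued properties become variables):
function Set-DirVariables {
    param(
        [Parameter(Mandatory)] $InputObject
    )
    foreach ($prop in $InputObject.PSObject.Properties) {
        if ($prop.Value -is [string]) {
            # String leaf: declare a global variable named after the key
            Set-Variable -Name $prop.Name -Value $prop.Value -Scope Global
        }
        elseif ($prop.Value -is [System.Management.Automation.PSCustomObject]) {
            # Nested object: recurse into it
            Set-DirVariables -InputObject $prop.Value
        }
    }
}

Set-DirVariables -InputObject $StrykerDirs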
Add to $PROFILE
This particular script is going to be part of a 'Startups' section in my default $PROFILE (Microsoft.PowerShell_profile.ps1) so I can set the Directory Variables in-bulk and keep the $PROFILE script itself nice and clean.
The other scripts that I mentioned above are also going to be included in my $PROFILE startups.
JSON Output
The script also exports a .json file so that, among other things, I can (hopefully) repeat the process down the road in my WSL Bash Profiles.
Param() Functionality
Eventually I want to add a Param() block so the function can be used outside of the script as well.
I'm trying to pass a variable into my DSC configuration, have it modified inside it, and then returned. To do so, I wanted to use a [ref] variable, like so in my PowerShell script:
configuration My_DSC_config
{
    param (
        [ref]$data
    )
    Node $AllNodes.Where({$_.Role -eq "test"}).nodename
    {
        #some code....
        $data.Value = 10
    }
}
$myVar = 3
My_DSC_config -data ([ref]$myVar) -ConfigurationData <my_config> -OutputPath <mofPath>
Start-DscConfiguration -Path <mofPath> -credential <myCreds>
However, when I run the PowerShell script, I get an error:
"New-Object : The argument "2" must not be of type System.Management.Automation.PSReference. Don't use [ref]."
Why so? And if using [ref] isn't possible, how can I get a 'global' variable to change inside of my configuration call?
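One workaround that might fit, since the configuration body actually runs at compile time (when My_DSC_config is called, not when Start-DscConfiguration applies the MOF), is to update a script-scoped variable instead of passing [ref]; a rough, untested sketch:
configuration My_DSC_config
{
    Node $AllNodes.Where({$_.Role -eq "test"}).nodename
    {
        #some code....
        $script:data = 10   # executes while the MOF is being compiled
    }
}

$script:data = 3
My_DSC_config -ConfigurationData <my_config> -OutputPath <mofPath>
$script:data   # now 10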
I have one ps1 script that drives the operations I want to perform.
I am using modules with class definitions in the modules that use Command pattern.
All is well and good first time I open a powershell session console and run the script.
If I change a class in any way and re-run in the same console, the console does not seem to pick up the changed script. I have to close the PowerShell console and run the script fresh in order for my changes to work. Otherwise the script just behaves the same way it did before I made the change. Clearly there is some caching going on.
I am wondering if MS has finally resolved this issue. I have read many older posts with complaints about this.
I have tried the following and none of them appears to work:
Remove-Variable * -ErrorAction SilentlyContinue;
Remove-Module *;
$error.Clear();
Clear-Host
I have even tried all of them together. Still not helping.
Is there something else that can be done to ensure the latest code in any supporting modules gets loaded? Having to close the whole console and reload is a serious productivity issue.
Example of what I am doing:
using module .\Logger.psm1
using module .\AzurePlatformParmsDefault.psm1
using module .\AzurePlatform.psm1
[Logger] $Logger = [Logger]::Create()
[AzurePlatformParms] $AzurePlatformParms = [AzurePlatformParmsDefault]::Create( $Logger )
[AzurePlatform] $AzurePlatform = [AzurePlatform]::Create( $Logger, $AzurePlatformParms )
[bool] $Result = $AzurePlatform.Execute()
The conventional wisdom is that there isn't a way to do this natively, and creating a new runspace or process is the solution.
You can reset variables to default values and import environment variables from the user/machine scope (on Windows) before clearing any jobs, events, event subscribers, etc. This isn't a true session refresh, though, and classes/custom types will persist.
To speed up your workflow, you may want to use a function in your $profile that automates creating a new session and loading in what's needed. This approach can save enough time that it becomes trivial to recycle an interactive session. I've included the one I use in my profile as an example. It's fairly comprehensive, but I suggest tailoring one to your specific needs.
Example
function Start-NewSession {
    [CmdletBinding(DefaultParameterSetName = 'NoChange')]
    [Alias('sans')]
    param(
        [Alias('N')]
        [switch]
        $NoClose,
        [Parameter(ParameterSetName = 'Elevate')]
        [Parameter(ParameterSetName = 'NoChange')]
        [Alias('nop')]
        [switch]
        $NoProfile,
        [Parameter(ParameterSetName = 'Elevate')]
        [Parameter(ParameterSetName = 'NoChange')]
        [Alias('A')]
        [switch]
        $AddCommands,
        [Parameter(ParameterSetName = 'Elevate')]
        [Alias('E')]
        [switch]
        $Elevate,
        [Parameter(ParameterSetName = 'DeElevate')]
        [Alias('D')]
        [switch]
        $DeElevate
    )
    $PSAppPath = (Get-Process -Id $PID).Path
    $SPParams = @{
        FilePath         = $PSAppPath
        WorkingDirectory = $PWD
        ArgumentList     = ''
    }
    if ($Elevate.IsPresent) {
        $SPParams['Verb'] = 'RunAs'
    }
    elseif ($DeElevate.IsPresent) {
        $SPParams['FilePath'] = Join-Path $env:windir 'explorer.exe'
        $SPParams['ArgumentList'] = $PSAppPath
    }
    if ($NoProfile.IsPresent) {
        $SPParams['ArgumentList'] += ' -NoProfile'
    }
    if ($AddCommands.IsPresent) {
        $ExtraCmds = Read-Host -Prompt 'Post-startup commands'
        if (-not [string]::IsNullOrWhiteSpace($ExtraCmds)) {
            $SPParams['ArgumentList'] +=
                ' -NoExit -Command "' + $ExtraCmds.Replace('"', '\"') + '"'
        }
    }
    if ([string]::IsNullOrWhiteSpace($SPParams['ArgumentList'])) {
        $SPParams.Remove('ArgumentList')
    }
    Start-Process @SPParams
    if (-not $NoClose.IsPresent) { exit }
}
This permits typing sans to generate a new session and close the old one.
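For example, using the aliases defined above:
sans          # recycle the current console, loading the profile as usual
sans -E       # start an elevated session and close this one
sans -N -nop  # open an extra session without the profile, keeping this one open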
I'm trying to reuse code in my SMA runbooks, but everything I try to put inside a function doesn't seem to work as expected.
For example, if I do this it works and returns the username of the credential:
workflow RB_Test
{
    $credent = Get-AutomationPSCredential -Name "CRED_TESTE"
    $var = $credent.Username
    "result = ${var}"
}
Output:
But if I turn it into this, it doesn't work anymore (returns null):
workflow RB_Test
{
    function FN_Test
    {
        $credent = Get-AutomationPSCredential -Name "CRED_TESTE"
        $var = $credent.Username
        "result = ${var}"
    }
    FN_Test
}
Output:
I've tried different things without success. The debug/verbose output doesn't show anything different. This also doesn't work:
InlineScript {
    . FN_Test
}
My goal is to put several functions into a separate module and then import it into my runbooks for reuse, but this really doesn't seem to work.
This is a runbook (PowerShell workflow) created in Service Management Automation (SMA).
I've read that there are some restrictions in PowerShell workflows compared to pure PowerShell, but I'm not sure whether I'm hitting one of them:
https://blogs.technet.microsoft.com/heyscriptingguy/2013/01/02/powershell-workflows-restrictions/
Thanks
Here's what I've had to do to get functions to work:
workflow FunctionTest {
    function log {
        param(
            [string]$Message
        )
        Write-Output $Message
        Write-Output "Filename: $Filename"
        Write-Output "using:Filename: $using:Filename"
        Write-Output "workflow:Filename: $workflow:Filename"
        Write-Output "----"
        ## Under what conditions is 'global' used? Can't be used in a workflow...Hey Scripting Guy!
    }
    workflow DoSomething {
        param(
            [string]$Filename
        )
        log "Starting DoSomething"
    }
    $Filename = "LogFile_2017.csv"
    log "Starting workflow"
    ## Variables need to be passed into workflow from parent-workflow
    DoSomething -Filename $Filename
    log "End workflow"
}
FunctionTest
I found you need to define your functions before using them. The tricky part was discovering that you have to pass your variables into the child-workflow.
The scope of the variables takes some getting used to.
In PowerShell I'm using Microsoft.SqlServer.Dac.DacServices and Microsoft.SqlServer.Dac.DacDeployOptions to deploy/update a database DACPAC. The problem I'm having is finding where to set the SQLCMD variables the package requires.
Abbreviated Sample
# Create a DacServices object, which needs a connection string
$dacsvcs = New-Object Microsoft.SqlServer.Dac.DacServices "server=$sqlserver"
# Load dacpac from file
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
# Deploy options
$deployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
$deployOptions.IncludeCompositeObjects = $true
I know I can pass these just fine with SqlPackage.exe, and maybe that's what I should do. But nowhere in the documentation or on the web can I find an example of DacServices usage with SQLCMD variables as an option, and SQLCMD variables are required parameters for my project's DACPAC.
You should set the values in the $deployOptions.SqlCommandVariableValues property. This is an updatable dictionary: you can't assign a new dictionary, but you can update the key/value pairs inside it. For example, to set a variable "MyDatabaseRef" to "Database123", use:
$deployOptions.SqlCommandVariableValues.Add("MyDatabaseRef", "Database123");
The API reference is here.
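For completeness, here's roughly what the deploy call that consumes those options looks like (the target database name is just a placeholder):
# Deploys the dacpac, upgrading the existing database if present,
# using the options (including SqlCommandVariableValues) set above.
$dacsvcs.Deploy($dp, "MyTargetDatabase", $true, $deployOptions)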
I have another code snippet to share in relation to this: a method for processing multiple variables passed as a PowerShell script argument.
param(
    [hashtable] $SqlCmdVar
)

$deployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions

# Process the SQL command variables
if ($SqlCmdVar -ne $null)
{
    foreach($key in $SqlCmdVar.keys)
    {
        Write-Verbose -Message "Adding Sql Command Variable ""$key""..."
        $deployOptions.SqlCommandVariableValues.Add($key,$SqlCmdVar[$key])
    }
}
You would call the script like this:
myscript.ps1 -SqlCmdVar @{ variable1 = "my first value"; variable2 = "my second value"; variableetc = "more values" }