Passing secure parameter to PowerShell DSC script from Bicep template - powershell

I am using a Bicep template to deploy a virtual machine with a PowerShell DSC script that adds a Log Analytics workspace to the Log Analytics agent. The script uses a secure parameter (workspaceKey1) which is defined in the template and pulled from a key vault. When running the script, I'm getting the following error:
" Performing the operation "Set-TargetResource" on target "Executing the SetScript with the user supplied credential"."},
PowerShell DSC resource MSFT_ScriptResource failed to execute Set-TargetResource functionality with error message: Value does not fall within the expected range."
A simplified version of the script is below.
Configuration MmaMultihoming
{
    Param (
        [string] $workspaceId1,
        [System.Management.Automation.PSCredential] $workspaceKey1
    )

    Import-DscResource -ModuleName PSDesiredStateConfiguration;
    Import-DscResource -ModuleName xPSDesiredStateConfiguration;

    [System.Management.Automation.PSCredential]$workspaceKey1 = New-Object System.Management.Automation.PSCredential ($workspaceKey1.userName, $workspaceKey1.password)

    Node localhost {
        Script ConfigureWorkspace
        {
            SetScript =
            {
                $workspaceId = $Using:workspaceId1;
                $workspaceKey = $Using:workspaceKey1;
                $mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg';
                $mma.AddCloudWorkspace($workspaceId, $workspaceKey);
                $mma.ReloadConfiguration();
            }
            TestScript = { Test-Path "HKLM:\SYSTEM\ControlSet001\Services\HealthService\Parameters\Service Connector Services\Log Analytics - $($Using:workspaceId1)" }
            GetScript = { @{ Result = (Get-ChildItem "HKLM:\SYSTEM\ControlSet001\Services\HealthService\Parameters\Service Connector Services") } }
        }
    }
}
The Bicep code is below:
param keyvaultName string = 'keyvault'
param kvResourceGroup string = 'keyvault-rg'
param workspaceId1 string

@secure()
param workspaceKey1 string = kv.getSecret('workspaceKey1')

resource kv 'Microsoft.KeyVault/vaults@2022-07-01' existing = {
  name: keyvaultName
  scope: resourceGroup(kvResourceGroup)
}

resource DSC_LogAnalytics 'Microsoft.Compute/virtualMachines/extensions@2022-08-01' = {
  parent: vm1
  name: 'Microsoft.Powershell.DSC'
  location: location
  properties: {
    publisher: 'Microsoft.PowerShell'
    type: 'DSC'
    typeHandlerVersion: '2.77'
    autoUpgradeMinorVersion: true
    settings: {
      wmfVersion: 'latest'
      configuration: {
        url: dscScript
        script: 'dsc.ps1'
        function: 'MmaMultihoming'
      }
      configurationArguments: {
        workspaceId1: workspaceId1
      }
      privacy: {
        datacollection: 'enable'
      }
      advancedOptions: {
        forcePullAndApply: false
      }
    }
    protectedSettings: {
      configurationArguments: {
        workspaceKey1: {
          userName: 'donotuse'
          password: workspaceKey1
        }
      }
      configurationUrlSasToken: SaaStoken
    }
  }
}
I am really quite stuck so would appreciate any ideas people can give.
Thanks!

Related

Ansible custom module (powershell): access environment variable

How can I access an environment variable in powershell when building a custom module?
- name: custom module
  environment:
    MY_ENVIRONMENT: "custom"
  custom_action:
    custom_param: "param"
  register: result
With the following
Function Get-CommonSpec
{
    @{
        options = @{
            MY_ENVIRONMENT = @{ type = "str"; required = $true }
        }
    }
}
I get a 'missing required arguments USER' error.

Resource not found error in cross subscription resources in Bicep

I am trying to create a private endpoint in one subscription (say xxxx) while my VNet is in another subscription (say yyyyy). Both are managed under a management group, so I am deploying at management-group level. While creating the endpoint, I get an error that the resource is not found. Please suggest how to solve this issue.
Below is my code for main file:
targetScope = 'managementGroup'

param env string = 'xxxxx'
param appname string = 'abcd'
param tags object
param strgSKU string
param strgKind string

//variables
var envfullname = ((env == 'PrePrd') ? 'preprod' : ((env == 'Prd') ? 'prod' : ((env == 'SB') ? 'sb' : 'dev')))
var strgActName = toLower('${envfullname}${appname}sa1')
var saPrvtEndptName = '${envfullname}-${appname}-sa-pe1'

resource RG 'Microsoft.Resources/resourceGroups@2021-04-01' existing = {
  scope: subscription('xxxxxxxxxxxxxxxxxxxxx')
  name: '${env}-${appname}-RG'
}

resource vnet 'Microsoft.Network/virtualNetworks@2021-08-01' existing = {
  scope: resourceGroup('yyyyyyyyyyyyyyyyy', 'Networking_RG')
  name: 'Vnet1'
}

resource linkSubnet 'Microsoft.Network/virtualNetworks/subnets@2021-08-01' existing = {
  scope: resourceGroup('yyyyyyyyyyyyyyy', 'Networking_RG')
  name: 'Vnet1/subnet1'
}

var location = RG.location
var vnetid = vnet.id

//Deploy Resources
/////////////// STORAGE ACCOUNT ///////////////////////////////////
// call storage account Bicep module to deploy the storage account
module storageAct './modules/storageAccount.bicep' = {
  scope: RG
  name: strgActName
  params: {
    strgActName: strgActName
    location: location
    tags: tags
    sku: strgSKU
    kind: strgKind
  }
}

// Create a private endpoint and link to the storage account
module saPrivateEndPoint './modules/privateEndpoint.bicep' = {
  scope: RG
  name: saPrvtEndptName
  params: {
    prvtEndpointName: saPrvtEndptName
    prvtLinkServiceId: storageAct.outputs.saId
    tags: tags
    location: location
    subnetId: linkSubnet.id
    //ipaddress: privateDNSip
    fqdn: '${strgActName}.blob.core.cloudapi.net'
    groupId: 'blob'
  }
  dependsOn: [
    storageAct
  ]
}
And my privateendpoint module file looks like:
param prvtEndpointName string
param prvtLinkServiceId string
param tags object
param location string
param subnetId string
//param ipaddress string
param fqdn string
param groupId string

resource privateEndpoint 'Microsoft.Network/privateEndpoints@2020-11-01' = {
  name: prvtEndpointName
  location: location
  tags: tags
  properties: {
    privateLinkServiceConnections: [
      {
        name: '${prvtEndpointName}_cef3fd7f-f1d3-4970-ae54-497245676050'
        properties: {
          privateLinkServiceId: prvtLinkServiceId
          groupIds: [
            groupId
          ]
          privateLinkServiceConnectionState: {
            status: 'Approved'
            description: 'Auto-Approved'
            actionsRequired: 'None'
          }
        }
      }
    ]
    manualPrivateLinkServiceConnections: []
    subnet: {
      id: subnetId
    }
    customDnsConfigs: [
      {
        fqdn: fqdn
        // ipAddresses: [
        //   ipaddress
        // ]
      }
    ]
  }
}
Command to execute the script is:
az deployment mg create --location 'USEast2' --name 'dev2' --management-group-id xt74yryuihfjdnv --template-file main.bicep --parameters main.parameters.json
It looks like the private endpoint should be created in the same subscription where the VNet resides; however, it can be used by resources across subscriptions.
https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-overview#private-endpoint-properties
Link for reference.

PowerShell script with parameter for Windows VM instance on Google Cloud Platform

I am trying to deploy a Windows VM on Google Cloud through Terraform. The VM gets deployed and I am able to execute PowerShell scripts by using windows-startup-script-url.
With this approach, I can only use scripts that are already stored in Google Cloud Storage. If the script has parameters/variables, how do I pass them? Any clue?
provider "google" {
project = "my-project"
region = "my-location"
zone = "my-zone"
}
resource "google_compute_instance" "default" {
name = "my-name"
machine_type = "n1-standard-2"
zone = "my-zone"
boot_disk {
initialize_params {
image = "windows-cloud/windows-2019"
}
}
metadata {
windows-startup-script-url = "gs://<my-storage>/<my-script.ps1>"
}
network_interface {
network = "default"
access_config {
}
}
tags = ["http-server", "windows-server"]
}
resource "google_compute_firewall" "http-server" {
name = "default-allow-http"
network = "default"
allow {
protocol = "tcp"
ports = ["80"]
}
source_ranges = ["0.0.0.0/0"]
target_tags = ["http-server"]
}
resource "google_compute_firewall" "windows-server" {
name = "windows-server"
network = "default"
allow {
protocol = "tcp"
ports = ["3389"]
}
source_ranges = ["0.0.0.0/0"]
target_tags = ["windows-server"]
}
output "ip" {
value = "${google_compute_instance.default.network_interface.0.access_config.0.nat_ip}"
}
Terraform doesn't necessarily require startup scripts to be pulled from GCS buckets.
The example here shows:
metadata = {
  foo = "bar"
}

metadata_startup_script = "echo hi > /test.txt"

service_account {
  scopes = ["userinfo-email", "compute-ro", "storage-ro"]
}
More in the official docs for GCE and PowerShell scripting here.
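If the script needs input values, one common pattern (a sketch only, not specific to this setup) is to put them in custom instance metadata on the google_compute_instance (e.g. metadata = { my_param = "some-value" }, where my_param is just an illustrative key) and have the Windows startup script read them back from the GCE metadata server:
# Sketch: read a custom metadata value (key "my_param" is hypothetical)
# from the GCE metadata server inside the Windows startup script.
$headers = @{ 'Metadata-Flavor' = 'Google' }
$uri = 'http://metadata.google.internal/computeMetadata/v1/instance/attributes/my_param'
$myParam = Invoke-RestMethod -Uri $uri -Headers $headers
Write-Host "my_param = $myParam"
The Metadata-Flavor: Google header is required by the metadata server; custom metadata keys appear under instance/attributes.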

Pass bool param to a powershell script in Jenkinsfile multiline parameterised pipeline

In my Jenkinsfile I have something like:
def addDollar(param) {
    return "\$" + param
}

parameters {
    booleanParam(
        defaultValue: false,
        name: 'FORCE_UPGRADE'
    )
}

environment {
    FORCE_UPGRADE = addDollar(params.FORCE_UPGRADE)
}

stages {
    stage('Test') {
        steps {
            powershell script: ".\\test.ps1 -forceUpgrade ${env:FORCE_UPGRADE}"
        }
    }
    stage('Test Multiline') {
        steps {
            powershell script: '''
                .\\test.ps1 `
                    -forceUpgrade $env:FORCE_UPGRADE
            '''
        }
    }
}
and the powershell script is
param (
[Parameter(Mandatory=$true)][boolean]$forceUpgrade=$false
)
if($forceUpgrade) {
Write-Host "Forcing upgrade"
}
This Jenkins stage Test works as expected, but Test Multiline errors with ParameterBindingArgumentTransformationException:
Cannot process argument transformation on parameter 'forceUpgrade'. Cannot convert value "System.String" to type "System.Boolean". Boolean parameters accept only Boolean values and
numbers, such as $True, $False, 1 or 0.
I get the same error if I run
.\test.ps1 -forceUpgrade false rather than
.\test.ps1 -forceUpgrade $false
Any ideas how to get the Jenkins Test Multiline stage working? I have a script where I need to pass a number of args; it would be ideal to avoid horizontal scrolling by using multiline PowerShell.
Change the PowerShell input to a [string] instead of a [boolean]:
[string]$forceUpgrade = $false
You will still be able to use the $forceUpgrade variable as a bool in the conditional in the PowerShell script.
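A minimal sketch of that approach, assuming Jenkins ends up passing a plain "true"/"false" string to the script; the explicit conversion matters because any non-empty string (including "false") is truthy in PowerShell:
# test.ps1 - sketch: accept the flag as a string and convert it explicitly.
param (
    [Parameter(Mandatory = $true)]
    [string]$forceUpgrade
)

# [System.Convert]::ToBoolean accepts "true"/"false" (case-insensitive);
# relying on if ($forceUpgrade) alone would treat the string "false" as $true.
if ([System.Convert]::ToBoolean($forceUpgrade)) {
    Write-Host "Forcing upgrade"
}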

DSC Script resource does not read environment variable

I have a DSC configuration which installs Node.js, adds npm to the Path environment variable, and then installs an npm module.
xPackage InstallNodeJs {
    Name = 'Node.js'
    Path = "$env:SystemDrive\temp\node-v4.4.7-x64.msi"
    ProductId = '8434AEA1-1294-47E3-9137-848F546CD824'
    Arguments = "/quiet"
}

Environment AddEnvironmentPaths
{
    Name = "Path"
    Ensure = "Present"
    Path = $true
    Value = "$env:SystemDrive\ProgramData\npm"
}

Script UpgradeNpm {
    SetScript = {
        & npm install --global --production npm-windows-upgrade
        & npm-windows-upgrade --npm-version 3.10.6
    }
    TestScript = {
        $npmVersion = & npm -v
        return $npmVersion -eq "3.10.6"
    }
    GetScript = {
        return @{Result = "UpgradeNpm"}
    }
}
Installing Node.js and adding npm to the Path variable seem to be successful. Both the Node.js and npm locations are added to Path, and I can use them both in PowerShell and cmd.
However, the Script resource reports that 'npm' is not recognized as an internal or external command;
the same goes for node, which is used inside the npm-windows-upgrade script file.
Do you know why the Script resource cannot read the newly added Path entries?
The Environment DSC resource implementation makes the change by updating the values stored in the registry (with the exception of variables targeting Process). Changes made to environment variables stored in the registry are not reflected in the current session (read once, on session start).
You can affect values stored in the current session by:
Using System.Environment.SetEnvironmentVariable ([System.Environment]::SetEnvironmentVariable)
Modifying $env:<VariableName>
Of those, only the first allows you to write a persistent change. The latter can be considered a volatile change.
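For example, a minimal sketch (assuming the machine-level Path has already been written to the registry by the Environment resource): refresh the session's Path at the top of the SetScript before calling npm.
# Inside SetScript: re-read the machine-level Path from the registry-backed
# store so this DSC session can resolve npm, then run the upgrade.
$env:Path = [System.Environment]::GetEnvironmentVariable('Path', 'Machine')
& npm install --global --production npm-windows-upgrade
& npm-windows-upgrade --npm-version 3.10.6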
It's an odd limitation of the resource; I've looked at this before and felt it a little lacking.
There's no dependency information in there, so you can't count on the Environment resource running before the Script resource. There's not enough information in your post to tell if that's the case for sure, but you should consider controlling it anyway:
xPackage InstallNodeJs {
    Name = 'Node.js'
    Path = "$env:SystemDrive\temp\node-v4.4.7-x64.msi"
    ProductId = '8434AEA1-1294-47E3-9137-848F546CD824'
    Arguments = "/quiet"
}

Environment AddEnvironmentPaths
{
    Name = "Path"
    Ensure = "Present"
    Path = $true
    Value = "$env:SystemDrive\ProgramData\npm"
    DependsOn = '[xPackage]InstallNodeJs'
}

Script UpgradeNpm {
    SetScript = {
        & npm install --global --production npm-windows-upgrade
        & npm-windows-upgrade --npm-version 3.10.6
    }
    TestScript = {
        $npmVersion = & npm -v
        return $npmVersion -eq "3.10.6"
    }
    GetScript = {
        return @{Result = "UpgradeNpm"}
    }
    DependsOn = '[Environment]AddEnvironmentPaths'
}
Can you share which version of DSC you are using? You can get this by running $PSVersionTable in the PowerShell console. I am able to add to the PATH variable and use it in a Script resource.
configuration NPMTest
{
    Environment AddEnvironmentPaths
    {
        Name = 'Path'
        Ensure = 'Present'
        Path = $true
        Value = "$env:SystemDrive\ProgramData\npm"
    }

    Script p
    {
        GetScript = {@{}}
        TestScript = {return $false}
        SetScript = {$a = & a.ps1 ; Write-Verbose $a -Verbose}
    }
}
The script a.ps1 was executed fine even though I am not specifying the full path to the script.