Get-AzureRmRecoveryServicesBackupItem Returns Nothing When Items Exist

I'm experimenting with Azure Resource Manager backups and restores using Recovery Services vaults in "Pay-As-You-Go" subscriptions. I am using very simple code that as far as I can tell is right out of MS documentation. I can get a vault object and I can get a container, but when I attempt to get any items out of any container, Get-AzureRmRecoveryServicesBackupItem always returns nothing, even when I know there are items in there and the Azure RM GUI confirms this. "Returns nothing" means if I assign the result to a variable it is empty/null, and if I try to output to the console there is no output.
I have had the same result with two different subscriptions, with different vaults, VMs, and backup items, and from two different computers. I have re-installed the modules on one machine and installed them from scratch on the other. Using the -Debug switch with the Get-AzureRmRecoveryServicesBackupItem command appears to show the items being returned, with an OK result and no errors. The PowerShell code appears below, and the -Debug output appears after that. The kicker is, this exact code was working on 2/7/17, no kidding. I'm about out of ideas at this point, so any help would be appreciated.
Update: We have found that the problem occurs when we use the -FriendlyName parameter of Get-AzureRmRecoveryServicesBackupContainer, but if we instead pipe the output through | Where-Object { $_.FriendlyName -eq $vmName }, the problem does not occur. We have opened a case with MSFT.
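The variant that avoids the problem looks like this (a minimal sketch of the workaround just described):
# Workaround: omit -FriendlyName and filter client-side instead
$container = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM -Status Registered |
    Where-Object { $_.FriendlyName -eq $vmName }
Get-AzureRmRecoveryServicesBackupItem -Container $container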
$vaultName = 'somevault'
$resourceGroupName = 'somerg'
$vmName = 'testvm'
$vault = Get-AzureRmRecoveryServicesVault -Name $vaultName -ResourceGroupName $resourceGroupName;
Set-AzureRmRecoveryServicesVaultContext -Vault $vault
$container = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM -Status Registered -FriendlyName $vmName
#output of this cmdlet is empty
Get-AzureRmRecoveryServicesBackupItem -Container $container -Debug
Debug output:
DEBUG: ============================ HTTP RESPONSE ============================
Status Code:
OK
Headers:
Pragma : no-cache
x-ms-request-id : xxx
x-ms-client-request-id : xxx
Strict-Transport-Security : max-age=31536000; includeSubDomains
x-ms-ratelimit-remaining-subscription-reads: 14930
x-ms-correlation-request-id : xxx
x-ms-routing-request-id : WESTUS:xxx
Cache-Control : no-cache
Date : Sat, 11 Feb 2017 19:39:07 GMT
Server : Microsoft-IIS/8.0
X-Powered-By : ASP.NET
Body:
{
  "value": [
    {
      "id": "/Subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.RecoveryServices/vaults/vault273/backupFabrics/Azure/protectionContainers/IaasVMContainer;iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm/protectedItems/VM;iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm",
      "name": "iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm",
      "type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers/protectedItems",
      "properties": {
        "friendlyName": "testvm",
        "virtualMachineId": "/subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.Compute/virtualMachines/testvm",
        "protectionStatus": "Healthy",
        "protectionState": "Protected",
        "healthStatus": "Passed",
        "healthDetails": [
          {
            "code": 400239,
            "title": "IaasVmHealthGreenDefault",
            "message": "Backup pre-check status of this virtual machine is OK.",
            "recommendations": []
          }
        ],
        "lastBackupStatus": "Completed",
        "lastBackupTime": "2017-02-11T15:11:12.2071619Z",
        "protectedItemDataId": "70368936803029",
        "protectedItemType": "Microsoft.Compute/virtualMachines",
        "backupManagementType": "AzureIaasVM",
        "workloadType": "VM",
        "containerName": "iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm",
        "sourceResourceId": "/subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.Compute/virtualMachines/testvm",
        "policyId": "/Subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.RecoveryServices/vaults/vault273/backupPolicies/DailyPolicy",
        "policyName": "DailyPolicy",
        "lastRecoveryPoint": "2017-02-11T15:12:16.7410628Z"
      }
    }
  ]
}
DEBUG: AzureQoSEvent: CommandName - Get-AzureRmRecoveryServicesBackupItem; IsSuccess - True; Duration - 00:00:02.3527163; Exception - ;

Had the same issue (also on various PCs) and worked around it by passing a different parameter set. I'm not using Get-AzureRmRecoveryServicesBackupContainer to get a reference to the container. Instead, I'm calling Get-AzureRmRecoveryServicesBackupItem directly with the container name. This only works when you pass the parameter -BackupManagementType AzureVM.
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId <myguid>
Get-AzureRmRecoveryServicesVault -Name "myvaultname" | Set-AzureRmRecoveryServicesVaultContext
$item = Get-AzureRmRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM -Name "myname"
$latestRecoveryPoint = Get-AzureRmRecoveryServicesBackupRecoveryPoint -Item $item | Sort-Object RecoveryPointTime -Descending | Select-Object -First 1
If you leave out -Name, you get a list of all items back.
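For example, to list everything in the current vault context (a sketch; the Select-Object properties are assumptions about the returned item objects):
# List all protected AzureVM items in the current vault context
Get-AzureRmRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM |
    Select-Object Name, ProtectionStatus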

How can I get the workflow definition of a logic app as JSON?

I can create a new Logic App from a JSON definition file using the New-AzLogicApp command, but I can't see how to reverse the process, i.e. get the JSON definition of an existing Logic App.
I've tried Get-AzLogicApp, which returns a Workflow object, but I'm blocked on the last step: getting from the Workflow object back to an actual JSON file.
If you want to get the definition of the logic app, try the commands below.
$logicapp = Get-AzResource -ResourceGroupName <ResourceGroupName> -ResourceType Microsoft.Logic/workflows -ResourceName "<logic app name>"
$logicapp.properties.definition | ConvertTo-Json
If you want to get it as a .json file, just change the second line as below.
$logicapp.properties.definition | ConvertTo-Json | Out-File "C:\Users\joyw\Desktop\logic.json"
Update:
You can specify the -Depth parameter of ConvertTo-Json (for example, 3) if you want more levels of contained objects included in the JSON representation; other values work as well.
-Depth
Specifies how many levels of contained objects are included in the JSON representation. The default value is 2.
$logicapp.properties.definition | ConvertTo-Json -Depth 3
You can use the REST API to get the details.
REST API documentation
Or you can try using Get-AzLogicApp | ConvertFrom-Json | ConvertTo-Json and see if that helps.
I've dug into this for a client project; you need to do the following:
1) Get-AzLogicApp
$lapp.Definition.ToString() -> this is the entire definition of the Logic App
2) Save the definition to a file
3) Use New-AzLogicApp or Set-AzLogicApp with -DefinitionFilePath pointing to that file
$a = New-AzLogicApp -ResourceGroupName $rg -Name $name1 -Location $loc -DefinitionFilePath $fileName1 -ParameterFilePath $parm1
$a.Parameters.Count
For Parameters, I use this content in a file:
{
  "$connections": {
    "value": {
      "office365": {
        "connectionId": "/subscriptions/SUBS-DEPLOY/resourceGroups/RG-DEPLOY/providers/Microsoft.Web/connections/office365",
        "connectionName": "office365",
        "id": "/subscriptions/SUBS-DEPLOY/providers/Microsoft.Web/locations/westeurope/managedApis/office365"
      },
      "sharepointonline": {
        "connectionId": "/subscriptions/SUBS-DEPLOY/resourceGroups/RG-DEPLOY/providers/Microsoft.Web/connections/sharepointonline",
        "connectionName": "sharepointonline",
        "id": "/subscriptions/SUBS-DEPLOY/providers/Microsoft.Web/locations/westeurope/managedApis/sharepointonline"
      }
    }
  }
}
Replace SUBS-DEPLOY with the subscription ID and RG-DEPLOY with the resource group name, and all is good.
Hope it helps. Here's the code:
function Get-LogicApp($resourceGroupName, $location, $name)
{
    Write-Host " Get LogicApp Definition $name"
    $lapp = Get-AzLogicApp -ResourceGroupName $resourceGroupName -Name $name
    $o = $lapp.Definition.ToString()
    $fileName = "..\logicapps\" + $name + ".logicapp.json"
    $o | Out-File -FilePath $fileName
    $parms = "..\logicapps\templates\parms.json"
    $fileName = "..\logicapps\" + $name + ".logicapp.parms.json"
    Copy-Item -Path $parms $fileName
    Write-Host " LogicApp Definition $resourceGroupName > $fileName"
}
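A hypothetical invocation (the resource group, location, and app name are placeholders):
# Export the definition of "MyLogicApp" to ..\logicapps\MyLogicApp.logicapp.json
Get-LogicApp -resourceGroupName "rg-demo" -location "westeurope" -name "MyLogicApp"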

How to pass credentials for data sources during tabular model deployment?

Problem:
When I deploy the tabular model using the deployment wizard, it works fine. But our problem is that we have 20 data sources, and at deployment time we need to provide credentials 20 times, as it asks for credentials for every data source, which is very painful. That is why we want to automate the deployment.
Approach:
I followed this article https://notesfromthelifeboat.com/post/analysis-services-1-deployment/ and I am able to deploy the tabular model without errors, but when I refresh the model it fails with the error below:
Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error:
The credentials provided for the File source are invalid. (Source at \\share\acaidatatempshare\data\lumeneventpropertiesexport.tsv.).
OLE DB or ODBC error: The command has been canceled..
OLE DB or ODBC error: The command has been canceled..
OLE DB or ODBC error: The command has been canceled..
My data source is a TSV file; below is the data source section of the model.bim file. As you can see, it does not save the password for the credential in the model.bim, .asdatabase, or .xmla file.
….
….
{
  "type": "structured",
  "name": "File/\\\\Share\\AcaiDataTempShare\\Data\\LumenEventPropertiesExport tsv",
  "connectionDetails": {
    "protocol": "file",
    "address": {
      "path": "\\\\share\\AcaiDataTempShare\\Data\\LumenEventPropertiesExport.tsv"
    },
    "authentication": null,
    "query": null
  },
  "credential": {
    "AuthenticationKind": "Windows",
    "kind": "File",
    "path": "\\\\Share\\acaidatatempshare\\data\\lumeneventpropertiesexport.tsv",
    "Username": "domain\\username"
  },
  "contextExpression": [
    "let",
    "    #\"0001\" = Csv.Document(..., [Delimiter = \"#(tab)\", Columns = 3, Encoding = 1252, QuoteStyle = QuoteStyle.None]),",
    "    #\"0002\" = Table.TransformColumnTypes(#\"0001\", {{\"Column1\", type text}, {\"Column2\", type text}, {\"Column3\", type text}})",
    "in",
    "    #\"0002\""
  ]
},
…..
…..
How can I pass credentials for data sources programmatically during deployment?
Unfortunately, structured (aka. Power Query) data source credentials are not persisted when you deploy the model. I reported this as a bug with the product team some time ago, but have not gotten a response yet. If you can, consider using the legacy (aka. Provider) data sources instead, as these keep the credentials between deployments.
Alternatively, you can apply a password programmatically using a TMSL "createOrReplace" script. The easiest way to create such a script is to connect to Analysis Services in SSMS, right-click the connection (aka data source), and choose "Script Connection as" > "CREATE OR REPLACE To" > "New Query Editor Window". In the resulting script, make sure that the password is set correctly:
{
  "createOrReplace": {
    "object": {
      "database": [...],
      "dataSource": "File/\\\\Share\\AcaiDataTempShare\\Data\\LumenEventPropertiesExport tsv"
    },
    "dataSource": {
      [...]
      "credential": {
        "AuthenticationKind": "Windows",
        "kind": "File",
        "path": "\\\\Share\\acaidatatempshare\\data\\lumeneventpropertiesexport.tsv",
        "Username": "domain\\username",
        "Password": "<<< YOUR PASSWORD HERE >>>"
      },
      [...]
    }
  }
}
You can then invoke this script as part of your deployment pipeline, for example using the Invoke-ASCmd PowerShell cmdlet.
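A minimal sketch of that last step, assuming the TMSL script was saved as createOrReplace.json and the server name is a placeholder:
# Invoke-ASCmd comes from the SqlServer module
Import-Module SqlServer
Invoke-ASCmd -Server "asazure://westus.asazure.windows.net/myserver" -InputFile ".\createOrReplace.json"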
Here is the final script I ended up creating.
# Get tools path
$msBuildPath = Get-MSBuildToPath
$Microsoft_AnalysisServices_Deployment_Exe_Path = Get-Microsoft_AnalysisServices_Deployment_Exe_Path
# Build smproj
& $msBuildPath $projPath "/p:Configuration=validation" /t:Build
Get-ChildItem $binPath | Copy -Destination $workingFolder -Recurse
$secureStringRecreated = ConvertTo-SecureString -String $AnalysisServerPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($AnalysisServerUserName, $secureStringRecreated)
#$plainText = $cred.GetNetworkCredential().Password
#region begin Update Model.deploymenttargets
# Read Model.deploymenttargets
[xml]$deploymenttargets = Get-Content -Path $deploymenttargetsFilePath
$deploymenttargets.DeploymentTarget.Database = $AnalysisDatabase
$deploymenttargets.DeploymentTarget.Server = $AnalysisServer
$deploymenttargets.DeploymentTarget.ConnectionString = "DataSource=$AnalysisServer;Timeout=0;UID=$AnalysisServerUserName;Password=$AnalysisServerPassword;"
$deploymenttargets.Save($deploymenttargetsFilePath);
#endregion
#region begin Update Model.deploymentoptions
# Read Model.deploymentoptions
[xml]$deploymentoptions = Get-Content -Path $deploymentoptionsFilePath
# Update ProcessingOption to DoNotProcess; otherwise the correct .xmla file won't be generated.
$deploymentoptions.Deploymentoptions.ProcessingOption = 'DoNotProcess'
$deploymentoptions.Deploymentoptions.TransactionalDeployment = 'false'
$deploymentoptions.Save($deploymentoptionsFilePath);
#endregion
# Create xmla deployment file.
& $Microsoft_AnalysisServices_Deployment_Exe_Path $asdatabaseFilePath /s:$logFilePath /o:$xmlaFilePath
#region begin Update .xmla
# Add password to the .xmla file.
$xmladata = Get-Content -Path $xmlaFilePath | ConvertFrom-Json
foreach ($ds in $xmladata.createOrReplace.database.model.dataSources){
$ds.Credential.AuthenticationKind = 'Windows'
$ds.Credential.Username = $AnalysisServerUserName
#Add password property to the object.
$ds.credential | Add-Member -NotePropertyName Password -NotePropertyValue $AnalysisServerPassword
}
$xmladata | ConvertTo-Json -depth 100 | Out-File $xmlaFilePath
#endregion
#Deploy model xmla.
Invoke-ASCmd -InputFile $xmlaFilePath -Server $AnalysisServer -Credential $cred

Powershell array issue

I am experiencing an issue with what should be a very simple task, but for some reason is not working as expected.
I am running this code via the Powershell ISE on a Windows 10 PC with Powershell v5.
GOAL: Create an array of JSON files with the intent of assigning specific values from the JSON data to Powershell variables which will then be fed into an Exchange online function to create thousands of new Office 365 groups.
ISSUE: While values appear to be correctly populating each array, certain variables from the array are being concatenated. See specific errors below.
SAMPLE CODE:
Here is a sample JSON file (note: I am only using a very limited subset of the data in each file):
{
  "Alias": "testmigrationlist7",
  "DisplayName": "Test Migration List 7",
  "IsHiddenFromAddressList": true,
  "EmailAddresses": [
    {
      "Action": "Add",
      "Value": "testmigrationlist7@testlab.local",
      "AddressPrimary": true,
      "AddressProtocol": "SMTP"
    }
  ],
  "Members": {
    "Recipients": [
      {
        "Action": "Add",
        "Value": "testuser1"
      },
      {
        "Action": "Add",
        "Value": "testuser2"
      }
    ]
  },
  "AcceptMessagesOnlyFrom": {
    "All": "restricted",
    "Recipients": [
      {
        "Action": "Remove",
        "Value": "testuser1"
      },
      {
        "Action": "Add",
        "Value": "testuser2"
      }
    ]
  }
}
Get content of all JSON files:
$allObjects = @(Get-ChildItem -Path c:\tmp\json\*.json | Get-Content -Raw | ConvertFrom-Json)
If I then test the above array, it appears to output as expected:
$allObjects.displayname
Test Migration List 7
Test Migration List 8
$allObjects.alias
testmigrationlist7
testmigrationlist8
Now the code that takes the above data and loops through the array:
function import-UnixDL2Group {
    New-UnifiedGroup -Alias $allobjects.alias `
        -DisplayName $allobjects.displayname `
        -Owner testowner1 `
        -Members $allobjects.members.recipients.value `
        -EmailAddresses $allobjects.emailaddresses.value
}
foreach($_ in $allObjects.alias){import-UnixDL2Group}
The above outputs the following error and stops:
Cannot bind parameter 'Alias' to the target. Exception setting "Alias": "Property expression "testmigrationlist7 testmigrationlist8" isn't valid....."
Notice how it tries to use both aliases with a space for one alias:
"testmigrationlist7 testmigrationlist8"
The same occurs with DisplayName.
If I test with only 1 JSON file, it works correctly:
$JSONinput = (get-content -path c:\tmp\json\test1.json -raw) | ConvertFrom-Json
function import-UnixDL2GroupTest {
    New-UnifiedGroup -Alias $JSONinput.alias -DisplayName $JSONinput.displayname `
        -Owner testowner1 -Members $JSONinput.members.recipients.value `
        -EmailAddresses $JSONinput.emailaddresses.value
}
$JSONinput | import-UnixDL2GroupTest
I am sure I am overlooking something very simple, but the answer eludes me at the moment. Any guidance would be greatly appreciated.
Thank you in advance for your consideration.
UPDATE: I have also tried defining a simple array to take the JSON data out of the picture, but I get the same error, so it must be the foreach loop.
$manualArray = @("testmigrationlist7","testmigrationlist8")
function import-Unix2GroupManual {
    New-UnifiedGroup -Alias $manualArray -DisplayName $manualArray `
        -Members testuser1,testuser2
}
foreach($_ in $manualArray){import-Unix2GroupManual}
Your code tries to invoke New-UnifiedGroup with all aliases at once (since you're using the global variable $allobjects inside the function), which doesn't work. Also, $_ is an automatic variable that holds the current object in a pipeline. Overriding automatic variables for your own purposes is generally not recommended, as it tends to lead to ... interesting side effects.
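You can see the flattening in isolation (illustration only; nothing here is specific to Exchange):
# An array bound to or cast as a single [string] is joined with spaces ($OFS)
$aliases = @('testmigrationlist7', 'testmigrationlist8')
[string]$aliases    # -> "testmigrationlist7 testmigrationlist8"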
Parametrize your function like this:
function Import-UnixDL2Group {
    [CmdletBinding()]
    Param(
        [Parameter(Position=0, Mandatory=$true, ValueFromPipeline=$true)]
        $InputObject,

        [Parameter(Position=1, Mandatory=$false)]
        [String]$Owner = 'testowner1'
    )
    Process {
        New-UnifiedGroup -Alias $InputObject.alias `
            -DisplayName $InputObject.displayname `
            -Owner $Owner `
            -Members $InputObject.members.recipients.value `
            -EmailAddresses $InputObject.emailaddresses.value
    }
}
and invoke it like this:
$allObjects | Import-UnixDL2Group
and the problem should disappear.

Get Function & Host Keys of Azure Function In Powershell

I have deployed an Azure Function using an ARM template. I need the function key and host key of the deployed Azure Function in PowerShell.
Currently I am trying to get the keys from the outputs section of the ARM template:
"outputs": {
"FunctionAppName": {
"type": "string",
"value": "[variables('functionAppName')]"
},
"Key": {
"type": "string",
"value": "[listKeys(resourceId('Microsoft.Web/sites', '[variables('functionAppName')]'),'2015-08-01').keys]"
}
}
I tried different combinations, but it keeps failing.
Is there any way to retrieve the keys in PowerShell?
I got it working by using the following:
"outputs": {
"FunctionAppName": {
"type": "string",
"value": "[parameters('functionName')]"
},
"Key": {
"type": "string",
"value": "[listsecrets(resourceId('Microsoft.Web/sites/functions', parameters('existingFunctionAppName'), parameters('functionName')),'2015-08-01').key]"
},
"Url": {
"type": "string",
"value": "[listsecrets(resourceId('Microsoft.Web/sites/functions', parameters('existingFunctionAppName'), parameters('functionName')),'2015-08-01').trigger_url]"
}
}
I couldn't find any examples either, but by using the above, a quickstart sample on GitHub, and the documentation of resource functions, along with some trial and error, I got it to work.
Please note the variables/parameters and names have been changed.
I could not get the accepted answer to work to retrieve the default host key. @4c74356b41's answer is very close. You can get the keys out using the code below. The default host key will be in Outputs.functionKeys.Value.functionKeys.default.Value.
"outputs": {
"functionKeys": {
"type": "object",
"value": "[listkeys(concat(resourceId('Microsoft.Web/sites', variables('functionAppName')), '/host/default'), '2018-11-01')]"
}
}
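To read that output after deployment, something like this should work (a sketch; the resource group and template file names are placeholders):
# Deploy, then walk the nested output object down to the default host key
$deployment = New-AzResourceGroupDeployment -ResourceGroupName "rg-demo" -TemplateFile ".\azuredeploy.json"
$deployment.Outputs.functionKeys.Value.functionKeys.default.Value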
The question doesn't seem to be answered, as it asked how to get the function key from PowerShell, not from ARM templates.
I'm using the script below to get the function key from Powershell in Azure DevOps.
$accountInfo = az account show
$accountInfoObject = $accountInfo | ConvertFrom-Json
$subscriptionId = $accountInfoObject.id
$resourceGroup = "your-resource-group"
$functionName = "your-function-name"
$functionkeylist = az rest --method post --uri "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Web/sites/$functionName/host/default/listKeys?api-version=2018-11-01"
$keylistobject = $functionkeylist | ConvertFrom-Json
$functionKey = $keylistobject.functionKeys.default
Hope this helps.
Using Azure PowerShell
I'd like to offer another way to solve this, using as close to pure Azure PowerShell as I could manage to find. It still relies on composing an Azure "Operation", but can be done in only a few lines of code.
Note: This assumes that you have a PowerShell session that is already authenticated. If you do not, see Connect-AzAccount for more information.
Option 1 - Retrieve key for function app for use with all functions
This example is based on this operation: Web Apps - List Host Keys and using this PowerShell cmdlet to execute the operation: Invoke-AzRestMethod
## lookup the resource id for your Azure Function App ##
$resourceId = (Get-AzResource -ResourceGroupName $rg -ResourceName $functionAppName -ResourceType "Microsoft.Web/sites").ResourceId
## compose the operation path for listing keys ##
$path = "$resourceId/host/default/listkeys?api-version=2021-02-01"
## invoke the operation ##
$result = Invoke-AzRestMethod -Path $path -Method POST
if ($result -and $result.StatusCode -eq 200)
{
    ## Retrieve result from Content body as a JSON object ##
    $contentBody = $result.Content | ConvertFrom-Json
    ## Output the default function key. In reality you would do something more meaningful with this ##
    Write-Host $contentBody.functionKeys.default
}
Option 2 - Retrieve a key for a specific function
This example is based on an operation that retrieves a key specific to one function. This is generally better practice, so that you don't have one key for all functions, but there are valid reasons why you might want either. See the operation here: Web Apps - List Function Keys
## Lookup function name here ##
$functionName = "MyFunction"
## lookup the resource id for your Azure Function App ##
$resourceId = (Get-AzResource -ResourceGroupName $rg -ResourceName $functionAppName -ResourceType "Microsoft.Web/sites").ResourceId
## compose the operation path for listing keys ##
$path = "$resourceId/functions/$functionName/listkeys?api-version=2021-02-01"
## invoke the operation ##
$result = Invoke-AzRestMethod -Path $path -Method POST
if ($result -and $result.StatusCode -eq 200)
{
    ## Retrieve result from Content body as a JSON object ##
    $contentBody = $result.Content | ConvertFrom-Json
    ## Output the function key. In reality you would do something more meaningful with this. ##
    Write-Host $contentBody.default
}
First of all, you have an error in your syntax:
"value": "[listKeys(resourceId('Microsoft.Web/sites', variables('functionAppName')),'2015-08-01').keys]"
but that won't help; I don't think it's implemented for Azure Functions. I'm not 100% sure on this, but my efforts to retrieve the keys were futile.
So to get this working for the function-specific key for MyHttpFunction in the app MyFunctionApp, I had to use the following in the outputs section of the ARM template:
"MyHttpFunctionKey": {
"type": "string",
"value": "[listkeys(resourceId('Microsoft.Web/sites/functions', 'MyFunctionApp', 'MyHttpFunction'), '2019-08-01').default]"
}
If that is called from PowerShell using New-AzResourceGroupDeployment with the parameter -OutVariable arm, then the following PowerShell command will print the key: $arm.Outputs.myHttpFunctionKey.Value
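For example (a sketch; the resource group and template file names are placeholders):
New-AzResourceGroupDeployment -ResourceGroupName "rg-demo" -TemplateFile ".\azuredeploy.json" -OutVariable arm
# -OutVariable also populates $arm in the current session, so the key can be read from the outputs
$arm.Outputs.myHttpFunctionKey.Value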
The following code will get the exact key in string format; I used this key for availability test creation.
"outputs": {
"Key":{
"type": "string",
"value": "[listkeys(concat(resourceId('Microsoft.Web/sites', 'functionAppName'), '/functions', '/FunctionName'), '2018-11-01').default]"
}
}

Azure DSC. HA Active Directory Domain Controller issue with Windows Server 2016

I'm trying to modify the official HA DC example to work with Windows Server 2016: https://github.com/Azure/azure-quickstart-templates/tree/master/active-directory-new-domain-ha-2-dc
After updating the xActiveDirectory module to a version that addresses the race condition on Windows Server 2016, I get one more error. The final script that resides in ConfigureADBDC.ps1 fails:
Script script1
{
    SetScript =
    {
        $dnsFwdRule = Get-DnsServerForwarder
        if ($dnsFwdRule)
        {
            Remove-DnsServerForwarder -IPAddress $dnsFwdRule.IPAddress -Force
        }
        Write-Verbose -Verbose "Removing DNS forwarding rule"
    }
    GetScript = { @{} }
    TestScript = { $false }
    DependsOn = "[xADDomainController]BDC"
}
It fails with:
PowerShell DSC resource MSFT_ScriptResource failed to execute Set-TargetResource functionality with error message: Failed to get information for server ADBDC.
When I execute Get-DnsServerForwarder I see this:
PS C:\Users\adAdministrator> Get-DnsServerForwarder
UseRootHint : True
Timeout(s) : 3
EnableReordering : True
IPAddress :
ReorderedIPAddress :
However after some time it changes to this:
PS C:\Users\adAdministrator> Get-DnsServerForwarder
UseRootHint : True
Timeout(s) : 3
EnableReordering : True
IPAddress : 10.0.0.4
ReorderedIPAddress : 10.0.0.4
So, my question is: what is the DnsServerForwarder used for? Is it even needed? And how can I fix this issue?
Well, a hackish way would be:
SetScript = {
    do {
        $dnsFwdRule = Get-DnsServerForwarder
    } while ( $dnsFwdRule.IPAddress -eq $null )
    if ( $dnsFwdRule ) {
        Remove-DnsServerForwarder -IPAddress $dnsFwdRule.IPAddress -Force
    }
    Write-Verbose -Verbose "Removing DNS forwarding rule"
}
Note, this could lead to an infinite loop ;) You can fix that by adding something like this:
$i = 0
do
{
    $i++
    Start-Sleep 10
    $dnsFwdRule = Get-DnsServerForwarder
}
while ($i -lt 10 -and $dnsFwdRule.IPAddress -eq $null)
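Putting the two together, the SetScript would look like this (a sketch):
SetScript = {
    # Retry for up to ~100 seconds until the forwarder list is populated
    $i = 0
    do {
        $i++
        Start-Sleep 10
        $dnsFwdRule = Get-DnsServerForwarder
    } while ($i -lt 10 -and $dnsFwdRule.IPAddress -eq $null)
    if ($dnsFwdRule.IPAddress) {
        Remove-DnsServerForwarder -IPAddress $dnsFwdRule.IPAddress -Force
    }
    Write-Verbose -Verbose "Removing DNS forwarding rule"
}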
As for the first question:
The Get-DnsServerForwarder cmdlet gets configuration settings on a DNS server. A forwarder is a Domain Name System (DNS) server on a network that is used to forward DNS queries for external DNS names to DNS servers outside that network.