Get Function & Host Keys of Azure Function in PowerShell

I have deployed an Azure Function using an ARM template. I need the function key and host key of the deployed Azure Function in PowerShell.
Currently I am trying to get the keys from the outputs section of the ARM template:
"outputs": {
"FunctionAppName": {
"type": "string",
"value": "[variables('functionAppName')]"
},
"Key": {
"type": "string",
"value": "[listKeys(resourceId('Microsoft.Web/sites', '[variables('functionAppName')]'),'2015-08-01').keys]"
}
}
I tried different combinations, but it keeps failing.
Is there any way to retrieve the keys in PowerShell?

I got it working by using the following:
"outputs": {
"FunctionAppName": {
"type": "string",
"value": "[parameters('functionName')]"
},
"Key": {
"type": "string",
"value": "[listsecrets(resourceId('Microsoft.Web/sites/functions', parameters('existingFunctionAppName'), parameters('functionName')),'2015-08-01').key]"
},
"Url": {
"type": "string",
"value": "[listsecrets(resourceId('Microsoft.Web/sites/functions', parameters('existingFunctionAppName'), parameters('functionName')),'2015-08-01').trigger_url]"
}
}
I couldn't find any examples either.
But by using the above, a quickstart sample on GitHub, and the documentation of resource functions, along with some trial and error, I got it to work.
Please note the variables/parameters and names have been changed.

I could not get the accepted answer to work for retrieving the default host key. @4c74356b41's answer is very close. You can get the keys out using the code below. The default host key will be in Outputs.functionKeys.Value.functionKeys.default.Value.
"outputs": {
"functionKeys": {
"type": "object",
"value": "[listkeys(concat(resourceId('Microsoft.Web/sites', variables('functionAppName')), '/host/default'), '2018-11-01')]"
}
}
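For reference, a minimal PowerShell sketch of reading that output after deployment; the resource group variable and the template file name are placeholders:
# Deploy the template and capture the deployment object.
$deployment = New-AzResourceGroupDeployment `
    -ResourceGroupName $resourceGroupName `
    -TemplateFile ".\azuredeploy.json"
# Per the note above, the default host key sits at this path in the outputs.
$defaultHostKey = $deployment.Outputs.functionKeys.Value.functionKeys.default.Value
Write-Host $defaultHostKey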

The question doesn't seem to be answered, as it asked for a way to get the function key from PowerShell, not from ARM templates.
I'm using the script below to get the function key from PowerShell in Azure DevOps.
# Read the current subscription id from the Azure CLI login context.
$accountInfo = az account show
$accountInfoObject = $accountInfo | ConvertFrom-Json
$subscriptionId = $accountInfoObject.id
$resourceGroup = "your-resource-group"
$functionName = "your-function-name"
# POST to the listKeys operation on the function app's default host.
$functionkeylist = az rest --method post --uri "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Web/sites/$functionName/host/default/listKeys?api-version=2018-11-01"
$keylistobject = $functionkeylist | ConvertFrom-Json
$functionKey = $keylistobject.functionKeys.default
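As a quick follow-up, a hypothetical example of using the retrieved key to call an HTTP-triggered function; the function name MyFunction and the default *.azurewebsites.net hostname are assumptions:
# Hypothetical: call an HTTP-triggered function named "MyFunction" with the key.
$uri = "https://$functionName.azurewebsites.net/api/MyFunction?code=$functionKey"
Invoke-RestMethod -Uri $uri -Method Get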
Hope this helps.

Using Azure PowerShell
I'd like to offer another way to solve this, using as close to pure Azure PowerShell as I could manage to find. It still relies on composing an Azure "Operation", but can be done in only a few lines of code.
Note: This assumes that you have a PowerShell session that is already authenticated. If you do not, see Connect-AzAccount for more information.
Option 1 - Retrieve key for function app for use with all functions
This example is based on this operation: Web Apps - List Host Keys and using this PowerShell cmdlet to execute the operation: Invoke-AzRestMethod
## lookup the resource id for your Azure Function App ##
$resourceId = (Get-AzResource -ResourceGroupName $rg -ResourceName $functionAppName -ResourceType "Microsoft.Web/sites").ResourceId
## compose the operation path for listing keys ##
$path = "$resourceId/host/default/listkeys?api-version=2021-02-01"
## invoke the operation ##
$result = Invoke-AzRestMethod -Path $path -Method POST
if($result -and $result.StatusCode -eq 200)
{
## Retrieve result from Content body as a JSON object ##
$contentBody = $result.Content | ConvertFrom-Json
## Output the default function key. In reality you would do something more ##
## meaningful with this ##
Write-Host $contentBody.functionKeys.default
}
Option 2 - Retrieve a key for a specific function
This example is based on this operation to retrieve a key specific to the function. This is generally better practice, so that you don't have one key for all functions. But there are valid reasons why you might want either. See this operation here: Web Apps - List Function Keys
## Lookup function name here ##
$functionName = "MyFunction"
## lookup the resource id for your Azure Function App ##
$resourceId = (Get-AzResource -ResourceGroupName $rg -ResourceName $functionAppName -ResourceType "Microsoft.Web/sites").ResourceId
## compose the operation path for listing keys ##
$path = "$resourceId/functions/$functionName/listkeys?api-version=2021-02-01"
## invoke the operation ##
$result = Invoke-AzRestMethod -Path $path -Method POST
if($result -and $result.StatusCode -eq 200)
{
## Retrieve result from Content body as a JSON object ##
$contentBody = $result.Content | ConvertFrom-Json
## Output the default function key. In reality you would do something more ##
## meaningful with this. ##
Write-Host $contentBody.default
}

First of all, you have an error in your syntax:
"value": "[listKeys(resourceId('Microsoft.Web/sites', variables('functionAppName')),'2015-08-01').keys]"
But that won't help; I don't think it's implemented for Azure Functions. I'm not 100% sure on this, but my efforts to retrieve the keys were futile.

So to get this working for the function-specific key for MyHttpFunction in the app MyFunctionApp, I had to use the following in the outputs section of the ARM template:
"MyHttpFunctionKey": {
"type": "string",
"value": "[listkeys(resourceId('Microsoft.Web/sites/functions', 'MyFunctionApp', 'MyHttpFunction'), '2019-08-01').default]"
}
If that is called from PowerShell using New-AzResourceGroupDeployment with the parameter -OutVariable arm, then the following PowerShell command will print the key: $arm.Outputs.myHttpFunctionKey.Value
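For concreteness, a minimal sketch of that invocation; the resource group and template file names are placeholders:
# Deploy and capture the deployment result in $arm via -OutVariable.
New-AzResourceGroupDeployment -ResourceGroupName "MyResourceGroup" `
    -TemplateFile ".\azuredeploy.json" -OutVariable arm
# Print the function key from the deployment outputs.
Write-Host $arm.Outputs.myHttpFunctionKey.Value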

The following code will get the exact key in string format; I used this key for availability test creation.
"outputs": {
"Key":{
"type": "string",
"value": "[listkeys(concat(resourceId('Microsoft.Web/sites', 'functionAppName'), '/functions', '/FunctionName'), '2018-11-01').default]"
}
}


How can I get the workflow definition of a logic app as JSON?
I can create a new Logic App from a JSON definition file using the New-AzLogicApp command.
But I can't see how to reverse the process, i.e. get the JSON definition of an existing Logic App.
I've tried Get-AzLogicApp, which returns a Workflow object.
But I'm blocked on the last step, i.e. going from the Workflow back to an actual JSON file.
If you want to get the definition of the logic app, try the command as below.
$logicapp = Get-AzResource -ResourceGroupName <ResourceGroupName> -ResourceType Microsoft.Logic/workflows -ResourceName "<logic app name>"
$logicapp.properties.definition | ConvertTo-Json
If you want to get it as a .json file, just change the second line as below.
$logicapp.properties.definition | ConvertTo-Json | Out-File "C:\Users\joyw\Desktop\logic.json"
Update:
You could specify the -Depth parameter of ConvertTo-Json as 3 if you want more levels of contained objects included in the JSON representation; you can also specify other values.
-Depth
Specifies how many levels of contained objects are included in the JSON representation. The default value is 2.
$logicapp.properties.definition | ConvertTo-Json -Depth 3
You can use the REST API to get the details.
REST API documentation
Or you can try using Get-AzLogicApp | ConvertFrom-Json | ConvertTo-Json and see if that helps.
I've drilled down on this for a client project; you need to do this:
1) Get-AzLogicApp
$lapp.Definition.ToString() -> this is the entire definition of the Logic App
2) Save the definition to a file
3) Use New-AzLogicApp or Set-AzLogicApp with -DefinitionFilePath pointing to that file
$a= New-AzLogicApp -ResourceGroupName $rg -name $name1 -location $loc -DefinitionFilePath $fileName1 -ParameterFilePath $parm1
$a.Parameters.Count
For the parameters, I use this content in a file:
{
"$connections": {
"value": {
"office365": {
"connectionId": "/subscriptions/SUBS-DEPLOY/resourceGroups/RG-DEPLOY/providers/Microsoft.Web/connections/office365",
"connectionName": "office365",
"id": "/subscriptions/SUBS-DEPLOY/providers/Microsoft.Web/locations/westeurope/managedApis/office365"
},
"sharepointonline": {
"connectionId": "/subscriptions/SUBS-DEPLOY/resourceGroups/RG-DEPLOY/providers/Microsoft.Web/connections/sharepointonline",
"connectionName": "sharepointonline",
"id": "/subscriptions/SUBS-DEPLOY/providers/Microsoft.Web/locations/westeurope/managedApis/sharepointonline"
}
}
}
}
Replace SUBS-DEPLOY with the subscription id and RG-DEPLOY with the resource group name and all good.
Anything just buzz: stationsolutions_at_gmail.com
Hope it helps
Here's the code:
function Get-LogicApp($resourceGroupName, $location, $name)
{
Write-Host " Get LogicApp Definition $name"
$lapp = Get-AzLogicApp -ResourceGroupName $resourceGroupName -Name $name
$o = $lapp.Definition.ToString()
$fileName = "..\logicapps\" + $name + ".logicapp.json"
$o | Out-File -FilePath $fileName
$parms = "..\logicapps\templates\parms.json"
$fileName = "..\logicapps\" + $name + ".logicapp.parms.json"
Copy-Item -Path $parms $fileName
Write-Host " LogicApp Definition $resourceGroupName > $fileName"
}
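A hypothetical usage of the function above, with placeholder names (note that the $location parameter is accepted but unused by the function body):
# Exports the definition to ..\logicapps\MyLogicApp.logicapp.json
Get-LogicApp -resourceGroupName "MyResourceGroup" -location "westeurope" -name "MyLogicApp"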

Concatenate CSV results into a single variable with PowerShell

I'm trying to deploy our company URLs in Google Chrome as "Managed Bookmarks" using PowerShell. To accomplish the same thing for IE favorites, I created a CSV file with 2 columns containing a name and URL respectively. I import that CSV, then a foreach statement goes through each line and creates a .url file in the user's favorites. To minimize effort for my staff I want to use this same CSV file for the Chrome bookmarks so we only have one file to maintain. Then it wouldn't be necessary for us to modify and redeploy the script, and we could publish the CSV file on a network share to be updated as needed.
Chrome has a registry value that allows me to do what I need. Using 3 search engines as an example, I know how to "hard code" this and make it work.
$PolicyPath = 'Registry::HKEY_LOCAL_MACHINE\Software\Policies'
$GoogleKey = 'Google'
$ChromeKey = 'Chrome'
$ManagedBookmarks = '[ { "name": "Bing", "url": "http://www.bing.com" }, { "name": "Google", "url": "http://www.google.com" },{ "name": "Yahoo", "url": "http://www.yahoo.com" } ]'
Set-ItemProperty -Path "$($PolicyPath)\$($GoogleKey)\$($ChromeKey)" -Name 'ManagedBookmarks' -Value "$ManagedBookmarks" -Type String
Is there a way to do a foreach and concatenate the results into a single variable, resulting in the following format?
$ManagedBookmarks = "[ { "name:" "$($line.name)", "url": "$($line.url)"}, { "name:" "$($line+n.name)", "url": "$($line+n.url)"} ]"
If you have a CSV (call it csv.csv) file like the following, you can just import the CSV to create an object array and then convert the whole thing to a JSON object.
Name,URL
Bing,http://www.bing.com
Google,http://www.google.com
Yahoo,http://www.yahoo.com
$ManagedBookmarks = Import-Csv csv.csv | ConvertTo-Json
Per LotPings' recommendation, if you dislike the line feeds/carriage returns and extra spacing in that output, you can use the -Compress switch.
$ManagedBookmarks = Import-Csv csv.csv | ConvertTo-Json -Compress
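Putting it all together, a minimal end-to-end sketch, assuming the CSV has the Name,URL header shown above and that the policy registry key already exists (as in the question's hard-coded example); Chrome expects lowercase name/url keys, so the columns are re-projected with calculated properties:
# Re-project the CSV columns to the lowercase keys Chrome expects, then serialize.
$ManagedBookmarks = Import-Csv csv.csv |
    Select-Object @{Name='name';Expression={$_.Name}},
                  @{Name='url';Expression={$_.URL}} |
    ConvertTo-Json -Compress
# Note: a single-row CSV serializes as one object rather than an array.
$PolicyPath = 'Registry::HKEY_LOCAL_MACHINE\Software\Policies'
Set-ItemProperty -Path "$PolicyPath\Google\Chrome" -Name 'ManagedBookmarks' -Value $ManagedBookmarks -Type String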

How to pass credentials for data sources during tabular model deployment?

Problem:
When I deploy the tabular model using the deployment wizard, it works fine. But our problem is that we have 20 data sources, and at deployment time we need to provide the credentials 20 times, as it asks for credentials for every data source, which is very painful. That is why we want to automate the deployment.
Approach:
I followed this article https://notesfromthelifeboat.com/post/analysis-services-1-deployment/ and I am able to deploy the tabular model without errors, but when I refresh the model it fails with the error below:
Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error:
The credentials provided for the File source are invalid. (Source at \\share\acaidatatempshare\data\lumeneventpropertiesexport.tsv.).
OLE DB or ODBC error: The command has been canceled..
OLE DB or ODBC error: The command has been canceled..
OLE DB or ODBC error: The command has been canceled..
My data source is a .tsv file, and below is the data source section of the model.bim file. As you can see, it does not save the password for the credential in the model.bim, .asdatabase, or .xmla file.
….
….
{
"type": "structured",
"name": "File/\\\\Share\\AcaiDataTempShare\\Data\\LumenEventPropertiesExport tsv",
"connectionDetails": {
"protocol": "file",
"address": {
"path": "\\\\share\\AcaiDataTempShare\\Data\\LumenEventPropertiesExport.tsv"
},
"authentication": null,
"query": null
},
"credential": {
"AuthenticationKind": "Windows",
"kind": "File",
"path": "\\\\Share\\acaidatatempshare\\data\\lumeneventpropertiesexport.tsv",
"Username": "domain\\username"
},
"contextExpression": [
"let",
" #\"0001\" = Csv.Document(..., [Delimiter = \"#(tab)\", Columns = 3, Encoding = 1252, QuoteStyle = QuoteStyle.None]),",
" #\"0002\" = Table.TransformColumnTypes(#\"0001\", {{\"Column1\", type text}, {\"Column2\", type text}, {\"Column3\", type text}})",
"in",
" #\"0002\""
]
},
…..
…..
How can I pass credentials for data sources programmatically during deployment?
Unfortunately, structured (aka Power Query) data source credentials are not persisted when you deploy the model. I reported this as a bug with the product team some time ago, but have not gotten a response yet. If you can, consider using the legacy (aka Provider) data sources instead, as these keep the credentials between deployments.
Alternatively, you can apply a password programmatically using a TMSL "createOrReplace" script. The easiest way to create such a script is to connect to Analysis Services within SSMS, right-click the connection (aka data source), and choose "Script Connection as" > "CREATE OR REPLACE To" > "New Query Editor Window". In the resulting script, make sure that the password is set correctly:
{
"createOrReplace": {
"object": {
"database": [...] ,
"dataSource": "File/\\\\Share\\AcaiDataTempShare\\Data\\LumenEventPropertiesExport tsv"
},
"dataSource": {
[...]
"credential": {
"AuthenticationKind": "Windows",
"kind": "File",
"path": "\\\\Share\\acaidatatempshare\\data\\lumeneventpropertiesexport.tsv",
"Username": "domain\\username",
"Password": "<<< YOUR PASSWORD HERE >>>"
},
[...]
}
You can then invoke this script as part of your deployment pipeline - for example using the PowerShell Invoke-ASCmd cmdlet.
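For example, a minimal sketch, assuming the script above is saved as createOrReplace.json and that the SqlServer module (which ships Invoke-ASCmd) is installed; the server name is a placeholder:
# Invoke-ASCmd ships with the SqlServer module.
Import-Module SqlServer
Invoke-ASCmd -Server "asazure://westus.asazure.windows.net/myserver" -InputFile ".\createOrReplace.json"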
Here is the final script I ended up creating.
# Get tools path
$msBuildPath = Get-MSBuildToPath
$Microsoft_AnalysisServices_Deployment_Exe_Path = Get-Microsoft_AnalysisServices_Deployment_Exe_Path
# Build the .smproj
& $msBuildPath $projPath "/p:Configuration=validation" /t:Build
Get-ChildItem $binPath | Copy -Destination $workingFolder -Recurse
$secureStringRecreated = ConvertTo-SecureString -String $AnalysisServerPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($AnalysisServerUserName, $secureStringRecreated)
#$plainText = $cred.GetNetworkCredential().Password
#region begin Update Model.deploymenttargets
# Read Model.deploymenttargets
[xml]$deploymenttargets = Get-Content -Path $deploymenttargetsFilePath
$deploymenttargets.DeploymentTarget.Database = $AnalysisDatabase
$deploymenttargets.DeploymentTarget.Server = $AnalysisServer
$deploymenttargets.DeploymentTarget.ConnectionString = "DataSource=$AnalysisServer;Timeout=0;UID=$AnalysisServerUserName;Password=$AnalysisServerPassword;"
$deploymenttargets.Save($deploymenttargetsFilePath);
#endregion
#region begin Update Model.deploymentoptions
# Read Model.deploymentoptions
[xml]$deploymentoptions = Get-Content -Path $deploymentoptionsFilePath
# Update ProcessingOption to DoNotProcess, otherwise the correct xmla file won't be generated.
$deploymentoptions.Deploymentoptions.ProcessingOption = 'DoNotProcess'
$deploymentoptions.Deploymentoptions.TransactionalDeployment = 'false'
$deploymentoptions.Save($deploymentoptionsFilePath);
#endregion
# Create xmla deployment file.
& $Microsoft_AnalysisServices_Deployment_Exe_Path $asdatabaseFilePath /s:$logFilePath /o:$xmlaFilePath
#region begin Update .xmla
# Add password to the .xmla file.
$xmladata = Get-Content -Path $xmlaFilePath | ConvertFrom-Json
foreach ($ds in $xmladata.createOrReplace.database.model.dataSources){
$ds.Credential.AuthenticationKind = 'Windows'
$ds.Credential.Username = $AnalysisServerUserName
#Add password property to the object.
$ds.credential | Add-Member -NotePropertyName Password -NotePropertyValue $AnalysisServerPassword
}
$xmladata | ConvertTo-Json -depth 100 | Out-File $xmlaFilePath
#endregion
#Deploy model xmla.
Invoke-ASCmd -InputFile $xmlaFilePath -Server $AnalysisServer -Credential $cred

How to read from Azure Table Storage with a HTTP Trigger PowerShell Azure Function?

The Row Key will be passed in the query string. What is needed in the function to create the "connection string" to the Table Storage?
Assume that you already have an app setting in your Function App named AzureWebJobsStorage that has the connection string to your Table Storage; then, to retrieve that value in your PowerShell script, add the following:
$connectionString = $env:AzureWebJobsStorage;
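Since the question asks about reading, here is a hedged sketch of looking up an entity by row key with that connection string; it assumes the Az.Storage and AzTable modules are importable in your Functions environment, which may not hold on every hosting plan:
# Assumption: Az.Storage and AzTable modules are available to the function.
Import-Module Az.Storage
Import-Module AzTable
$ctx = New-AzStorageContext -ConnectionString $connectionString
$table = (Get-AzStorageTable -Name "testtable" -Context $ctx).CloudTable
# Look up a single entity; the partition key "admin" is a placeholder, and
# $req_query_rowkey comes from the query string as shown in run.ps1 below.
$row = Get-AzTableRow -Table $table -PartitionKey "admin" -RowKey $req_query_rowkey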
However, if you need to just write to Table Storage based on the row key, you could leverage the Table Storage binding that is already supported in Azure Functions.
Let's assume that there is a table named testtable already created in your Table Storage, and that is the table we will need to write to. Then here's a sample setup that reads the row key from the query string of an HTTP trigger and writes an entry to Table Storage.
function.json:
{
"bindings": [
{
"name": "req",
"type": "httpTrigger",
"direction": "in",
"authLevel": "anonymous"
},
{
"type": "table",
"name": "outputTable",
"tableName": "testtable",
"connection": "AzureWebJobsStorage",
"direction": "out"
},
{
"name": "res",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
run.ps1:
# POST method: $req
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$name = $requestBody.name
# GET method: each querystring parameter is its own variable
if ($req_query_name)
{
$name = $req_query_name
}
Out-File -Encoding Ascii -FilePath $res -inputObject "Hello $name"
Write-Output "Message entity: '$requestBody'"
$entity = [PSObject]@{
PartitionKey = $requestBody.role
RowKey = $req_query_rowkey
AccountId = $requestBody.id
}
$entity | ConvertTo-Json | Out-File -Encoding UTF8 $outputTable
Test in Postman:
Log view:
2017-07-04T17:21:17.095 Function started (Id=775a36ce-9d71-454c-887c-05f08cfdb877)
2017-07-04T17:21:17.314 Message entity: '@{name=Azure; role=admin; id=78910}'
2017-07-04T17:21:17.314 Function completed (Success, Id=775a36ce-9d71-454c-887c-05f08cfdb877, Duration=222ms)
Table entry view in Azure Storage Explorer:

Get-AzureRmRecoveryServicesBackupItem Returns Nothing When Items Exist

I'm experimenting with Azure Resource Manager backups and restores using Recovery Services vaults in "Pay-As-You-Go" subscriptions. I am using very simple code that as far as I can tell is right out of MS documentation. I can get a vault object and I can get a container, but when I attempt to get any items out of any container, Get-AzureRmRecoveryServicesBackupItem always returns nothing, even when I know there are items in there and the Azure RM GUI confirms this. "Returns nothing" means if I assign the result to a variable it is empty/null, and if I try to output to the console there is no output.
I have had the same result with two different subscriptions, with different vaults, VMs and backup items, and from two different computers. I have re-installed modules on one machine and installed modules from scratch on the other. Using the -Debug switch for the Get-AzureRmRecoveryServicesBackupItem command appears to show the items being returned, with an OK result and no errors. PowerShell code appears below, and -Debug output appears after that. The kicker is, this exact code was working on 2/7/17, no kidding. I'm about out of ideas at this point, so any help would be appreciated.
Update: We have found that when we use the -FriendlyName parameter for the Get-AzureRmRecoveryServicesBackupContainer the problem occurs, but if we pipe the output through | Where-Object { $_.FriendlyName -eq $vmName } the problem does not occur. We have opened a case with MSFT.
$vaultName = 'somevault'
$resourceGroupName = 'somerg'
$vmName = 'testvm'
$vault = Get-AzureRmRecoveryServicesVault -Name $vaultName -ResourceGroupName $resourceGroupName;
Set-AzureRmRecoveryServicesVaultContext -Vault $vault
$container = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM -Status Registered -FriendlyName $vmName
#output of this cmdlet is empty
Get-AzureRmRecoveryServicesBackupItem -Container $container -Debug
Debug output:
DEBUG: ============================ HTTP RESPONSE ============================
Status Code:
OK
Headers:
Pragma : no-cache
x-ms-request-id : xxx
x-ms-client-request-id : xxx
Strict-Transport-Security : max-age=31536000; includeSubDomains
x-ms-ratelimit-remaining-subscription-reads: 14930
x-ms-correlation-request-id : xxx
x-ms-routing-request-id : WESTUS:xxx
Cache-Control : no-cache
Date : Sat, 11 Feb 2017 19:39:07 GMT
Server : Microsoft-IIS/8.0
X-Powered-By : ASP.NET
Body:
{
"value": [
{
"id": "/Subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.RecoveryServices/vaults/vault273/backupFabrics/Azure/protectionContainers/IaasVM
Container;iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm/protectedItems/VM;iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm",
"name": "iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm",
"type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers/protectedItems",
"properties": {
"friendlyName": "testvm",
"virtualMachineId": "/subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.Compute/virtualMachines/testvm",
"protectionStatus": "Healthy",
"protectionState": "Protected",
"healthStatus": "Passed",
"healthDetails": [
{
"code": 400239,
"title": "IaasVmHealthGreenDefault",
"message": "Backup pre-check status of this virtual machine is OK.",
"recommendations": []
}
],
"lastBackupStatus": "Completed",
"lastBackupTime": "2017-02-11T15:11:12.2071619Z",
"protectedItemDataId": "70368936803029",
"protectedItemType": "Microsoft.Compute/virtualMachines",
"backupManagementType": "AzureIaasVM",
"workloadType": "VM",
"containerName": "iaasvmcontainerv2;VS-xxxxxxxxxxxxx-Group;testvm",
"sourceResourceId": "/subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.Compute/virtualMachines/testvm",
"policyId": "/Subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx/resourceGroups/VS-xxxxxxxxxxxxx-Group/providers/Microsoft.RecoveryServices/vaults/vault273/backupPolicies/DailyPolicy",
"policyName": "DailyPolicy",
"lastRecoveryPoint": "2017-02-11T15:12:16.7410628Z"
}
}
]
}
DEBUG: AzureQoSEvent: CommandName - Get-AzureRmRecoveryServicesBackupItem; IsSuccess - True; Duration - 00:00:02.3527163; Exception - ;
Had the same issue (also on various PCs) and worked around it by passing a different parameter set. I'm not using Get-AzureRmRecoveryServicesBackupContainer to get a reference to the container. Instead, I'm directly calling Get-AzureRmRecoveryServicesBackupItem with the container name. This only works when you pass the parameter -BackupManagementType AzureVM.
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId <myguid>
Get-AzureRmRecoveryServicesVault -Name "myvaultname" | Set-AzureRmRecoveryServicesVaultContext
$item = Get-AzureRmRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM -Name "myname"
$latestRecoveryPoint = Get-AzureRmRecoveryServicesBackupRecoveryPoint -Item $item | Sort-Object RecoveryPointTime -Descending | Select-Object -First 1
If you leave out -Name, you get a list returned with all names.
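To illustrate, a small sketch listing all protected AzureVM items in the current vault context (the selected property names are assumptions based on the backup item output type):
# With no -Name, every protected item is returned; pick the fields of interest.
Get-AzureRmRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM |
    Select-Object Name, ProtectionState, LastBackupTime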