Set blob name of Azure Function with PowerShell

I am trying to set (change) the filename of a blob within an Azure Function with PowerShell.
It works great with this function.json:
{
    "bindings": [
        {
            "name": "InputBlob",
            "type": "blobTrigger",
            "direction": "in",
            "path": "container-src/{name}.{ext}",
            "connection": "stacqadbdev_STORAGE"
        },
        {
            "name": "OutputBlob",
            "type": "blob",
            "direction": "out",
            "path": "container-dest/{name}",
            "connection": "stacqadbdev_STORAGE"
        }
    ]
}
which simply 'copies' the blob under its original name to another container.
As soon as I want to change the destination blob name to something calculated within my function, I fail.
I tried to set
{
    "bindings": [
        {
            "name": "InputBlob",
            "type": "blobTrigger",
            "direction": "in",
            "path": "container-src/{name}",
            "connection": "stacqadbdev_STORAGE"
        },
        {
            "name": "OutputBlob",
            "type": "blob",
            "direction": "out",
            "path": "container-dest/{newname}",
            "connection": "stacqadbdev_STORAGE"
        }
    ]
}
and assembling $newname in my run.ps1.
Doing Push-OutputBinding -Name $OutputBlob -Value $Blob has the issue that it wants the Byte[] array, which has no property for its name or the like.
So the bindings configuration just takes the parameters given by the input.
Passing something other than the Byte[] array is not possible...
That's why I always get
Executed 'Functions.CreateVaultEntry' (Failed, Id=775e61ce-b001-4278-a8d8-1c90ea63c062, Duration=91ms)
System.Private.CoreLib: Exception while executing function: Functions.CreateVaultEntry.
Microsoft.Azure.WebJobs.Host: No value for named parameter 'newfilename'.
I just want to take the input blob, change its name and write it as the output blob under another name.

What is newname?
When your function is triggered and the trigger binds {name}, your other bindings will have knowledge of {name}, but {newname} does not make sense.
Perhaps what you want is container-dest/{name}_output
More info.
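For instance, with the output path set to container-dest/{name}_output, run.ps1 only has to push the blob content; the runtime resolves the name from the binding expression. A minimal sketch, using the binding names from the question:

param([byte[]] $InputBlob, $TriggerMetadata)

# $TriggerMetadata.Name carries the {name} value captured by the trigger path.
Write-Host "Processing blob '$($TriggerMetadata.Name)' ($($InputBlob.Length) bytes)"

# The output path 'container-dest/{name}_output' is resolved by the runtime;
# Push-OutputBinding only supplies the content, not the name.
Push-OutputBinding -Name OutputBlob -Value $InputBlob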

Related

Azure Data Factory Copy Data activity - Use variables/expressions in mapping to dynamically select correct incoming column

I have the below mappings for a Copy activity in ADF:
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"path": "$['id']"
},
"sink": {
"name": "TicketID"
}
},
{
"source": {
"path": "$['summary']"
},
"sink": {
"name": "TicketSummary"
}
},
{
"source": {
"path": "$['status']['name']"
},
"sink": {
"name": "TicketStatus"
}
},
{
"source": {
"path": "$['company']['identifier']"
},
"sink": {
"name": "CustomerAccountNumber"
}
},
{
"source": {
"path": "$['company']['name']"
},
"sink": {
"name": "CustomerName"
}
},
{
"source": {
"path": "$['customFields'][74]['value']"
},
"sink": {
"name": "Landlord"
}
},
{
"source": {
"path": "$['customFields'][75]['value']"
},
"sink": {
"name": "Building"
}
}
],
"collectionReference": "",
"mapComplexValuesToString": false
}
The challenge I need to overcome is that the array indexes of the custom fields of the last two sources might change. So I've created an Azure Function which calculates the correct array index. However, I can't work out how to use the Azure Function's output value in the source path string. I have tried to refer to it using an expression like @activity('Get Building Field Index').output, but as a JSON path is expected, this doesn't work and produces an error:
JSON path $['customFields'][@activity('Get Building Field Index').output]['value'] is invalid.
Is there a different way to achieve what I am trying to do?
Thanks in advance
I have a slightly similar scenario that you might be able to work with.
First, I have a JSON file that is emitted, which I then access in Synapse/ADF with a Lookup activity.
Next I have a ForEach activity that runs a Copy data activity.
The ForEach activity receives my Lookup output and makes my JSON usable by setting the following in the ForEach's Settings:
@activity('Lookup').output.firstRow.childItems
My JSON roughly looks as follows:
{"childItems": [
{"subpath": "path/to/folder",
"filename": "filename.parquet",
"subfolder": "subfolder",
"outfolder": "subfolder",
"origin": "A"}]}
So this means that in my Copy data activity within the ForEach activity, I can access the properties of my JSON like so:
@item()['subpath']
@item()['filename']
@item()['folder']
.. etc
Edit:
Adding some screen caps of the parameterization:
https://i.stack.imgur.com/aHpWk.png
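For a sketch of how those item properties can feed the copy activity inside the ForEach, the source dataset reference could be parameterized roughly like this (dataset and parameter names are assumptions for illustration):

"inputs": [
    {
        "referenceName": "SourceDataset",
        "type": "DatasetReference",
        "parameters": {
            "folderPath": "@item()['subpath']",
            "fileName": "@item()['filename']"
        }
    }
]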

How to pass parameter from AWS Step Functions to PowerShell AWS Lambda?

Into a simple AWS Lambda PowerShell script I'm passing a parameter called tokens in JSON form:
{ "tokens": "ABC123" }
This is read by the script as the variable $LambdaInput.tokens, which the Lambda script expects by design.
Inside the Step Function definition I have specified the parameter tokens:
{
    "Comment": "Start Script",
    "StartAt": "PowerShellScript1",
    "States": {
        "PowerShellScript1": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Parameters": {
                "FunctionName": "arn:aws:lambda:XYZ:function:PowerShellScript1:$LATEST",
                "Payload": {
                    "Input": {
                        "tokens": "ABC123"
                    }
                }
            },
            "End": true,
            "TimeoutSeconds": 60
        }
    }
}
Unfortunately my Lambda script can't recognize the parameter. I expect it's not being inserted as the variable $LambdaInput.tokens.
Is there a different input variable for a PowerShell script invoked from Step Functions than for a plain Lambda invocation?
Thank you.
Thanks to Joe's comment leading to his answer here, I managed to form the appropriate State Machine definition to pass the parameter to the PowerShell Lambda script:
{
    "Comment": "Start Script",
    "StartAt": "PowerShellScript1",
    "States": {
        "PowerShellScript1": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:XYZ:function:PowerShellScript1:$LATEST",
            "Parameters": {
                "tokens": "ABC123"
            },
            "End": true
        }
    }
}
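With that definition, the Parameters object becomes the Lambda's input event, so the handler script can read it directly. A minimal sketch of the PowerShell Lambda script body:

# The PowerShell Lambda runtime injects $LambdaInput with the
# deserialized invocation payload.
$tokens = $LambdaInput.tokens
Write-Host "Received tokens: $tokens"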

Deploying an Azure VM and Users with an ARM template and DSC

I am taking my first look at creating a DSC (Desired State Configuration) configuration to go with an ARM (Azure Resource Manager) template to deploy a Windows Server 2016 VM and additional local user accounts. So far the ARM template works fine, and for the DSC file I am using a simple example to test functionality. The deployment works fine until I try to pass a username/password so I can create a local Windows user account. I can't seem to make this work at all (see the error messages below).
My question is: how do I use the ARM template to pass the credentials (password) to the DSC (MOF) file so that the user can be created without having to explicitly allow plain-text passwords (which is not good practice)?
This is what I have tried:
DSC file
Configuration xUser_CreateUserConfig {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory = $true)]
        [string]
        $nodeName,

        [Parameter(Mandatory = $true)]
        [System.Management.Automation.PSCredential]
        [System.Management.Automation.Credential()]
        $Credential
    )

    Import-DscResource -ModuleName xPSDesiredStateConfiguration

    Node $nodeName {
        xUser 'CreateUserAccount' {
            Ensure   = 'Present'
            UserName = Split-Path -Path $Credential.UserName -Leaf
            Password = $Credential
        }
    }
}
Azure ARM Template Snippet 1st Method
"resources": [
{
"apiVersion": "2016-03-30",
"type": "extensions",
"name": "Microsoft.Powershell.DSC",
"location": "[parameters('location')]",
"tags": {
"DisplayName": "DSC",
"Dept": "[resourceGroup().tags['Dept']]",
"Created By": "[parameters('createdBy')]"
},
"dependsOn": [
"[resourceId('Microsoft.Compute/virtualMachines', concat(variables('vmNamePrefix'), copyIndex(1)))]"
],
"properties": {
"publisher": "Microsoft.Powershell",
"type": "DSC",
"typeHandlerVersion": "2.19",
"autoUpgradeMinorVersion": true,
"settings": {
"wmfVersion": "latest",
"modulesUrl": "[concat(variables('_artifactslocation'), '/', variables('dscArchiveFolder'), '/', variables('dscArchiveFileName'))]",
"configurationFunction": "xCreateUserDsc.ps1\\xUser_CreateUserConfig",
"properties": {
"nodeName": "[concat(variables('vmNamePrefix'), copyIndex(1))]",
"Credential": {
"UserName": "[parameters('noneAdminUsername')]",
"Password": "PrivateSettingsRef:UserPassword"
}
}
},
"protectedSettings": {
"Items": {
"UserPassword": "[parameters('noneAdminUserPassword')]"
}
}
}
}
]
Error message
The resource operation completed with terminal provisioning state 'Failed'. VM has reported a failure when processing extension 'Microsoft.Powershell.DSC'. Error message: "The DSC Extension received an incorrect input: Compilation errors occurred while processing configuration 'xUser_CreateUserConfig'. Please review the errors reported in error stream and modify your configuration code appropriately. System.InvalidOperationException error processing property 'Password' OF TYPE 'xUser': Converting and storing encrypted passwords as plain text is not recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: http://go.microsoft.com/fwlink/?LinkId=393729
This error message does not help
Azure ARM Template snippet 2nd Method
"resources": [
{
"apiVersion": "2018-10-01",
"type": "extensions",
"name": "Microsoft.Powershell.DSC",
"location": "[parameters('location')]",
"tags": {
"DisplayName": "DSC",
"Dept": "[resourceGroup().tags['Dept']]",
"Created By": "[parameters('createdBy')]"
},
"dependsOn": [
"[resourceId('Microsoft.Compute/virtualMachines', concat(variables('vmNamePrefix'), copyIndex(1)))]"
],
"properties": {
"publisher": "Microsoft.Powershell",
"type": "DSC",
"typeHandlerVersion": "2.9",
"autoUpgradeMinorVersion": true,
"settings": {
"wmfVersion": "latest",
"configuration": {
"url": "[concat(variables('_artifactslocation'), '/', variables('dscArchiveFolder'), '/', variables('dscArchiveFileName'))]",
"script": "xCreateUserDsc.ps1",
"function": "xUser_CreateUserConfig"
},
"configurationArguments": {
"nodeName": "[concat(variables('vmNamePrefix'), copyIndex(1))]"
},
"privacy": {
"dataCollection": "Disable"
}
},
"protectedSettings": {
"configurationArguments": {
"Credential": {
"UserName": "[parameters('noneAdminUsername')]",
"Password": "[parameters('noneAdminUserPassword')]"
}
}
}
}
}
]
Error Message
VM has reported a failure when processing extension 'Microsoft.Powershell.DSC'. Error message: "The DSC Extension received an incorrect input: A parameter cannot be found that matches parameter name '$credential.Password'. Another common error is to specify parameters of type PSCredential without an explicit type. Please be sure to use a typed parameter in DSC Configuration, for example: configuration Example param([PSCredential] $UserAccount). Please correct the input and retry executing the extension. More information on troubleshooting is available at https://aka.ms/VMExtensionDSCWindowsTroubleshoot
This does not help!
I have been trying to solve this error for a couple of days. I have Googled for other examples but can only find examples of people deploying a web server, and Microsoft's documentation is no help because it tells you to use both of the above methods, even though method 1 is the old way (according to Microsoft). So, any help will be much appreciated.
This is how I was setting up the parameter in the configuration:
# Credentials
[Parameter(Mandatory)]
[System.Management.Automation.PSCredential]$Admincreds,
and then in the template:
"properties": {
"publisher": "Microsoft.Powershell",
"type": "DSC",
"typeHandlerVersion": "2.19",
"autoUpgradeMinorVersion": true,
"settings": {
"configuration": xxx // doesn't matter for this question
"configurationArguments": yyy // doesn't matter for this question
},
"protectedSettings": {
"configurationArguments": {
"adminCreds": {
"userName": "someValue",
"password": "someOtherValue"
}
}
}
}
Links to working stuff:
https://github.com/Cloudneeti/PCI_Reference_Architecture/blob/master/templates/resources/AD/azuredeploy.json#L261
https://github.com/Cloudneeti/PCI_Reference_Architecture/blob/master/artifacts/configurationscripts/ad-domain.ps1#L11
ps. you might also need to do this. Honestly, I don't remember ;)
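For reference, a minimal sketch of a DSC configuration matching that parameter style, combined with the asker's xUser resource (the node name and configuration name here are assumptions):

Configuration Example {
    param(
        [Parameter(Mandatory)]
        [System.Management.Automation.PSCredential]$Admincreds
    )

    Import-DscResource -ModuleName xPSDesiredStateConfiguration

    Node 'localhost' {
        xUser 'CreateUserAccount' {
            Ensure   = 'Present'
            # Strip any domain prefix from the supplied user name.
            UserName = Split-Path -Path $Admincreds.UserName -Leaf
            Password = $Admincreds
        }
    }
}

The key point is that the argument name under protectedSettings.configurationArguments ('adminCreds' above) must match the configuration's PSCredential parameter name; passed this way, the DSC extension should handle encrypting the credential in the MOF, so plain-text passwords need not be allowed.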

Using ADF Copy Activity with dynamic schema mapping

I'm trying to drive the columnMapping property from a database configuration table. My first activity in the pipeline pulls in the rows from the config table. My copy activity source is a Json file in Azure blob storage and my sink is an Azure SQL database.
In copy activity I'm setting the mapping using the dynamic content window. The code looks like this:
"translator": {
"value": "#json(activity('Lookup1').output.value[0].ColumnMapping)",
"type": "Expression"
}
My question is, what should the value of activity('Lookup1').output.value[0].ColumnMapping look like?
I've tried several different json formats but the copy activity always seems to ignore it.
For example, I've tried:
{
    "type": "TabularTranslator",
    "columnMappings": {
        "view.url": "url"
    }
}
and:
"columnMappings": {
"view.url": "url"
}
and:
{
    "view.url": "url"
}
In this example, view.url is the name of the column in the JSON source, and url is the name of the column in my destination table in Azure SQL database.
The issue is due to the dot (.) sign in your column name.
To use column mapping, you should also specify the structure in your source and sink datasets.
For your source dataset, you need to specify the format correctly, and since your column name contains a dot, you need to specify the JSON path, as shown below.
You could use the ADF UI to set up a copy for a single file first to get the related format, structure and column mapping format, then change it over to the lookup.
And as I understand it, your first format should be the right one. If it is already in JSON format, you may not need the "json" function in your expression.
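For illustration, a rough sketch of what such a source dataset could look like with the legacy JsonFormat, flattening the dotted source column to a safe name via a JSON path (all names here are assumptions):

"structure": [
    {
        "name": "view_url",
        "type": "String"
    }
],
"typeProperties": {
    "format": {
        "type": "JsonFormat",
        "filePattern": "setOfObjects",
        "jsonPathDefinition": {
            "view_url": "$['view']['url']"
        }
    }
}

The column mapping would then reference the flattened name, e.g. "columnMappings": { "view_url": "url" }, avoiding the ambiguous dot.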
There seems to be a disconnect between the question and the answer, so I'll hopefully provide a more straightforward answer.
When setting this up, you should have a source dataset with dynamic mapping. The sink doesn't require one, as we're going to specify it in the mapping.
Within the copy activity, format the dynamic json like the following:
{
    "structure": [
        {
            "name": "Address Number"
        },
        {
            "name": "Payment ID"
        },
        {
            "name": "Document Number"
        },
        ...
        ...
    ]
}
You would then specify your dynamic mapping like this:
{
    "translator": {
        "type": "TabularTranslator",
        "mappings": [
            {
                "source": {
                    "name": "Address Number",
                    "type": "Int32"
                },
                "sink": {
                    "name": "address_number"
                }
            },
            {
                "source": {
                    "name": "Payment ID",
                    "type": "Int64"
                },
                "sink": {
                    "name": "payment_id"
                }
            },
            {
                "source": {
                    "name": "Document Number",
                    "type": "Int32"
                },
                "sink": {
                    "name": "document_number"
                }
            },
            ...
            ...
        ]
    }
}
Assuming these were set in separate variables, you would want to send the source as a string, and the mapping as JSON:
source: @string(json(variables('str_dyn_structure')).structure)
mapping: @json(variables('str_dyn_translator')).translator
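If those variables are populated earlier in the pipeline, a Set Variable activity could supply the structure as a plain string, for example (a sketch; the activity and variable names are assumptions):

{
    "name": "Set str_dyn_structure",
    "type": "SetVariable",
    "typeProperties": {
        "variableName": "str_dyn_structure",
        "value": "{\"structure\": [{\"name\": \"Address Number\"}, {\"name\": \"Payment ID\"}]}"
    }
}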
VladDrak - You could skip the source dynamic definition by building the dynamic mapping like this:
{
    "translator": {
        "type": "TabularTranslator",
        "mappings": [
            {
                "source": {
                    "type": "String",
                    "ordinal": "1"
                },
                "sink": {
                    "name": "dateOfActivity",
                    "type": "String"
                }
            },
            {
                "source": {
                    "type": "String",
                    "ordinal": "2"
                },
                "sink": {
                    "name": "CampaignID",
                    "type": "String"
                }
            }
        ]
    }
}

How to name the file created in blob storage from a Powershell Azure Function?

Here is the function.json I have. The default for the path property is {rand-guid}, which works.
I want to name the file. I tried {filename} in the json, and in run.ps1 I tried setting $env:filename and $filename; it did not work.
Can this be done?
{
    "bindings": [
        {
            "name": "myTimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 * 7 * * *"
        },
        {
            "type": "blob",
            "name": "outputBlob",
            "path": "outcontainer/{filename}",
            "connection": "testfunctionpsta0a5_STORAGE",
            "direction": "out"
        }
    ],
    "disabled": false
}
Unfortunately the output binding is not that simple. We need to pass a JSON object to the output binding to get the values.
Have a look at the examples at https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#advanced-binding-at-runtime-imperative-binding.
// function.json
{
    "name": "$return",
    "type": "blob",
    "direction": "out",
    "path": "output-container/{id}"
}

// C# example: use method return value for output binding
public static string Run(WorkItem input, TraceWriter log)
{
    string json = string.Format("{{ \"id\": \"{0}\" }}", input.Id);
    log.Info($"C# script processed queue message. Item={json}");
    return json;
}
It should be straightforward to do something similar in PowerShell.
Could you pass the name of your file as part of your trigger payload? If yes, this SO thread may help.
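Building on both answers, here is a rough, untested sketch of the payload-driven approach: a queue trigger whose JSON message carries the file name, which the blob output path picks up via a binding expression (the queue, container and binding names are assumptions):

// function.json
{
    "bindings": [
        {
            "name": "QueueItem",
            "type": "queueTrigger",
            "direction": "in",
            "queueName": "filenames",
            "connection": "testfunctionpsta0a5_STORAGE"
        },
        {
            "name": "outputBlob",
            "type": "blob",
            "direction": "out",
            "path": "outcontainer/{filename}",
            "connection": "testfunctionpsta0a5_STORAGE"
        }
    ]
}

# run.ps1
param($QueueItem, $TriggerMetadata)

# A queue message like { "filename": "report.csv" } lets the runtime resolve
# {filename} in the output path from the trigger payload.
Push-OutputBinding -Name outputBlob -Value "some file content"

Note that the name still comes from the trigger payload, not from code; a fully code-computed name would need the storage SDK or imperative binding instead.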